Storing ODBC connections centrally - R

We have several databases that we access, and different scripts require one or more of these ODBC connections. I wanted to know whether, instead of putting an odbcConnect line in every script, there is a way to store all the connections centrally and import them as needed. That way, if the database info changes, I can update one file instead of every script.

You could use options() in your .Rprofile file to save the connection details. In your scripts, you would then retrieve them with getOption().
There is also an Rprofile.site file, which might be a better choice if you are working in a team with several R installations.
See here, or the R Installation and Administration manual, for more information.
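As a minimal sketch of that approach (the DSN, user name, and password below are placeholders, not values from the question), .Rprofile or Rprofile.site could define:
# .Rprofile: one central place for the connection details
options(sales.db = list(dsn = "SalesDB", uid = "report_user", pwd = "secret"))
and each script would then build its connection from the stored option:
library(RODBC)
db <- getOption("sales.db")  # fetch the centrally stored details
ch <- odbcConnect(db$dsn, uid = db$uid, pwd = db$pwd)
If the database details ever change, only the options() call needs to be edited.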

What I ended up doing was creating a package with each connection as a function.
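Something along these lines, assuming the RODBC package and a made-up database name, with one exported function per connection:
# in the package: connections.R
connect_sales <- function() {
  # returns an open RODBC channel; the credentials live only in the package
  RODBC::odbcConnect("SalesDB", uid = "report_user", pwd = "secret")
}
Scripts then just call ch <- connect_sales(), and a change in the database details means updating and reinstalling the package once.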

Related

AzerothCore: Import the database updates

Hello, I wanted to ask whether, to import the .sql updates (after a git pull), I have to assemble and merge them with the bash file (app/db_assembler), or whether it's enough to just launch worldserver.exe and it will do it.
Thanks
Short answer
No, the worldserver process will NOT update your database.
You need to use the DB-assembler bash script, as the instructions say.
More details
This is different from TrinityCore, where updating the database is a feature of the worldserver process.
In AzerothCore this task is the responsibility of an external script, written in bash: the DB-assembler.
The advantages of having an external script do this task instead of the worldserver are:
You don't need to compile and run the worldserver if you only need to create the database (useful when using or developing tools that only need the DBs)
The DB assembler is able to generate a unique SQL update file per each DB (by merging all the single SQL update files), which can be useful for debugging or development purposes
In general, it is better to delegate different software components for different tasks, instead of having a monolith doing everything
You can also write your own merge script and apply the result manually, or just merge with db_assembler.sh and then apply it manually.
Otherwise, refer to Francesco's answer.

Drupal to Drupal Migration Across Servers

I am in the process of migrating a D7 site from one server to another. I have successfully exported and uploaded the settings to the new site using Features, but I need to get the content over to the site as well. I've been looking at several modules to try and solve this problem, but I have not found anything suitable for this task. Please let me know if I am overlooking a really simple solution.
Thanks!
Mark
The easiest solution is to export a database dump and import it into your new server. You can do it with phpMyAdmin, but I recommend using Drush.
This way you can simply do a database dump via:
drush sql-dump > ~/sql-dump-file-name.sql
and later import via:
drush sql-cli < ~/sql-dump-file-name.sql
Also copy your files directory, located in /sites/default/files, from the old server to the new server.
I've successfully used the backup and migrate module for these tasks. True, creating a dump and then spooling the dump into the other database works, but this typically also copies all caches.
The backup_migrate module allows you to save backups on your local server, but also to your hard disk, from where you can upload them again to the other site.
A neat thing here is that you can exclude tables, such as cache tables, which makes the transfer much faster.
Obviously you need a core installation on the other end, and the backup_migrate module already installed for this to work, but I assume that since you only ask about the db, you must have mirrored the file structure already (excluding the settings files).

Using R to save images & .csv's, can I use R to upload them to a website (I use FileZilla to do it manually)?

First I should say that a lot of this is over my head, so I apologize in advance for using incorrect terminology and potentially asking an unclear question. I'm doing my best.
Also, I saw ThisPost; is RCurl the tool I want to use for this task?
Every day for 4 months I'll be analyzing new data and generating .csv files and .png's that need to be uploaded to a website so that other team members can check them. I've (nearly) automated all of the data collecting, data downloading, analysis, and file saving. The analysis is carried out in R, and R saves the files. Currently I use FileZilla to manually upload the new files to the website. Is there a way to use R to upload the files to the website, so that I don't have to open FileZilla and drag and drop the files?
It'd be nice to run my R-code and walk away, knowing that once it finishes running, the newly saved files will be automatically be put on the website.
Thanks for any help!
You didn't specify which protocol you use to upload your files using FileZilla. I assume it is ftp. If so, you can use the ftpUpload function of RCurl:
library(RCurl)
ftpUpload("yourfile", "ftp://ftp.yourserver.foo/yourfile",
userpwd="username:passwd")
RCurl also has methods for scp and should also support sftp via ftpUpload.
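If each run produces several .csv and .png files, the upload can be looped over the output directory; this is only a sketch, and the "output" folder, server name, and credentials are placeholders:
library(RCurl)
# every .csv and .png produced by today's run
files <- list.files("output", pattern = "\\.(csv|png)$", full.names = TRUE)
for (f in files) {
  ftpUpload(f,
            paste0("ftp://ftp.yourserver.foo/", basename(f)),
            userpwd = "username:passwd")
}
Dropping a loop like this at the end of the existing script should give the run-and-walk-away behaviour described in the question.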

Gather data from drupal and export to CSV on schedule

We have a Drupal site and we wish to export data from it in the form of several CSV files. I'm aware of the Views module add-ons that make this a very simple process on demand, but what we're looking for is a way to automate this process through cron.
Most likely, we'll end up having to either write a standalone PHP file we can then access with cron to complete this action, or a custom module.
I first wanted to check to ensure that there isn't already a module or set of modules out there that will do what we're looking for. How would you approach this?
The end result is that these csv files will reside on the server for other services to pick up and import into their own systems or be distributed with rsync or something similar.
Best practices suggestions would also be appreciated!
If you want to do it with cron:
Set up views with CSV data in them.
Then add wget <path to your csv view>, or the path of a script which does everything you need, to your crontab.

Need help in choosing the right tool

I have a client who has set up a testing environment in some AI language. It basically runs some predefined test cases and stores the results as log files (comma-separated txt files). My job is to identify and suggest a reporting system, and I have these options in mind: either
1. Import the logs into MSSQL and use the reporting (SSRS) it provides, or
2. Import the logs into MySQL and use PHP to develop custom reporting.
I am thinking that going with option 2 is better. The reason is that the logs are inconsistent and contain unexpected wild characters that databases normally don't accept, so I can write some scripts in PHP to clean them before loading them into the database.
If this were your problem, what would you suggest?
It depends how fancy you need to be. If the data is in CSV files, you could even go so simple as to load it into Excel (or their favorite spreadsheet tool), and use spreadsheet macros to analyze it.
