How to run the entire code in a .sql file using RStudio? - r

I have dynamic code in a .sql file that is hard to implement using RSQLite/sqldf.
Can we do something that runs the entire code in SQL Server itself? The only things that would happen through R are triggering the code and transferring the first major dataset into SQL Server.
I have seen
sqlQuery()
and
sqldf()
but have not found any way to actually run the entire .sql file in SQL Server itself.
Thanks
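For what it's worth, here is a minimal sketch of one way to do this, assuming an ODBC DSN for your SQL Server, a data frame my_dataframe to ship over, and that the script separates batches with GO (a client-side keyword the server itself does not understand):
library(DBI)
con <- dbConnect(odbc::odbc(), dsn = "SQLServerDSN")   # DSN name is an assumption

# transfer the first major dataset into SQL Server
dbWriteTable(con, "input_data", my_dataframe, overwrite = TRUE)

# read the whole .sql file and run it on the server batch by batch, splitting on GO
sql_text <- paste(readLines("script.sql"), collapse = "\n")
batches  <- strsplit(sql_text, "(?mi)^\\s*GO\\s*$", perl = TRUE)[[1]]
for (b in batches[nzchar(trimws(batches))]) {
  dbExecute(con, b)
}
dbDisconnect(con)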

Related

Bulk upload using csv file to remote MonetDB server

I have a remote MonetDB server running and I want to bulk upload a CSV file, as that is much faster than individual inserts.
Based on the params in MonetDB.R, there is a csvdump=TRUE option, but I don't think it works when you are running against a remote server; the server has to be local.
https://rdrr.io/github/MonetDB/monetdb-r/man/dbWriteTable.html
First, am I correct that I can't do this, and if so, is there a workaround? I have a data frame with 5M+ rows, so it takes a long time with insert statements rather than using COPY INTO.
When I try csvdump=TRUE against the remote server, it can't find the CSV file because it is local to the computer that called the dbWriteTable command.
I think you are right. As a workaround, either use explicit COPY INTO ... ON CLIENT SQL statements, or first use some file-transfer tool to copy the file to the remote server before calling dbWriteTable.
MonetDB's documentation on COPY INTO reads:
FROM files ON SERVER
With ON SERVER, which is the default, the file name must be an
absolute path on the system on which the database server (mserver5) is
running. ...
Interestingly enough, pymonetdb, the Python driver for MonetDB, uses ON CLIENT for bulk loads. From pymonetdb's docs:
File Uploads and Downloads
Classes related to file transfer requests as used by COPY INTO ON
CLIENT.
You might want to file an issue for the MonetDB R-driver project to have similar behavior as pymonetdb.
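In the meantime, a sketch of the workaround (host, paths, table name, and the scp transfer step are assumptions; older versions of the driver may need dbSendUpdate() instead of dbExecute()):
library(DBI)
con <- dbConnect(MonetDB.R::MonetDB.R(), host = "dbserver",
                 dbname = "demo", user = "monetdb", password = "monetdb")

write.csv(big_df, "big.csv", row.names = FALSE)    # the 5M+ row data frame
system("scp big.csv user@dbserver:/tmp/big.csv")   # any file transfer tool works

# OFFSET 2 skips the header line that write.csv produced
dbExecute(con, "COPY OFFSET 2 INTO mytable FROM '/tmp/big.csv'
                USING DELIMITERS ',','\\n','\"'")
dbDisconnect(con)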

Pentaho, R Executor insert into DB

It is possible to run an R script with Pentaho, but instead of exporting the result as a CSV file, can I insert the result directly into a table in a DB?
Using the Community Edition of Pentaho, you could use a script executor step to execute a shell script in your OS that does all the work, including inserting into the database. That is not much Pentaho-related: all the work is done by the shell script, and you just use Pentaho to call the execution of that script (a sketch of such a script follows).
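For instance, the shell script could be little more than an Rscript call on something like this (a minimal sketch; the DSN, table, and column names are assumptions):
# insert_results.R -- run as: Rscript insert_results.R
library(DBI)
con <- dbConnect(odbc::odbc(), dsn = "warehouse")   # DSN is an assumption
df  <- read.csv("input.csv", stringsAsFactors = FALSE)
df$total <- df$qty * df$price                       # placeholder transformation
dbWriteTable(con, "results_table", df, append = TRUE)
dbDisconnect(con)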
There's also a very old plugin available on GitHub, which I don't know whether it works with modern versions of Pentaho and R, to execute R code within Pentaho and then continue the stream of data to "normal" steps like Table output to insert the data into a table.
These are the details to configure that plugin from the developers:
http://dekarlab.de/wp/?p=5

Compile PL/SQL stored procedures in SQL developer from external files

I have a requirement where I have around 50 stored procedures saved in 50 different files with the .pls extension in a Windows directory. I have SQL Developer installed on the same machine. I want to compile all of these stored procedures in SQL Developer. Note: I don't want to execute them, only compile them. Please suggest a solution for this.
I have tried this, but it didn't work.
Created a compile.sql file with the contents below.
@@"U:\Stored_Procedures\PRC_LOAD_TBL.pls"
exit;
Created a compile.bat file with the contents below.
sqlplus -s -l <username>/<pswd>@<servicename> @"U:\Stored_Procedures\Compile.sql"
Tried to run the compile.bat batch file, but it didn't work.
Also, I tried to run these from SQL Developer directly, but that didn't work either.
"Tried to run compile.bat batch file but it didn't work."
You tried to run a DOS batch file in SQL Developer? I think you're over-engineering this.
All you need to do is open your first script, compile.sql, in SQL Developer and then click Run Script (or press the F5 key).
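If maintaining that master script for 50 files by hand gets tedious, a small R snippet (a sketch; the directory path is an assumption) can generate one @ line per .pls file:
# build Compile.sql with one @ line per .pls file in the folder
files <- list.files("U:/Stored_Procedures", pattern = "\\.pls$", full.names = TRUE)
lines <- sprintf('@"%s"', normalizePath(files, winslash = "\\", mustWork = FALSE))
writeLines(c(lines, "exit;"), "U:/Stored_Procedures/Compile.sql")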

Execute R script from SSIS Package

I want to execute R code from an SSIS package. How can I add a data control step that executes R code? SSIS script tasks support only VB.NET and C#.
SSIS has many data transformations available, but R is very friendly when it comes to data manipulation.
I want to run R code from SSIS scripts or some other way. Basically, I'm trying to integrate R into an ETL process.
I want to extract data (E) from a CSV file, transform (T) it in R, and load (L) it into a Microsoft database.
Is it possible to get this workflow done in an SSIS package by executing an R script using SSIS data control items? Thanks!
Here are a couple of ways you could integrate R into your ETL process.
Crude, fast and dirty - Execute Process Task in the Control Flow. This would be similar to calling Rscript from the command line. You would likely make your transformation, save it to a file on disk, and get that filename from your Execute Process Task so you can feed it into a Data Flow task (see the sketch after this list). The upside is that you keep your R clean and separate from your C#/VB.
Integrated via RDotNet - You could use the RDotNet library (I believe; I haven't tried to integrate it). You would need to register the DLLs in the GAC, and then you can either work with .NET objects in your SSIS scripts or call R scripts directly.
Integrated in SQL Server 2016 - Microsoft has added R support via extended stored procedures. You call the R script via a stored proc and use a SQL query for input data, and you can store the output. See more detail here. This would mean utilizing an Execute SQL Task in SSIS.
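A sketch of the script behind option 1 (the Execute Process Task would run Rscript.exe transform.R <input.csv> <output.csv>; the file names arrive as command-line arguments, and the column names are assumptions):
# transform.R -- called by an Execute Process Task as:
#   Rscript.exe transform.R <input.csv> <output.csv>
args   <- commandArgs(trailingOnly = TRUE)
input  <- args[1]
output <- args[2]
df <- read.csv(input, stringsAsFactors = FALSE)
# placeholder transformation; the sales and region columns are assumptions
out <- aggregate(sales ~ region, data = df, FUN = sum)
write.csv(out, output, row.names = FALSE)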
I hope this helps you or someone else. Since you want data processing, you could bring your dataset into a CSV file (through a Data Flow task) and execute your R file with Rscript (it can be run as a command with the Execute Process Task). Inside the file, load the dataset into a data frame (e.g. with read.csv()), do all the math/calculations you need, write the data or the results to a CSV file, and read it back in from SSIS.
It is not an elegant solution, but it works :). At least until Microsoft integrates R as a control/data flow process.
CYA
PS: here is how to execute files from the command line: Run R script from command line

How to update MYSQL Database with data from Excel using ASP.NET?

I have an Excel file which gets updated every 10 seconds through an automated process. I need the Excel data to be pushed to a MySQL database located on a remote server.
How do I do that?
I have thought of the following approach:
1) Every 11 seconds, an Excel macro will run and "Save As" the workbook as a CSV file. (Not sure whether a macro can do this... just thinking.)
2) We will FTP this CSV file to the remote server using a Windows service.
3) On the remote server, we will parse the CSV file and update the MySQL database.
Is this approach fine? Or do you have a better approach that requires less time to update the database?
Thanks!
I found the following links useful:
http://www.heritage-tech.net/908/inserting-data-into-mysql-from-excel-using-vba/
http://vbaexcel.eu/vba-macro-code/update-mysql-database-php
I hope this helps someone with a similar problem to mine.
You can connect to the Excel spreadsheet using an ODBC connection, read the data, and post it to the MySQL database, maybe through some sort of web service access, or via a saved CSV file?
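The parse-and-post step could look something like this (a sketch in R rather than ASP.NET, just to illustrate the flow; host, credentials, file, and table names are assumptions):
library(DBI)
df  <- read.csv("C:/exports/latest.csv", stringsAsFactors = FALSE)
con <- dbConnect(RMariaDB::MariaDB(), host = "remote-host",
                 user = "app", password = "secret", dbname = "prod")
dbWriteTable(con, "readings", df, append = TRUE)    # table name is an assumption
dbDisconnect(con)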
