Can we create multiple sessions in R? I am not using RStudio or a Shiny app; I am just integrating R and Python using the pip module PypeR. I cannot find any R code showing how to create a session. My requirement is to create a new session for every R request from Python.
Whenever there are two requests to R at the same time, my server gets blocked.
Also, I have used the "PypeR" module to integrate R with Python:
import pyper

# Start a single R subprocess; every r.run() call shares this one session
r = pyper.R(RCMD="/usr/bin/R")

path2script = "path/to/my_script.R"  # placeholder: path of my script file

# Source the script inside the R session and capture its console output
output = r.run("source('" + path2script + "')")
I am a newbie in R, as I have worked mostly in Python. I have searched a lot about this issue, and it seems to me that R runs in only a single session when it is called as a script through Python.
Any help would be appreciated.
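For what it's worth, each pyper.R(...) object wraps its own R subprocess, so one way to get an isolated session per request is to construct a fresh instance for each incoming request instead of sharing one module-level object. Below is a minimal sketch of that idea; the script paths and the thread-per-request setup are hypothetical, not part of the original code:

import threading
import pyper

def handle_request(path2script):
    # Each pyper.R() call starts its own R subprocess, so every
    # request runs in a fresh, isolated R session.
    r = pyper.R(RCMD="/usr/bin/R")
    output = r.run("source('" + path2script + "')")
    print(output)
    # When r goes out of scope, the R subprocess is released.

# Two concurrent requests no longer block each other, because each
# one talks to its own R process (paths are placeholders).
t1 = threading.Thread(target=handle_request, args=("path/to/script_a.R",))
t2 = threading.Thread(target=handle_request, args=("path/to/script_b.R",))
t1.start(); t2.start()
t1.join(); t2.join()

Starting an R process per request adds startup latency, so if that becomes a problem, a small pool of pre-started pyper.R instances is a common compromise.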
Related
I am trying to use an R script as a data source for Power BI. I am a regular user of R but am new to Power BI. When all the datasets imported by the R script come from SQL databases, I can import the resulting dataframes from the R script fine. However, I have a script that uses a .csv file which Power BI's R session can't find, resulting in the error:
Error: 'times_of_day_grid.csv' does not exist in current working directory ('C:/Users/MyUserName/RScriptWrapper_ac2d4ec7-a4f6-4977-8713-10494f4b0c4f').
The .pbix file and the R script are both stored in the same folder as the .csv file.
I have tried manually setting the working directory by inserting this into the script:
setwd("C:/Users/MyUserName/Documents/R/Projects/This Project Folder")
But this just results in the message:
"Connecting - Please wait while we establish a connection to R"
And later if I leave it running:
Unable to connect
We encountered an error while trying to connect.
Details: "ADO.NET: R execution timeout. The script execution was
terminated, since it was running for more than 1800000 miliseconds."
I have also tried specifying the full paths of the .csv files in read_csv(), but this results in the same timeout warning.
Any ideas as to how I can edit my script (or the settings in Power BI) to get around this? (The script only takes a minute or so to run in RStudio.)
Don't forget that you can load your .csv file using the built-in functionality in Power BI via Get Data > Text/CSV, and then go to Edit Queries and handle the R scripting from there. That way you won't have to worry about setting the working directory in the R script at all.
You can even load multiple files and work on each and every one of them using the approach described in Operations on multiple tables / datasets with Edit Queries and R in Power BI.
Please let me know how this works out for you.
I have an R script that does stuff with a bunch of tweets, and I would like to use the same script on the same data, but saved in a Hadoop file system. According to this Hortonworks tutorial I could use R code with data from my HDFS, but it is not quite clear how.
Can I use the very same R script, taking advantage of the MapReduce paradigm, by using this Revolution R? Should I change my code, or is there a way to execute the same functions optimized for a Hadoop architecture?
My wish would be to write my code in a standard R IDE like RStudio and then run it, or most of it, on my cloud services (such as Microsoft Azure) with MapReduce underneath.
Yes, you can run any R script across different data platforms, from Hadoop to Spark to Teradata and SQL Server, by using an environment-specific compute context.
The following two links should help you get started with Revolution R / Microsoft R Server on Hadoop:
https://msdn.microsoft.com/en-us/microsoft-r/scaler-hadoop-getting-started
https://github.com/Azure/Azure-MachineLearning-DataScience/blob/master/Misc/MicrosoftR/Samples/NYCTaxi/NYC2013_MRS_LinearBinary.Rmd
I am using Shiny and R to create a little web app that pulls data from Google BigQuery and spits it out onto the page. It uses the bigrquery package.
When running the script from inside R (source("x.R")), everything runs fine; however, when using Rscript x.R I get the error. I'm trying to set up cron to run the script automatically.
There is a .httr-oauth file in the directory of the script.
This question is similar to another one I answered where I suggested using a Google Service Account for server-to-server authentication using the googleAuthR and bigQueryR packages for R. Please refer to that answer (via the above link) for details including an example R script.
In the end I've decided to just use Python to pull the data from BigQuery and use it in R.
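For anyone taking the same route, here is a minimal sketch of that Python side using the google-cloud-bigquery client library; the key file name and the query are placeholders. Using a service-account key avoids the interactive .httr-oauth flow entirely, which is what makes it cron-friendly:

from google.cloud import bigquery

# Authenticate with a service-account key file instead of an
# interactive OAuth flow, so this works unattended under cron.
client = bigquery.Client.from_service_account_json("key.json")

# Run the query and pull the result into a pandas DataFrame.
df = client.query(
    "SELECT * FROM `my-project.my_dataset.my_table` LIMIT 1000"
).to_dataframe()

# Hand the data to R however you like, e.g. via a CSV on disk
# that the R script then reads with read.csv().
df.to_csv("bigquery_result.csv", index=False)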
I am new to OpenCPU and I am trying this:
Create a script to load a dataframe into memory (on the server, of course).
Give a method to query this dataframe through a GET API.
Can this be done for a large dataframe, so that it stays loaded once? And can this be done without writing an R package (which is the only way I have found so far to access R scripts through OpenCPU)?
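For context on what the OpenCPU HTTP API looks like from the client side, here is a hedged sketch using Python's requests; the package name mypackage, the dataset mydata, the function query_df, and the server URL are all hypothetical, and the package-based route (rather than a loose script) is assumed:

import requests

base = "http://localhost:5656/ocpu"  # hypothetical local OpenCPU server

# GET a dataset that ships inside an installed R package, as JSON.
resp = requests.get(base + "/library/mypackage/data/mydata/json")
rows = resp.json()
print(len(rows), "rows")

# Or POST to an R function in the package and ask for the result
# as JSON in a single call by appending /json to the endpoint.
resp = requests.post(
    base + "/library/mypackage/R/query_df/json",
    data={"min_value": 10},  # hypothetical function argument
)
print(resp.json())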
Tableau recently added R connectivity in release 8.1. I want to know if there is any way I can pull an entire table created in R into Tableau, or an .rds object which contains the dataset.
There is a tutorial on the Tableau website for this, and a blog post on R-bloggers which discusses it. The tutorial has a number of comments, and one of them (in early Dec, I think) asks how to get an .rds file in. You need to start Rserve and then execute a script on it to get your data.
Sorry I can't be more help, as I only looked into it briefly and put it on the back-burner, but if you get stuck they seem to come back quickly if you post a comment on the page:
http://www.tableausoftware.com/about/blog/2013/10/tableau-81-and-r-25327
Just pointing out that the Tableau Data Extract API might be useful here, even if the current version of R integration doesn't yet meet your needs. (Note that the link is to the version 8.1 docs released in late 2013, so look for the latest version to see what functionality they've added since.)
If what you want to do is to manipulate data in R and then send a table of data to Tableau for visualization, you could first try the simple step of exporting the data from R as a CSV file and then visualizing that data in Tableau. I know that's not sexy, but it's always good to make sure you've got a way to get the output result you need before investing time in optimizing the process.
If that gets the effect you want, but you just want to automate more of the steps, then take a look at the Tableau Data Extract API. You could use that library to generate a Tableau Data Extract instead of a CSV file. If you have something in production that needs updates, then you could presumably create a Python script or JVM program to read your .rds file periodically and generate a revised extract, as sketched below.
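A minimal sketch of the periodic-refresh idea, assuming the third-party pyreadr package and placeholder file names; writing a CSV keeps it simple, and swapping in the Tableau extract library would be the natural next step:

import pyreadr

# read_r returns a dict-like mapping of object names to pandas
# DataFrames; an .rds file holds a single, unnamed object, which
# pyreadr stores under the key None.
result = pyreadr.read_r("dataset.rds")  # placeholder file name
df = result[None]

# Write a CSV that Tableau can connect to and refresh from; schedule
# this script with cron or Task Scheduler for periodic updates.
df.to_csv("dataset_for_tableau.csv", index=False)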
Let us assume your data.frame / tibble (say, a dataset object) is ready in R / RStudio and you want to connect it with Tableau.
1. In RStudio (or R terminal), execute the following steps:
install.packages("Rserve")
library(Rserve)
Rserve() ## This gets the R connection service up and running
2. Now go to Tableau (I am using 10.3.2):
Help > Settings and Performance > Manage External Service Connection
Enter localhost in the Server field and click on Test Connection.
You have now established a connection between R and Tableau.
Come back to RStudio. Now we need a .rdata file that will contain our R object(s), in this case dataset. This is the R object that we want to use in Tableau. Enter this in the R console:
save(dataset, file="objectName.rdata")
4. Switch to Tableau now.
Connect To a File > Statistical File
Go to your working directory where the newly created objectName.rdata resides. From the drop-down list of file types, select R files (*.rdata, *.rda) and select your object. This will open the object you created in R in Tableau. Alternatively, you can drag and drop your object directly into Tableau's workspace.