R dataset connection to Tableau

Tableau recently added R connection functionality in release 8.1. I want to know if there is any way I can bring an entire table created in R into Tableau, or an .rds object that contains the dataset.

There is a tutorial on the Tableau website for this and a blog post on r-bloggers which discusses it. The tutorial has a number of comments, and one of them (in early December, I think) asks how to get an .rds file in. You need to start Rserve and then execute a script on it to get your data.
Sorry I can't be more help, as I only looked into it briefly and put it on the back-burner, but if you get stuck they seem to reply quickly if you post a comment on the page:
http://www.tableausoftware.com/about/blog/2013/10/tableau-81-and-r-25327

Just pointing out that the Tableau Data Extract API might be useful here, even if the current version of the R integration doesn't yet meet your needs. (Note that the link is to the version 8.1 docs released in late 2013, so look for the latest version to see what functionality has been added since.)
If what you want to do is manipulate data in R and then send a table of data to Tableau for visualization, you could first try the simple step of exporting the data from R as a CSV file and then visualizing that data in Tableau. I know that's not sexy, but it's always good to make sure you have a way to get the output you need before investing time in optimizing the process.
If that gets the effect you want but you just want to automate more of the steps, then take a look at the Tableau Data Extract API. You could use that library to generate a Tableau Data Extract instead of a CSV file. If you have something in production that needs updates, you could presumably create a Python script or JVM program to read your .rds file periodically and generate a revised extract.
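If you try the CSV route first, the R side is just a line or two. Here is a minimal sketch, assuming your result sits in a data frame called my_table (a placeholder name, e.g. read back from your .rds file):
my_table <- readRDS("my_data.rds")  # or however you built the table in R
write.csv(my_table, "my_table.csv", row.names = FALSE)
Then connect Tableau to my_table.csv as a text file data source.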

Let us assume your data.frame/tibble (say, an object called dataset) is ready in R/RStudio and you want to connect it with Tableau.
1. In RStudio (or R terminal), execute the following steps:
install.packages("Rserve")
library(Rserve)
Rserve() ##This gets the R connection service up and running
2. Now go to Tableau (I am using 10.3.2):
Help > Settings and Performance > Manage External Service Connection
Enter localhost in the Server field and click on Test Connection.
You have now established a connection between R and Tableau.
3. Come back to RStudio. Now we need an .rdata file that will contain our R object(s), in this case dataset, the object we want to use in Tableau. Enter this in the R console:
save(dataset, file="objectName.rdata")
4. Switch to Tableau now.
Connect To a File > Statistical File
Go to the working directory where the newly created objectName.rdata resides. From the drop-down list of file types, select R files (*.rdata, *.rda) and select your file. This will open the object you created in R in Tableau. Alternatively, you can drag and drop the file directly onto Tableau's workspace.

Related

How to connect R to SQL Azure?

I have two databases in Azure and each has 5 tables. I can perform data wrangling inside Azure using Kusto, but I would prefer using RStudio. I wish to connect R with Azure so that I can run a script in R and return results without importing the actual datasets.
Please help, where do I start? I have zero knowledge of such connections.
Thank you in advance.
Assuming you have already installed R and RStudio, please follow the steps below:
Open ODBC Data Sources from the Start menu and add a User Data Source under 'User DSN'. Follow through the next buttons until you finish, and test the connection.
Go to RStudio and create a new connection; you should see the data source added in the step above. Once connected, you should see the connection and its tables listed under the Azure SQL database.
Run a query like the one below in the console:
dbGetQuery(con, "SELECT * from dbo.xxxx")
You should see the result set returned. You can play with the queries however you want.
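If you would rather set the connection up in code than through the RStudio Connections pane, here is a minimal sketch using the DBI and odbc packages; the DSN name, credentials and query are placeholders and assume the User DSN created above:
library(DBI)
library(odbc)
con <- dbConnect(odbc::odbc(), dsn = "AzureSqlDsn",   # name of the User DSN you created
                 uid = "your_user", pwd = "your_password")
head(dbGetQuery(con, "SELECT * FROM dbo.xxxx"))       # placeholder query
dbDisconnect(con)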

How to change working directory/read from local csv in an R script used as a data source in Power BI?

I am trying to use an R script as a data source for Power BI. I am a regular user of R but am new to Power BI. When all the datasets imported by the R script come from SQL databases, I can import the resulting data frames from the R script fine. However, I have a script that uses a .csv file that Power BI's R session can't find, which results in the error:
Error: 'times_of_day_grid.csv' does not exist in current working directory ('C:/Users/MyUserName/RScriptWrapper_ac2d4ec7-a4f6-4977-8713-10494f4b0c4f').
The .pbix file and the R script are both stored in the same folder as the .csv file.
I have tried manually setting the working directory by inserting the following into the script:
setwd("C:/Users/MyUserName/Documents/R/Projects/This Project Folder")
But this just results in the message
"Connecting - Please wait while we establish a connection to R"
And later if I leave it running:
Unable to connect
We encountered an error while trying to connect.
Details: "ADO.NET: R execution timeout. The script execution was
terminated, since it was running for more than 1800000 miliseconds."
I have also tried specifying the full paths of the .csv files in read_csv(), but this results in the same timeout warning.
Any ideas as to how I can edit my script (or the settings in Power BI) to get around this? (The script only takes a minute or so to run in RStudio.)
Don't forget that you can load your .csv file using the built-in functionality in Power BI (Get Data > Text/CSV) and then go to Edit Queries and handle the R scripting from there. That way you won't have to worry about setting the working directory in the R script at all.
You can even load multiple files and work on each and every one of them using the approach described in Operations on multiple tables / datasets with Edit Queries and R in Power BI.
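As a minimal sketch of the Edit Queries approach: when you add a Run R Script step to a query, Power BI passes the current table to your script as a data frame named dataset and picks up any data frame you create as the step's output, so no file paths or setwd() calls are needed (the transformation below is a placeholder):
# 'dataset' is supplied by Power BI's Run R Script step
output <- dataset
output$row_id <- seq_len(nrow(output))  # replace with your own transformation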
Please let me know how this works out for you.

Execute R script from SSIS Package

I want to execute R code from an SSIS package. How can I add a data control step that executes R code? SSIS script tasks support only VB.NET and C#.
SSIS has many data transformations available, but R is very friendly when it comes to data manipulation.
I want to run R code from SSIS scripts or some other way. Basically, I'm trying to integrate R into the ETL process.
I want to extract (E) data from a CSV file,
transform (T) it in R, and load (L) it into a Microsoft database.
Is it possible to get this workflow done in an SSIS package by executing an R script using SSIS data control items? Thanks!
Here are a couple of ways you could integrate R into your ETL process.
Crude, fast and dirty - Execute Process Task in the Control Flow. This would be similar to calling Rscript from the command line. You would likely make your transformation, save the result to a file on disk, and get that filename from your Execute Process Task so you can feed it into a Data Flow Task. The upside is that you're keeping your R clean and separate from your C#/VB.
Integrated via RDotNet - you could use the RDotNet library (I believe; I haven't tried to integrate it). You would need to register the DLLs in the GAC, and then you can either work with .NET objects in your SSIS scripts or call R scripts directly.
Integrated in SQL Server 2016 - Microsoft has added R support via extended stored procedures. You call the R script via stored proc and use a sql query for input data and can store the output. See more detail here. This would mean utilizing an Execute SQL task in SSIS.
I hope this helps you or someone else. Since you want data processing, you could bring your dataset into a CSV file (through a Data Flow Task), execute your script file with Rscript (it can be run as a command with the Execute Process Task), load the dataset into a data frame inside the script (for example with read.csv()), do all the math/calculations you need, write the data or calculation results to a CSV file, and read it back in from SSIS.
It is not an elegant solution, but it works :), at least until Microsoft integrates R as a control/data flow process.
CYA
PS: here is how to execute files from the command line: Run R script from command line
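For what it's worth, a minimal sketch of such a script (paths, column names and the calculation are placeholders): SSIS writes input.csv with a Data Flow Task, an Execute Process Task runs the script via Rscript, and a second Data Flow Task reads output.csv back in.
# transform.R - invoked from an SSIS Execute Process Task via Rscript
input  <- read.csv("C:/etl/input.csv", stringsAsFactors = FALSE)
output <- aggregate(amount ~ category, data = input, FUN = sum)  # placeholder calculation
write.csv(output, "C:/etl/output.csv", row.names = FALSE)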

R integration with Tableau

I am facing difficulty in integrating R with Tableau.
Initially, when I created a calculated field, it was asking for the Rserve package in R and was not allowing me to drag the field to the worksheet. I have installed this package, but it still shows an error saying:
"Error occurred while communicating with the Rserve service. Tableau is unable to connect to the service. Verify that the server is running and that you have access privileges."
Any inputs? Thank you.
You need to start Rserve. If you have successfully installed the Rserve package, simply run this (in RGui, RStudio, or wherever you run R scripts):
> library(Rserve)
> Rserve()
You can test your connection to Rserve in Tableau under Help > Settings and Performance > Manage R Connection.
As of Tableau 9, you can use *.rdata files with Tableau. Tableau 9 will read the first item stored in the *.rdata file. Just open an *.rdata file under "Statistical Files" in the Tableau intro screen.
To do this:
save(myDataframe, file = "Myfile.rdata")
This will save the file with the data frame stored in it. You can save as many items as you want, but Tableau will only read the first. It can read vectors and variables as well if they are in the first item. Note that .rdata files compress data quite a bit; I recently compressed 900 MB down to 25 MB. However, Tableau will still need to decompress it to use it, so be careful about memory.
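If the file size matters, save() also lets you pick the compression method explicitly; a small sketch with placeholder names:
save(myDataframe, file = "Myfile.rdata", compress = "xz")  # default is "gzip"; "bzip2" and "xz" are also available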

Reading LabVIEW TDMS files with R

As part of a transition from MATLAB to R, I am trying to figure out how to read TDMS files created with National Instruments LabVIEW using R. TDMS is a fairly complex binary file format (http://www.ni.com/white-paper/5696/en/).
Add-ons exist for Excel and OpenOffice (http://www.ni.com/white-paper/3727/en/), and I could make something in LabVIEW to do the conversion, but I am looking for a solution that would let me read the TDMS files directly into R. This would allow us to test out the use of R for certain data processing requirements without changing what we do earlier in the data acquisition process. Having a simple process would also reduce the barriers for others trying out R for this purpose.
Does anyone have any experience with reading TDMS files directly into R, that they could share?
This is far from supporting the entire TDMS specification, but I started a port of the Python npTDMS package to R here: https://github.com/msuefishlab/tdmsreader, and it has been tested out in the context of a Shiny app.
You don't say whether you need to automate the reading of these files using R, or just convert the data manually. I'm assuming you or your colleagues don't have any access to LabVIEW yourselves, otherwise you could just create a LabVIEW tool to do the conversion (and build it as a standalone application or DLL, if you have the Professional Development System or Application Builder; you could run the built app from your R code by passing parameters on a command line).
The document at your first link refers to (a) add-ins for OpenOffice Calc and for Excel, which should work for a manual conversion and which you might be able to automate using those programs' respective macro languages, and (b) a C DLL for reading TDMS. Would it be possible for you to use one of those?
