Accessing MonetDBLite outside R

I created a big database using MonetDBLite in R and now I want to access the database outside R (e.g., with a general database GUI).
Is there a way to do this? I do not want to replicate the data, as I still want to access it through R. I do not need to access the database concurrently from R and a SQL GUI, but I do want to be able to switch between the two as needed.
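For reference, a minimal sketch of the R side of this setup, assuming the MonetDBLite package; the directory path is a placeholder:

library(DBI)

# MonetDBLite persists the database in an ordinary directory on disk
con <- dbConnect(MonetDBLite::MonetDBLite(), "/path/to/dbdir")

dbWriteTable(con, "mtcars", mtcars)
dbGetQuery(con, "SELECT COUNT(*) FROM mtcars")

# Shutting down releases the directory lock so another process can attach
dbDisconnect(con, shutdown = TRUE)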

Related

How to connect R to SQL Azure?

I have two databases in Azure, each with 5 tables. I can do the data wrangling inside Azure using Kusto, but I would prefer to use RStudio. I want to connect R to Azure so that I can run a script in R and get the results back without importing the actual datasets.
Please help, where do I start? I have zero knowledge of such connections.
Thank you in advance.
Assuming you have already installed R and RStudio, follow the steps below:
Open the ODBC Data Source Administrator from the Start menu and add a User Data Source under the 'User DSN' tab. Follow the wizard through to the finish and test the connection.
Go to RStudio and create a new connection; you should see the data source added in the step above. You should then see the connection and the tables listed under the Azure SQL database you connected to.
Run a query like the one below in the console:
dbGetQuery(con, "SELECT * from dbo.xxxx")
You should see the result set accordingly. You can then adapt the queries however you want.
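The same connection can also be made directly in code with DBI and odbc; a minimal sketch, assuming a DSN named "AzureSqlDsn" was created in the step above (the DSN name, credentials, and table name are placeholders):

library(DBI)
library(odbc)

# Connect through the User DSN created in the ODBC Data Source Administrator
con <- dbConnect(odbc::odbc(),
                 dsn = "AzureSqlDsn",   # placeholder DSN name
                 uid = "my_user",       # placeholder credentials
                 pwd = "my_password")

dbListTables(con)                       # list the tables the connection exposes
result <- dbGetQuery(con, "SELECT TOP 10 * FROM dbo.my_table")

dbDisconnect(con)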

How to speed up loading data at start of shiny app

I'm pretty new to using Shiny apps to visualize data. We plan to host our Shiny app on our own server, so we used Docker to deploy it. However, the app is very slow to load, since we have to load a lot of big data frames (up to 10,000,000 rows x 10 columns) that are saved within an RData object.
My first question is: Will the data be loaded each time a user visits/reloads the website?
I was looking into ways to speed up loading the data. One possibility might be to use the feather package, which seems to be faster at loading data tables.
Another option would be to put the data into a database. However, I have no experience with that. I saw there are some nice packages like DBI and RMariaDB that seem to work well with Shiny apps. However, I only find examples where an external database is queried. Is it possible to pack a MySQL database into the Docker image and access it from within the Shiny app? Or is the normal procedure to host the database externally?
I'm really new to all this, so I'm not even sure I'm asking the right questions. These are the conditions: we have a lot of data in the form of multiple data tables. Those need to be read into our app quickly and need to be queried quickly through interactive user input. We need to dockerize our app in order to deploy it. What is the best approach here?
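As a point of comparison, a minimal sketch of the feather route mentioned above; the object and file names are placeholders, and the one-off conversion would happen outside the app:

library(feather)

# One-off conversion from the RData object to Feather, done offline
write_feather(big_df, "big_df.feather")

# In the app (e.g. in global.R), so it runs once per R process
# rather than once per user session:
big_df <- read_feather("big_df.feather")

Note that code placed outside the server function (or in global.R) runs once per R process at startup, not on every visit or reload, which bears on the first question above.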

Is it possible to manage R sessions?

Is it possible to manage R sessions, as in:
Connect your R console to an existing R session process?
Can two R sessions transfer data to one another?
One might desire this in the following likely scenario:
You're happily working on your R project and have generated data that took 3 hours to compute.
You decide to save your workspace in the case of a technical issue.
Upon saving, however, RStudio decides to hang for eternity, leaving the R session itself unaffected.
In this scenario, you would want to
Connect to the R session with a terminal to retrieve your data anyway.
Set up a new R session that continuously synchronizes with the existing R session, as a backup session.
Is it possible?
Connect your R console to an existing R session process?
Not possible.
Can two R sessions transfer data to one another?
Yes, there are multiple ways to do this. The general keyword here is “inter-process communication”. You can use files, named pipes or sockets, for example. To serialise the data you can use either built-in functions (saveRDS, readRDS) or packages (e.g. feather). A minimal socket-based sketch follows below.
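A minimal sketch of the socket route, using base R only; the object name is a placeholder and the port number is arbitrary:

## Session A (sender): wait for a connection, then serialise an object
con <- socketConnection(port = 6011, server = TRUE, blocking = TRUE, open = "a+b")
serialize(my_data, con)
close(con)

## Session B (receiver): connect and unserialise the object
con <- socketConnection("localhost", port = 6011, blocking = TRUE, open = "a+b")
my_data <- unserialize(con)
close(con)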
But for your given use-case, there’s a much simpler solution:
Never rely on RStudio to save your R session. Instead, do so explicitly by calling saveRDS (or, to save the whole workspace, which I don’t generally recommend, save.image). In fact, the general recommendation is to disable the RStudio options for saving and restoring the session!
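For instance, a minimal sketch of saving explicitly; the object and file names are placeholders:

# Save the expensive result explicitly as soon as it is computed
saveRDS(expensive_result, "expensive_result.rds")

# Restore it in any fresh R session later
expensive_result <- readRDS("expensive_result.rds")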
Make sure your RStudio preferences reflect this: under Tools > Global Options > General, untick “Restore .RData into workspace at startup” and set “Save workspace to .RData on exit” to “Never”.

Trouble accessing recently created ODBC table

I have started using the DBI and RODBC packages as a way to talk to the ODBC interface and write a data frame as an ODBC table to be accessible during queries.
My problem is that, while I can write a table using either dbWriteTable or sqlSave, I can't access it.
When I explore the available tables on my ODBC connection, my test table appears in my personal schema, but when I try to access it via SELECT, or even DESC, I get a "table or view does not exist" error.
The problem is only with accessing the table, because I can update or remove it just fine using either the R ODBC packages or SQL Developer.
PS: If I create the table using the import function in SQL Developer, I can access it without problems, but my goal is to access it after writing it with an R function.
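For context, a minimal sketch of the workflow being described, here using DBI with the odbc backend; the DSN and all names are placeholders:

library(DBI)
library(odbc)

con <- dbConnect(odbc::odbc(), dsn = "MyOracleDsn")   # placeholder DSN

# Write an R data frame as a database table
dbWriteTable(con, "TEST_TABLE", mtcars)

# The step that fails for the asker:
dbGetQuery(con, "SELECT * FROM TEST_TABLE")

dbDisconnect(con)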

R dataset connection to Tableau

Recently Tableau added R connectivity in release 8.1. I want to know if there is any way I can pull an entire table created in R into Tableau, or an .rds object which contains the dataset.
There is a tutorial on the Tableau website for this and a blog post on r-bloggers which discusses it. The tutorial has a number of comments, and one of them (in early Dec, I think) asks how to get an rds file in. You need to start Rserve and then execute a script on it to get your data.
Sorry I can't be more help as I only looked into it briefly and put it on the back-burner but if you get stuck they seem to come back quickly if you post a comment on the page:
http://www.tableausoftware.com/about/blog/2013/10/tableau-81-and-r-25327
Just pointing out that the Tableau Data Extract API might be useful here, even if the current version of R integration doesn't yet meet your needs. (Note, that link is to the version 8.1 docs released in late 2013 - so look for the latest version to see what functionality they've added since)
If what you want to do is to manipulate data in R and then send a table of data to Tableau for visualization, you could first try the simple step of exporting the data from R as a CSV file and then visualizing that data in Tableau. I know that's not sexy, but it's always good to make sure you've got a way to get the output result you need before investing time in optimizing the process.
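The CSV route is a one-liner in R; a minimal sketch, with object and file names as placeholders:

# Export the data frame as CSV, then point Tableau at the file
write.csv(dataset, "dataset.csv", row.names = FALSE)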
If that gets the effect you want, but you just want to automate more of the steps, then take a look at the Tableau Data Extract API. You could use that library to generate a Tableau Data Extract instead of a CSV file. If you have something in production that needs updates, then you could presumably create a Python script or JVM program to read your RDS file periodically and generate a revised extract.
Let us assume your data.frame/tibble (say, an object named dataset) is ready in R/RStudio and you want to connect it with Tableau.
1. In RStudio (or R terminal), execute the following steps:
install.packages("Rserve")
library(Rserve)
Rserve() ## This gets the R connection service up and running
2. Now go to Tableau (I am using 10.3.2):
Help > Settings and Performance > Manage External Service Connection
Enter localhost in the Server field and click on Test Connection.
You have now established a connection between R and Tableau.
3. Come back to RStudio. Now we need a .rdata file that will consist of our R object(s), in this case dataset. This is the R object that we want to use in Tableau. Enter this in the R console:
save(dataset, file="objectName.rdata")
4. Switch to Tableau now.
Connect To a File > Statistical File
Go to your working directory, where the newly created objectName.rdata resides. From the drop-down list of file types, select R files (*.rdata, *.rda) and select your object. This will open the object you created in R in Tableau. Alternatively, you can drag and drop your object directly onto Tableau's workspace.
