Using SQL Server db as data source for R Shiny application

I'm new to R Shiny and I would like to host my application on shinyapps.io. My app.R file gets data from my local SQL Server database, and I then use R to manipulate that data for my Shiny app. When I upload to shinyapps.io I get error code 1, which seems rather ambiguous based on my Google searches.
The current workflow for my project is:
1. Scrape an API and clean/manipulate the data in R.
2. Using R and the dbWriteTable function, write the cleaned data into my local SQL Server Express database. Some data is overwritten, some is appended. (Steps 1 and 2 are automated to run either hourly or daily using Windows Task Scheduler.)
3. Access the data in SQL Server and store it in R for my app, using code like the following for the various SQL tables. I am not using any SQL queries in R to manipulate the data before it is stored in R.
library(DBI)
library(odbc)
library(dplyr)   # tbl() on a database connection comes from dplyr (dbplyr must be installed)

con <- dbConnect(odbc(),
                 Driver = "SQL Server",
                 Server = "laptop\\SQLEXPRESS",
                 Database = "myDB",
                 Trusted_Connection = "True")

tradingLog <- as.data.frame(tbl(con, "tradingLog"))
The above code is repeated within the app.R script for the different SQL tables that supply the app's data. My novice understanding is that relying on a local database will be a problem when I host the app on shinyapps.io: the application is fully functional on my computer but breaks on shinyapps.io, which leads me to believe I need to host my SQL database somewhere accessible.
To accomplish this I think I can use Google Cloud, and I have found some resources on how to import my SQL database there. If Google Cloud is a viable option I would prefer it for this project, because my application already uses RgoogleMaps and I have already set up a Google Cloud account. From there, I assume I can change the driver and server settings in the dbConnect code above, and this may address the shinyapps.io problem?
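If the database does move to a cloud host, a minimal sketch of what the revised connection might look like, assuming SQL authentication instead of Windows authentication (the driver name, IP address, and credentials below are placeholders, not values from this post):

library(DBI)
library(odbc)

# Hypothetical cloud-hosted SQL Server connection; Trusted_Connection (Windows auth)
# is swapped for SQL credentials because the Shiny server is not on the local network.
con <- dbConnect(odbc(),
                 Driver   = "ODBC Driver 17 for SQL Server",  # assumed driver name
                 Server   = "xx.xx.xx.xx",                    # public IP of the hosted instance
                 Database = "myDB",
                 UID      = "db_user",                        # placeholder credentials
                 PWD      = "db_password",
                 Port     = 1433)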
Any advice or direction on how to address the problem would be appreciated. My goal is to share this project with team members via shinyapps.io while still accessing the SQL Server database that my scheduled scraping tasks feed with new data.

Try this: Database basics - dplyr and DBI

Related

RODBC through front end to SharePoint

I have an R script that uses the RODBC package to connect to an .accdb file, allowing me to pull data directly into R. The .accdb file was recently reconfigured so that the tables reside on SharePoint and we access them through the front-end .accdb file. Now, when I connect to the front-end, I am no longer able to pull data from the tables/queries. If I run sqlFetch() I get an error "42S02 - 1305" that says it could not find the object InsertTableOrQueryName, and that if the object is not a local object I should check my network connections or contact the server administrators. I have a network connection: I can open the front-end and run the queries/access the tables. My administrators aren't any help. Does anyone know how I can get this working again, so I don't have to keep opening the front-end, running the queries, and saving to an intermediate Excel file?
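For context, a minimal sketch of the RODBC workflow being described (the file path is a placeholder):

library(RODBC)

# Connect to the front-end Access database (path is a placeholder)
ch <- odbcConnectAccess2007("C:/path/to/frontend.accdb")

# Pull a table or saved query into a data frame
df <- sqlFetch(ch, "InsertTableOrQueryName")

odbcClose(ch)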

How to make Azure batch see my data files?

I am new to Azure Batch. I am trying to use R in parallel with Azure Batch in RStudio to run code on a cluster. I am able to start the cluster successfully and get the example code to work properly. When I try to run my own code, I get an error saying the cluster nodes cannot find my data files. Do I have to change my working directory to Azure Batch somehow?
Any information on how to do this is much appreciated.
I have figured out how to get Azure Batch to see my data files. I'm not sure if this is the most efficient way, but here is what I did:
1. Download Microsoft Azure Storage Explorer, which runs on my local computer.
2. Connect to my Azure storage using the storage name and primary storage key found in the Azure portal.
3. In Microsoft Azure Storage Explorer, find Blob Containers, right-click, and create a new container.
4. Upload the data files to that new container.
5. Right-click on a data file and choose Copy URL.
6. Paste the URL into R like this:

model_Data <- read.csv(paste0('https://<STORAGE NAME HERE>.blob.core.windows.net/$root/k', k, '%20data%20file.csv'), header = TRUE)

How to connect R to SQL Azure?

I have two databases in Azure, each with 5 tables. I can do the data wrangling inside Azure using Kusto, but I would prefer to use RStudio. I want to connect R to Azure so that I can run a script in R and return results without importing the actual datasets.
Please help, where do I start? I have zero knowledge of such connections. Thank you in advance.
Assuming you have already installed R and RStudio, follow these steps:
1. Open ODBC Data Sources from the Start menu and add a User Data Source under the 'User DSN' tab. Follow the prompts through to the finish and test the connection.
2. Go to RStudio and create a new connection; you should see the Data Source added in the previous step. Once connected, the connection and its tables should be listed under the Azure SQL database.
3. Run a query like the one below in the Console:

dbGetQuery(con, "SELECT * FROM dbo.xxxx")

You should see the result set returned. You can then play with queries in whatever way you want.
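For reference, a minimal sketch of how the con object used above might be created from that DSN (the DSN name and credentials are placeholders):

library(DBI)
library(odbc)

# Connect through the User DSN created in step 1; all names here are placeholders
con <- dbConnect(odbc(),
                 dsn = "AzureSqlDSN",
                 uid = "your_user",
                 pwd = "your_password")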

Connecting to BigQuery using ODBC in Qlikview

I have the latest BigQuery ODBC driver installed and set up according to the instructions here.
I was able to follow the tutorial and access the data in MS Excel.
However, in QlikView I was unable to see any tables when using the same ODBC connection.
The ODBC driver is functional. What I hadn't noticed was that I didn't have any dataset created under the BigQuery test project, hence no tables were available.
It is still possible to use QlikView to access the public datasets by adding the query strings to the script after the ODBC CONNECT line.
[Screenshot: QlikView Edit Script screen]

ODBC: able to test connection but unable to create DSN

I'm trying to clone a database using ODBC. However, I do not have the Windows login or password needed to create the DSN. I am able to make a successful test connection through PHP and ODBC.
Is there any way to dump the database, or perhaps export it to CSV safely, using PHP? Or is there any way around this without the Windows auth?
If you are successfully connecting with PHP, then you should be able to use PHP to perform any ODBC calls necessary for duplicating or cloning the database, using standard SQL commands.
Since PHP can perform file I/O, you can store the data from the database in any format you prefer, including CSV.
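A minimal sketch of that approach using PHP's built-in ODBC functions (the connection string, credentials, table name, and output file are all placeholders):

<?php
// Placeholder connection string and credentials; reuse whatever
// already works for your successful test connection.
$conn = odbc_connect("Driver={SQL Server};Server=myserver;Database=myDB;", "user", "pass");

$result = odbc_exec($conn, "SELECT * FROM some_table");

// Stream the result set into a CSV file, writing a header row first.
$out = fopen("some_table.csv", "w");
$first = true;
while ($row = odbc_fetch_array($result)) {
    if ($first) {
        fputcsv($out, array_keys($row));
        $first = false;
    }
    fputcsv($out, $row);
}
fclose($out);
odbc_close($conn);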