R: How to shut down a database in flexdashboard?

I have a flexdashboard that queries a local DuckDB database (similar to SQLite), but every so often it says:
Warning: Error in initialize: duckdb_startup_R: Failed to open database
There is no onStop/dbDisconnect-style command in my code (I'm not sure where to put one in a flexdashboard), so the database doesn't shut down cleanly if my R instance crashes. My only solution so far is to restart the computer. How can I fix this?
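For what it's worth, a minimal sketch of one way to handle this, assuming the dashboard uses runtime: shiny and the DBI and duckdb packages (the database path below is a placeholder): open the connection in the setup chunk and register a shiny::onStop() callback there, so the connection is closed and the DuckDB file lock released whenever the session ends.

library(DBI)
library(duckdb)

# open the connection in the setup chunk (the path is hypothetical)
con <- dbConnect(duckdb(), dbdir = "my_data.duckdb")

# close the connection and shut DuckDB down when the Shiny session stops
shiny::onStop(function() {
  dbDisconnect(con, shutdown = TRUE)
})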

Related

R Studio keeps 'hanging' when trying to establish an Oracle Database connection

I'm working on a little side project in RStudio and I'm trying to import data from an Oracle database. The problem is, whenever I try to establish the connection using the DBI::dbConnect command, it just 'hangs': it won't continue to the next command in my RStudio script. I've added a timeout to the dbConnect command, but it doesn't help. In order to exit, I have to shut down RStudio completely.
I've tested the connection (check the screenshot below) using the 'Connections' tab in RStudio. As you can see, it is able to establish the connection. So that should mean the parameters are set correctly, right? But when I run it in the script, it just keeps 'hanging' on the dbConnect command.
What can I do?
When it connects successfully to the database, RStudio loads a tree in the Connections pane that presents all schemas and all tables. It's useful, but it can be slow.
Depending on the Oracle database I use, it often takes a long time to load this, and it takes even longer when the user has low privileges on the database (a user with only one schema, for instance).
I see two ways to investigate further:
connect with your statement in plain RGui (a sketch follows below) to see if it works correctly there; if it does, the problem really is linked to RStudio's Connections pane
connect as system/admin or a higher-privileged user to check whether it works better
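A rough sketch of that first check, run from a plain R console rather than RStudio; the driver name, connection string, and credentials below are placeholders for whatever your Oracle/ODBC setup actually uses.

library(DBI)
library(odbc)

# placeholder driver name, host/service, and credentials; adjust to your setup
con <- dbConnect(odbc::odbc(),
                 Driver  = "Oracle in OraClient19Home1",
                 DBQ     = "//db.example.com:1521/ORCLPDB1",
                 UID     = "my_user",
                 PWD     = Sys.getenv("ORACLE_PWD"),
                 timeout = 10)   # connection timeout in seconds (driver support varies)

dbGetQuery(con, "SELECT 1 FROM dual")  # quick sanity check
dbDisconnect(con)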

How to use psql in R? copy_to function of the "rpg" package not working

I am connecting to a PostgreSQL database and I would like to make use of psql commands (especially the \copy command) from within R.
I'm on a Windows client using ODBC drivers to connect to the database. Basically any of the major ODBC packages in R, including the "rpg" package, works for connecting to the database and for reading and writing tables, etc.
Apart from placing regular SQL queries, the "rpg" package lets you make use of psql commands. The "copy_to" function should send the psql "\copy" command to the database.
However, when running the function I get the error: "psql not found".
I also have pgAdmin III installed, and running the \copy command from there is no problem at all.
Digging deeper, I found that the rpg::copy_to function first runs Sys.which("psql"), which returns "", leading to said error.
Reading this thread made me think that adding the path to the pgAdmin psql.exe would do the trick. So I added the line
psql=C:\Program Files (x86)\pgAdmin III\1.16\psql.exe
to the R environment file (.Renviron).
Running Sys.which("psql") still returns "", while Sys.getenv() correctly shows the path to the psql.exe that I specified.
How can I make Sys.which() find psql.exe, assuming that's the correct way to solve this issue in the first place?
I would appreciate any help!
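For reference, one possible direction, sketched and untested here: Sys.which() searches the directories on PATH rather than arbitrary environment variables, so appending the psql directory to PATH for the session may be what is needed (the pgAdmin path is the one from the question).

# Sys.which() looks through PATH, not arbitrary environment variables,
# so add the directory containing psql.exe to PATH for this session
psql_dir <- "C:/Program Files (x86)/pgAdmin III/1.16"
Sys.setenv(PATH = paste(Sys.getenv("PATH"), psql_dir, sep = .Platform$path.sep))
Sys.which("psql")  # should now return the full path to psql.exe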

Script using GPU throws error when running app on Shiny server

I have written some code in R using the Shiny package. The app runs alright when running shiny::runApp() from RStudio.
I've tried to host the app on a Shiny server, and the GUI starts up correctly. However, the plot that should appear on the right when the "Enviar informacion" button is clicked does not show up. (You can see how the app looks here, but it won't work because that link is on Shinyapps.io, not on the Shiny server.)
The relevant part of the logfile is the following:
Listening on http://127.0.0.1:38327
Loading required package: Rcpp
Warning: Error in [: subscript out of bounds
211: FUN [/srv/shiny-server/spike_sorting/server.R#82]
210: apply
209: cluster_som_h [/srv/shiny-server/spike_sorting/server.R#82]
...
So the error is inside the function cluster_som_h. After trying a few things, I found out what causes the error, but I don't know how to fix it. Inside this function, I use Rsomoclu.train() from the Rsomoclu package with kernelType = 1, which means the function runs on the GPU. If I set kernelType = 0, the app works fine both with runApp() and from the Shiny server. So that's where the problem is: for some reason, running the function on the GPU doesn't work when the app is served from the Shiny server, but it does work when the app is run from RStudio using runApp().
Maybe there is something going on because I have CUDA installed on my computer, but something goes wrong when the app is run from the server. Is there a way to fix this? I believe running the code on this specific computer, no matter where on the LAN the app is accessed from, would solve this. Is that possible?
After doing some (basic) research, I found out what was happening. I have CUDA installed only for my user on Ubuntu, so I had to change the first line of the file /etc/shiny-server/shiny-server.conf so that the app runs as the right user:
# Instruct Shiny Server to run applications as the user "my_name"
run_as my_name;
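A quick, throwaway way to confirm the change took effect, sketched here as lines you could temporarily drop into server.R: log which user the Shiny process runs as and whether the CUDA toolchain is visible to it.

# temporary diagnostics; remove once the run_as change is confirmed
message("Running as user: ", Sys.info()[["user"]])
message("nvcc found at: ", Sys.which("nvcc"))  # empty string means CUDA tools are not on this user's PATH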

Redshift JDBC connection crashes on second opening in R

I am using the RJDBC package to connect to AWS Redshift from an EC2 ubuntu instance.
I can successfully connect using the JDBC() call, retrieve/insert rows and then close the connection.
However, when I re-open a second connection in the same R session, R crashes with a segmentation fault. This happens in both RStudio and console R. I'm using conda to manage R.
I have tried the connection using the native Redshift JAR provided by Amazon and also another JAR from Progress Software. I get the same effect with both drivers: the first connection is fine, subsequent connections crash.
I've installed the latest JVM, v8. I had seen some other threads suggesting installing v6 as a workaround, but unfortunately that is no longer available from the Oracle site.
My gut feeling is that Java has a weird interaction with R, but I'm at a loss as to how to proceed.
OK, I solved this myself and thought I'd record it here in case it's useful to others.
The problem was really with rJava not re-initialising the JVM correctly.
I added the following line before opening a database connection:
rJava::.jinit(force.init = TRUE)
Now I can open and close connections without issue using RJDBC.
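A minimal sketch of the resulting pattern; the driver class is the one shipped with Amazon's JDBC 4.2 driver, and the JAR path, cluster URL, and credentials are placeholders.

library(RJDBC)

connect_redshift <- function() {
  # re-initialise the JVM before each connection (the fix described above)
  rJava::.jinit(force.init = TRUE)
  drv <- JDBC("com.amazon.redshift.jdbc42.Driver",
              "/opt/jars/redshift-jdbc42.jar")   # placeholder JAR location
  dbConnect(drv, "jdbc:redshift://example-cluster:5439/dev", "my_user", "my_password")
}

con <- connect_redshift()
dbGetQuery(con, "SELECT 1")
dbDisconnect(con)

con2 <- connect_redshift()   # a second connection in the same session no longer segfaults
dbDisconnect(con2)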

Error occurred while checking for updates. Unable to establish connection with R session in RStudio

I am unable to load an R project, and I am getting an error in RStudio: "Error occurred while checking for updates. Unable to establish a connection with R session." I am using large data sets inside that project, which might be causing this issue. Is there a way I could log in from the terminal and remove some datasets from the workspace? Can someone explain how I can do that?
It sounds like you're auto-saving the workspace to .RData. It's usually not a problem (though still not advised) to do so with small-ish data sets/objects in the workspace, but it's almost deadly to use this setting if you often work with large data sets/objects.
Disable the Restore .RData into workspace at startup setting in RStudio preferences and also set Save workspace to .RData on exit to Never.
Hunt down all the .RData files in your various working directories and delete them, too.
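As a sketch of the terminal route the asker mentioned: start R without restoring the saved workspace, then remove the offending .RData file from the project directory (paths are assumptions; back the file up first if the data matters).

# from a terminal: start R in the project directory without restoring the workspace
#   R --no-restore --no-save
file.exists(".RData")   # check whether a saved workspace is present
file.remove(".RData")   # delete it so RStudio can open the project without loading it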
I had the same problem, and the strategy proposed above did not work well for me. However, since I have Anaconda installed on my PC, I installed RStudio through that platform. Unlike the standalone RStudio version, the RStudio installed via Anaconda works well. I don't know why, but if you need to, you can set your standalone RStudio aside without deleting your files and use the RStudio on the Anaconda platform. Just to offer another point of view!
