Automatically Refreshing R Shiny Application

I have a locally hosted Shiny application. The application pulls data from a data feed and allows plotting and elementary stats. However, I am having trouble getting this app to refresh. I tried setting Task Scheduler to run the following every six hours:
ip="10.30.27.127"
runApp("//COB-LOGNETD-01/LoggerNet/R_Scripts/shiny",host=ip,port=4438)
However, although this executes, the app does not refresh. Is there a way to get the Shiny app to refresh automatically?
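One approach that avoids restarting the process at all is to put the data pull inside a reactive and invalidate it on a timer with invalidateLater, so the feed is re-read every six hours. A minimal sketch, assuming the feed can be read with read.csv (the feed file path and plot are placeholders):

library(shiny)

ui <- fluidPage(plotOutput("p"))

server <- function(input, output, session) {
  feed <- reactive({
    # re-run this reactive every 6 hours without restarting the app
    invalidateLater(6 * 60 * 60 * 1000, session)
    read.csv("//COB-LOGNETD-01/LoggerNet/data_feed.csv")  # hypothetical feed file
  })
  output$p <- renderPlot(plot(feed()))
}

runApp(shinyApp(ui, server), host = "10.30.27.127", port = 4438)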

Related

ERR_EMPTY_RESPONSE shiny server

I am currently hosting a number of Shiny apps on open-source Shiny Server on a CentOS 7 distro. All the apps were/are working fine, but for one of the apps, when I try to go to the URL I get the following message:
This page isn't working
<ip.address> did not receive any data.
ERR_EMPTY_RESPONSE.
All the other Shiny apps hosted on the same Shiny Server are working just fine. I checked /var/log/shiny-server and there is no log file for this app. As the other apps are working fine, I don't think it's a port issue.
The only difference between the other apps and this one is that it is used the most by its users. Is there some restriction/limit on Shiny Server runtime? I can't figure out what the problem is. The app runs fine on RStudio Server, and if I copy it into a new directory in /srv/shiny-server/ with a different name, it also works fine.
A couple of thoughts:
If a process closes successfully, Shiny Server deletes its log file, so it's possible you are missing some log files. You can override this with preserve_logs (see here, and the config sketch after these thoughts). Your users may be triggering some error through their interactions with the app, while other sessions exit successfully, so Shiny Server deletes those log files.
Shiny Server creates one process per app by default, but an unlimited number of sessions (see here). This means that if your app is the one used most by users, every user's session runs inside that single process. And if the app is computationally intensive, some of the user sessions may get backlogged, which might trigger the ERR_EMPTY_RESPONSE. You can fix this by using Docker to spin up a process for each user. Here are some options; I've found ShinyProxy to be the most intuitive.
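A minimal /etc/shiny-server/shiny-server.conf sketch with preserve_logs enabled; everything apart from that directive is the stock default configuration:

# /etc/shiny-server/shiny-server.conf
preserve_logs true;   # keep log files even when a process exits cleanly

server {
  listen 3838;
  location / {
    site_dir /srv/shiny-server;
    log_dir /var/log/shiny-server;
  }
}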

Automatically restart Shiny apps after server reboot

I have a Shiny application that needs to load some fairly large data sets into memory. To save users some time when browsing to the dashboard, I set the app_idle_timeout to zero (using the community version of Shiny Server), as suggested in the docs. This works as expected.
However, the underlying data needs to be refreshed daily. Hence, what I would like to do is set up a cron job that reboots the Shiny server (or stops the relevant sessions) every day at 3 am and then automatically initiates a new R session so that the data in global.R is loaded into memory and the dashboard is ready to consume instantly.
What I do not understand is how to initiate a particular Shiny application from the terminal, i.e. mimic what happens when browsing to the app's URL on the Shiny server.
Any suggestion would be greatly appreciated.
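A minimal sketch of such a cron job, assuming systemd, the default port 3838, and a placeholder app directory. Requesting the app's URL with curl mimics a browser visit, so Shiny Server spawns a fresh R session that loads global.R into memory:

# crontab entry: restart Shiny Server at 3 am, wait for it to come up,
# then hit the app once so a new R session pre-loads the data
# (the app path /mydashboard/ is a placeholder)
0 3 * * * systemctl restart shiny-server && sleep 30 && curl -s http://localhost:3838/mydashboard/ > /dev/null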

Shiny reconnecting to existing session

I created a Shiny app that performs a long computation, and I am therefore running it on a local server in my network.
I can access the app via my computer, upload files, and start the calculation. But when I close the browser and access the app via the web link again, Shiny starts a new, empty session.
How can I reconnect to the "closed" session with the hopefully still running computation and uploaded files?
It seems that it may be possible with RStudio Connect or by running the app via Shiny Server. Is there another easy way to handle it?
Reconnecting to the exact same session does not seem possible. After a lot of research, I found the bookmarking feature, which saves your inputs inside a new URL or on a server. See references here: Link
Unfortunately, it didn't work very well for me because I had large plots and tables as saved outputs.
I solved the problem with a workaround. At the end of the computation I save the whole environment with save.image(file='myEnvironment.RData').
I added an actionButton, and clicking it loads the saved environment with load('myEnvironment.RData'). After this, all objects are back in the environment and new outputs can be created by e.g. output$xy <- renderPlot({XY}).
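A minimal sketch of that workaround; the file name comes from the answer above, while the objects x and y are assumed to exist in the saved workspace:

library(shiny)

ui <- fluidPage(
  actionButton("restore", "Restore last session"),
  plotOutput("xy")
)

server <- function(input, output, session) {
  # the long computation, run in an earlier session, ends with:
  #   save.image(file = 'myEnvironment.RData')
  observeEvent(input$restore, {
    load("myEnvironment.RData", envir = .GlobalEnv)  # bring all objects back
    output$xy <- renderPlot({
      plot(x, y)  # assumes the saved workspace contained `x` and `y`
    })
  })
}

shinyApp(ui, server)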
Hope that helps

R shiny app: Provide API-like interface for other calling applications

I have a running Shiny app that displays some output metrics on screen in response to purely on-screen UI inputs (no files to be read, etc.). I can host it on shinyapps.io to make it accessible remotely 24x7.
A programmer working on another application (non-R, residing on a remote server) wants to access my Shiny app to get my output metrics into his code. What's a good way to do this?
On the input side, with bookmarkable state via the calling URL, it looks like I can get all inputs from the calling app.
What I haven't figured out is any way to send outputs back to the calling app with the current Shiny framework. What I currently show on screen, I want to be able to send back to the calling app.
Any ideas?
PS. There are rumors that some native functionality for this use case is coming to Shiny, but there's nothing yet and I'm impatient to finish this project.
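For the input side, a minimal sketch of reading parameters from the calling URL with parseQueryString (parseQueryString and session$clientData are standard Shiny; the parameter n and the metric computation are made up):

library(shiny)

ui <- fluidPage(textOutput("metric"))

server <- function(input, output, session) {
  # parameters the calling app appends to the URL, e.g. ?n=42
  query <- reactive(parseQueryString(session$clientData$url_search))

  output$metric <- renderText({
    n <- as.numeric(query()[["n"]])
    sqrt(n)  # stand-in for the real output metric
  })
}

shinyApp(ui, server)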

Shiny R: Updating global environment objects from a batch file on a server application?

I have a Shiny app that runs continuously on a server, but this app uses SQL data tables and needs to check for updates once a day. Right now, with no batch file in place, I have to manually stop the app, run an R script that checks for these updates, then re-run the app. The objects that I want to update are currently stored in RStudio's global environment. I've been looking into modifying .RData files because I'm running out of options. Any ideas?
EDIT: I'm aware that I probably have to shut down the app for a few minutes to refresh the tables, but is there a way I can do something like this using a batch file?
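One way to avoid shutting the app down at all is to have it poll for refreshed data with reactivePoll, reloading the objects whenever the batch job writes a new file. A minimal sketch, assuming the update script saves the tables to a placeholder file latest.RData containing an object named sql_tables (both names are made up):

library(shiny)

ui <- fluidPage(tableOutput("tbl"))

server <- function(input, output, session) {
  tables <- reactivePoll(
    intervalMillis = 60 * 60 * 1000,  # check hourly for a new file
    session = session,
    # cheap check: has the file's modification time changed?
    checkFunc = function() file.mtime("latest.RData"),
    # expensive step, run only when the check value changes
    valueFunc = function() {
      e <- new.env()
      load("latest.RData", envir = e)
      e$sql_tables
    }
  )
  output$tbl <- renderTable(head(tables()))
}

shinyApp(ui, server)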
