I have a Shiny application that needs to load into memory some fairly large data sets. To save the users some time when browsing to the dashboard, I set the app_idle_timeout to zero (using the community version of the Shiny server application), as suggested in the docs. This works as expected.
However, the underlying data needs to be refreshed daily. Hence, what I would like to do is set up a cron job that restarts the Shiny server (or stops the relevant sessions) every day at 3 am and then automatically initiates a new R session, so that the data in global.R is loaded into memory and the dashboard is ready to consume instantly.
What I do not understand is how to initiate a particular Shiny application from the terminal, i.e. mimic what happens when a user browses to the app's URL on the Shiny server.
Any suggestion would be greatly appreciated.
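In case it helps to make the plan concrete, here is a minimal sketch of the warm-up step I have in mind, assuming the app is served at http://localhost:3838/myapp/ and that cron handles the restart; every path, URL and service name below is a placeholder:

## warm_app.R -- run by cron right after shiny-server restarts, e.g. via a crontab entry like
##   0 3 * * * systemctl restart shiny-server && sleep 15 && Rscript /path/to/warm_app.R

app_url <- "http://localhost:3838/myapp/"

## A plain HTTP GET against the app's URL should make Shiny Server start the app's
## R process, which sources global.R and loads the data into memory.
invisible(readLines(app_url, warn = FALSE))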
Related
I am currently hosting a number of Shiny apps on open-source Shiny Server on a CentOS 7 distro. All the apps were/are working fine, but for one of them, when I try to go to its URL, I get the following message:
This page isn't working
<ip.address> did not receive any data.
ERR_EMPTY_RESPONSE.
All the other Shiny apps hosted on the same Shiny Server are working just fine. I checked /var/log/shiny-server and there is no log file for this app. As the other apps are working fine, I don't think it's a port issue.
The only difference between this app and the others is that it is the most heavily used. Is there some restriction/limit on Shiny Server for runtime? I can't figure out what the problem is. The app runs fine in RStudio Server, and if I copy it into a new directory in /srv/shiny-server/ with a different name, it also works fine.
A couple of thoughts:
If a process exits successfully, Shiny Server deletes its log file, so it's possible you are missing some log files. You can override this with preserve_logs, see here (and the config sketch after these notes). Your users may be triggering some error through their interactions with the app while other sessions finish successfully, so Shiny Server deletes those log files.
Shiny Server creates one process per app by default, but an unlimited number of sessions (see here). This means that if your app is the one used most, each user is generating a new session inside that single process. And if the app is computationally intensive, some of the user sessions may get backlogged, which might trigger the ERR_EMPTY_RESPONSE. You can work around this by using Docker to spin up a process for each user. Here are some options; I've found ShinyProxy to be the most intuitive.
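For completeness, here is roughly where preserve_logs would sit in /etc/shiny-server/shiny-server.conf; apart from that one line, this is only a sketch that mirrors the shipped defaults:

# /etc/shiny-server/shiny-server.conf (sketch)
run_as shiny;

# Keep per-app log files even when the R process exits cleanly
preserve_logs true;

server {
  listen 3838;

  location / {
    site_dir /srv/shiny-server;
    log_dir /var/log/shiny-server;
  }
}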
I created a Shiny app that does a long computation, and therefore I am running it on a local server in my network.
I can access the app from my computer, upload files and start the calculation. But when I close the browser and access the app via the web link again, Shiny starts a new, empty session.
How can I reconnect to the "closed" session, with the (hopefully) still-running computation and the uploaded files?
It seems this may be possible with RStudio Connect or by running the app via Shiny Server. Is there another easy way to handle it?
Reconnecting to the exact same session does not seem to be possible. After a lot of research I found the bookmarking feature, which saves your inputs inside a new URL or on the server. See references here: Link
Unfortunately, it didn't work very well for me because my saved output included large plots and tables.
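If you want to try the bookmarking route anyway, this is a minimal sketch of server-side bookmarking; the input, output and object names are placeholders:

library(shiny)

# For bookmarking, the UI must be a function of the incoming request
ui <- function(request) {
  fluidPage(
    numericInput("n", "Sample size", 100),
    bookmarkButton(),   # generates a restorable URL for the current inputs
    plotOutput("hist")
  )
}

server <- function(input, output, session) {
  output$hist <- renderPlot(hist(rnorm(input$n)))
}

# "server" stores the bookmarked state on disk; "url" would encode it in the URL itself
shinyApp(ui, server, enableBookmarking = "server")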
I solved the problem with a workaround. At the end of the computation I save the whole environment with save.image(file = 'myEnvironment.RData').
I added an actionButton which, when clicked, loads the saved environment with load('myEnvironment.RData'). After this, all objects are back in the environment and new outputs can be created, e.g. output$xy <- renderPlot({XY}).
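Putting those two pieces together, a minimal sketch of the workaround looks like this (the object names and the file path are placeholders):

library(shiny)

ui <- fluidPage(
  actionButton("restore", "Restore last results"),
  plotOutput("xy")
)

server <- function(input, output, session) {
  # The long-running computation ends with something like:
  #   save.image(file = "myEnvironment.RData")

  observeEvent(input$restore, {
    # Pull the previously saved objects back into this R process
    load("myEnvironment.RData", envir = .GlobalEnv)
    output$xy <- renderPlot({
      plot(long_result)   # 'long_result' stands in for an object saved by the computation
    })
  })
}

shinyApp(ui, server)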
Hope that helps
There are a few R scripts that several analysts in an organization need to use in their daily work to produce analyses and reports.
I was planning to create a Shiny app so that the scripts could run on a server with more memory and processing power, and also to validate the inputs with select or dropdown fields, since the inputs are currently typed manually and this creates a lot of errors.
But there are a few problems with this approach (as far as I know):
1) Shiny disconnects when the user closes the browser or the connection drops, which would terminate the execution of the scripts (they can take several hours to run).
2) Once a script is launched, it blocks the whole Shiny process for other tasks (I have unsuccessfully tried to set up parallel computing... we have Windows servers).
Are there any solutions to the issues above that would let us keep using Shiny?
Maybe Shiny is not the best system for this? Are there any alternatives?
Thanks
I can't speak for Shiny, but in general there are two tools I can recommend for this: Rundeck and script-server. Both of them can execute any type of script and provide a web UI for it, with the required inputs, script output, logging, etc.
Disclaimer: I'm the owner of the script-server project.
I have a locally hosted Shiny application. The application pulls data from a data feed and allows plotting and elementary statistics. However, I am having trouble getting this app to refresh. I tried setting up Task Scheduler to run the following every 6 hours:
ip="10.30.27.127"
runApp("//COB-LOGNETD-01/LoggerNet/R_Scripts/shiny",host=ip,port=4438)
However, even though this executes, the app does not refresh. Is there a way to get the Shiny app to refresh automatically?
I have a Shiny app that runs continuously on a server, but the app uses SQL data tables and needs to check for updates once a day. Right now, with no batch file in place, I have to manually stop the app, run an R script that checks for these updates, then re-run the app. The objects I want to update are currently stored in RStudio's global environment. I've been looking at modifying .RData files because I'm running out of options. Any ideas?
EDIT: I'm aware that I probably have to shut down the app for a few minutes to refresh the tables, but is there a way I can do something like this using a batch file?
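To make the .RData idea concrete, this is roughly what I imagine the scheduled refresh script looking like, with the app's global.R loading the file at startup; the table names, connection details and paths below are all placeholders:

## refresh_data.R -- run once a day by the batch file / scheduler (all names are placeholders)
library(DBI)

con <- DBI::dbConnect(odbc::odbc(), dsn = "my_dsn")   # assumed ODBC connection
sales   <- DBI::dbReadTable(con, "sales")
clients <- DBI::dbReadTable(con, "clients")
DBI::dbDisconnect(con)

# Save the refreshed tables where the Shiny app can find them
save(sales, clients, file = "/srv/shiny-server/myapp/data/tables.RData")

## In the app's global.R (sourced whenever a new app process starts):
# load("data/tables.RData")

The batch file would then only need to call Rscript refresh_data.R and restart the app (if it runs under Shiny Server, touching restart.txt in the app directory forces it to re-source global.R on the next connection).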