How to do a weekly refresh for R Shiny data

I have an app deployed on shinyapps.io. It reads a CSV in global.R that is produced by another R script, which pulls data from various databases and then merges and cleans it. How can I refresh the CSV and republish the app on a weekly basis so the data stays current? I'm building this for a client, so I don't want the refresh to run locally.
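
A minimal sketch of one common pattern: run the data-preparation script, then redeploy with the rsconnect package on a schedule (cron, Windows Task Scheduler, or a CI runner such as GitHub Actions). The script name, app directory, app name, and environment variables below are placeholders, not part of the original question.

```r
# refresh_and_deploy.R -- schedule this to run weekly (cron / Task Scheduler / CI)
library(rsconnect)

# Re-run the existing script that pulls, merges, and cleans the data,
# assuming it writes the CSV into the app directory (placeholder name)
source("prepare_data.R")

# Authenticate with shinyapps.io using the token/secret from the account page,
# read here from environment variables so no credentials live in the script
rsconnect::setAccountInfo(
  name   = Sys.getenv("SHINYAPPS_ACCOUNT"),
  token  = Sys.getenv("SHINYAPPS_TOKEN"),
  secret = Sys.getenv("SHINYAPPS_SECRET")
)

# Redeploy the app directory so the fresh CSV is bundled with it
rsconnect::deployApp("app/", appName = "my-client-app", forceUpdate = TRUE)
```

Running this from a CI service (e.g. a weekly GitHub Actions cron trigger) keeps the whole refresh off any local machine, which matches the client requirement.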

Related

How to automatically update Shiny app and Excel dataset on shinyapps.io periodically?

I have a very simple Shiny app whose app directory contains only app.R and Data.xlsx. I manually publish it to shinyapps.io at the end of every day, after entering new data into the Excel file. Is there any way to automate this so that the app and the Excel file are redeployed every x hours? This is on Windows 10.
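
One possibility, sketched under the assumption that a small deploy.R script already calls rsconnect::deployApp(): the taskscheduleR package registers an R script with the Windows Task Scheduler, so the redeploy can run every x hours unattended. The task name and path here are hypothetical.

```r
# A sketch using taskscheduleR (Windows only); deploy.R is a hypothetical
# one-liner along the lines of rsconnect::deployApp("C:/shiny/myapp")
library(taskscheduleR)

taskscheduler_create(
  taskname  = "redeploy_shiny_app",
  rscript   = "C:/shiny/deploy.R",  # placeholder path to the deploy script
  schedule  = "HOURLY",
  modifier  = 4,                    # run every 4 hours; adjust x as needed
  starttime = "08:00"
)
```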

Saving data generated in an R Shiny app to a .csv file in an Amazon S3 bucket

I have a working R Shiny app hosted on our organization's internal Amazon AWS server. Users generate data in the app through the various widgets it provides, and I need to save all the data generated in one session to a file in our internal Amazon S3 bucket.
The challenge we are facing is how to save this data when multiple users may be generating it through the app at the same time, and how to reload it back into the app later if needed.
We cannot lose any data, even if two users add data through the app simultaneously.
Please advise on the best approach.
I followed the guide provided here:
https://deanattali.com/blog/shiny-persistent-data-storage/
But is there a way to avoid creating as many .csv files as there are users accessing the app?
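
A sketch of one way to keep concurrent writes safe with the aws.s3 package: each save goes to a unique timestamped key, so two simultaneous sessions can never overwrite each other, and a read lists the prefix and row-binds everything back together. The bucket name and key prefix are placeholders; for a strict no-data-loss requirement, a transactional database avoids the file juggling entirely and is usually the sturdier choice.

```r
# A sketch with the aws.s3 package; AWS credentials are assumed to be
# configured in the environment, and the bucket/prefix are placeholders
library(aws.s3)

save_session_data <- function(df, bucket = "my-org-bucket") {
  # Unique key per write: timestamp plus random suffix, so concurrent
  # sessions never collide on the same object
  key <- sprintf("shiny-data/%s-%s.csv",
                 format(Sys.time(), "%Y%m%d%H%M%OS3"),
                 paste(sample(letters, 8, TRUE), collapse = ""))
  s3write_using(df, FUN = write.csv, row.names = FALSE,
                object = key, bucket = bucket)
}

load_all_data <- function(bucket = "my-org-bucket") {
  # List every object under the prefix and stack them into one data frame
  keys <- get_bucket_df(bucket, prefix = "shiny-data/")$Key
  do.call(rbind, lapply(keys, function(k)
    s3read_using(read.csv, object = k, bucket = bucket)))
}
```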

Data Storage in R

So I wrote a Shiny app in R, and part of the script calls a web service to collect data. I run the script daily to build the Shiny app. I would like to show how the data collected from the web service changes over time, but the only way I know of is to write the data to a CSV file after every call to the web service and then load that CSV file back into R. This may seem like a strange question, but is there a way to accomplish this solely in R, so that the data is stored every time the script runs and the new values from the web service call are appended?
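
This can be done in plain R: append each run's result to an accumulating CSV, writing the header only on the first run. A minimal sketch, where fetch_data() stands in for the actual web-service call:

```r
# fetch_data() is a stand-in for the real web-service call
fetch_data <- function() data.frame(value = rnorm(1))

history_file <- "history.csv"

new_data <- fetch_data()
new_data$retrieved_at <- Sys.time()   # timestamp each pull

# Append to the history file; write column names only if it doesn't exist yet
write.table(new_data, history_file, sep = ",", row.names = FALSE,
            col.names = !file.exists(history_file),
            append = file.exists(history_file))

history <- read.csv(history_file)     # the full series, for plotting over time
```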

Updating a CSV file once uploaded to private Shiny Server

This question is aimed at users experienced with running a local Shiny server.
My Shiny app reads all of its data from a large CSV file so that it doesn't have to run the same SQL query every time.
Once I have the app and all related files uploaded to my local Shiny server, will I be able to write from R to that CSV file on the server (updating it with fresh data) without having to re-upload the app along with the new data files?
Apologies if this isn't everyone's preferred way to ask questions around here, but it's a simple question about whether this can be done.
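
Yes in principle, as long as a process on the server can see the file: any script (e.g. cron-driven) can overwrite the CSV in place, and shiny::reactiveFileReader makes the running app pick up the new contents without a redeploy. A sketch, assuming the CSV sits next to the app as data.csv:

```r
# A sketch: the app re-reads data.csv whenever its modification time
# changes, so an external script can refresh the file in place
library(shiny)

ui <- fluidPage(tableOutput("preview"))

server <- function(input, output, session) {
  app_data <- reactiveFileReader(
    intervalMillis = 60 * 1000,  # check the file once a minute
    session        = session,
    filePath       = "data.csv", # placeholder path on the server
    readFunc       = read.csv
  )
  output$preview <- renderTable(head(app_data()))
}

shinyApp(ui, server)
```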

Shiny R: Updating global environment objects from a batch file on a server application?

I have a Shiny app that runs continuously on a server, but the app uses SQL data tables and needs to check for updates once a day. Right now, with no batch file in place, I have to manually stop the app, run an R script that checks for these updates, and then restart the app. The objects I want to update are currently stored in RStudio's global environment. I've been looking at modifying .RData files because I'm running out of options. Any ideas?
EDIT: I'm aware that I'll probably have to shut the app down for a few minutes to refresh the tables, but is there a way to do this with a batch file?
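
One sketch of a batch-file-driven refresh that avoids the global environment altogether: a scheduled R script rebuilds the tables from SQL and saves them to an .rds file, and the app reads that file in global.R (or via reactiveFileReader) instead of holding the objects in RStudio's session. The DSN, queries, and paths below are placeholders.

```r
# update_tables.R -- run from a scheduled .bat file, e.g.
#   "C:\Program Files\R\R-4.3.0\bin\Rscript.exe" C:\app\update_tables.R
library(DBI)
library(odbc)   # assumed driver; swap in whatever your databases use

con <- dbConnect(odbc::odbc(), dsn = "my_dsn")       # placeholder DSN
tables <- list(
  sales  = dbGetQuery(con, "SELECT * FROM sales"),   # placeholder queries
  orders = dbGetQuery(con, "SELECT * FROM orders")
)
dbDisconnect(con)

# The app loads this with readRDS("C:/app/data/tables.rds") in global.R
saveRDS(tables, "C:/app/data/tables.rds")
```

If the app runs under Shiny Server, touching a restart.txt file in the app directory after the save prompts the app to reload on its next connection, so the downtime is seconds rather than a manual stop/start.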
