How to save data generated in an R Shiny app to a .csv file in an Amazon S3 bucket

I have a working R Shiny app hosted on our internal org Amazon AWS server. Users generate data in the app using the various widgets provided there, and I need to save all the data generated in one session to a file stored in our internal Amazon S3 bucket.
The challenge we are facing is how to save this data when multiple users could be generating it through the app at the same time; it also needs to be reloaded back into the Shiny app later if needed.
We cannot afford to lose any data, even if two users add data through the app simultaneously.
Please advise on what our best approach could be.
I did follow the guide provided here:
https://deanattali.com/blog/shiny-persistent-data-storage/
But is there a way to avoid creating as many .csv files as there are users accessing the app?
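
One pattern worth noting here: S3 objects cannot be appended to, so writing one uniquely keyed object per session and merging the objects when reading avoids any write conflict between simultaneous users (a database is the usual alternative if a single growing file is a hard requirement). A minimal sketch using the aws.s3 package; the bucket name and the "responses/" key prefix are illustrative assumptions:

    # S3 has no append operation, so each session writes its own uniquely
    # keyed object and the reader combines them into one table.
    library(aws.s3)

    bucket <- "my-org-shiny-data"  # hypothetical bucket name

    # Called once per session: a unique key means two simultaneous users
    # can never overwrite each other's data.
    save_session_data <- function(data) {
      key <- sprintf("responses/%s_%s.csv",
                     format(Sys.time(), "%Y%m%d%H%M%OS3"),
                     paste(sample(letters, 8, replace = TRUE), collapse = ""))
      s3write_using(data, FUN = write.csv, row.names = FALSE,
                    object = key, bucket = bucket)
    }

    # Reload: list every object under the prefix and row-bind them, so
    # the app works with one combined data frame rather than many files.
    load_all_data <- function() {
      objects <- get_bucket(bucket, prefix = "responses/")
      do.call(rbind, lapply(objects, function(obj) {
        s3read_using(read.csv, object = obj$Key, bucket = bucket)
      }))
    }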

Related

Programmatically transfer file to R Shiny application

I have the following situation/workflow:
The user utilizes Tool A to capture sensor data and save it to a CSV file.
The user takes the CSV and uploads it to an R Shiny application (using fileInput) to process it further.
I would like to get rid of the intermediate CSV and open the Shiny application with the data already loaded, i.e. I need a way to transfer the contents of the CSV to the Shiny application automatically.
What could work:
Tool A stores the CSV in a special location that is served by a HTTP server on a fixed endpoint.
The Shiny application requests the file from the known location on startup.
However, this still needs the intermediate CSV file, and it adds complexity by introducing an additional server (or at least an endpoint). Furthermore, it needs additional logic if multiple users are active or multiple files are created at the same time. So it is far from ideal.
Is there any way to get the contents of the CSV directly from Tool A to the Shiny Application? E.g. by mimicking the messages the 'fileInput' widget produces?
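
A minimal sketch of one alternative to mimicking fileInput: Tool A opens the app with a query-string parameter pointing at the data, and the server reads it on startup. The parameter name ("src") and the URL serving the CSV are illustrative assumptions:

    library(shiny)

    ui <- fluidPage(tableOutput("preview"))

    server <- function(input, output, session) {
      sensor_data <- reactive({
        # e.g. the app is opened as https://host/app/?src=http://localhost:8000/run42.csv
        qs <- parseQueryString(session$clientData$url_search)
        req(qs$src)       # wait until a source URL is present
        read.csv(qs$src)  # read.csv can read straight from a URL
      })
      output$preview <- renderTable(head(sensor_data()))
    }

    shinyApp(ui, server)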

How to speed up loading data at start of shiny app

I'm pretty new to using Shiny apps to visualize data. We plan to host our Shiny app on our own server, so we used Docker to deploy it. However, the app is very slow to load, since we have to load a lot of big data frames (up to 10,000,000 rows × 10 columns) that are saved in an RData object.
My first question is: Will the data be loaded each time a user visits/reloads the website?
I was looking into ways to speed up loading the data. One possibility might be the feather package, which seems to be faster at loading data tables.
Another option would be to put the data into a database. However, I don't have experience with that. I saw there are some nice packages like DBI and RMariaDB that seem to work well with Shiny apps. However, I only find examples where an external database is queried. Is it possible to pack a MySQL database into the Docker image and access it from within the Shiny app? Or is the normal procedure to host the database externally?
I'm really new to all this, so I'm not even sure I'm asking the right questions. These are the conditions: we have a lot of data in the form of multiple data tables; these need to be read into our app quickly and queried quickly through interactive user input; and we need to dockerize our app in order to deploy it. What is the best approach here?
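
On the first question: objects created at the top level of app.R (or in global.R) are loaded once per R process and shared by every session that process serves, so a page reload does not re-read them; only a fresh process (e.g. a new container) pays the cost. A minimal sketch, using the Feather format via the arrow package (the file path is an illustrative assumption):

    library(shiny)
    library(arrow)  # read_feather() gives fast on-disk reads of Feather files

    # Runs once per R process, not on every visit or reload; every
    # session in this process shares big_table.
    big_table <- read_feather("data/measurements.feather")

    ui <- fluidPage(
      selectInput("id", "Series", choices = unique(big_table$id)),
      tableOutput("head")
    )

    server <- function(input, output, session) {
      # Per-session work is only filtering the already-loaded table.
      output$head <- renderTable(head(big_table[big_table$id == input$id, ]))
    }

    shinyApp(ui, server)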

How can I publish an R Markdown Shiny Dashboard privately?

So for context, I have created a dashboard in RStudio using flexdashboard. It is an R Markdown Shiny app, with the UI and source code all in the same file.
The data sources for this app are large, locally saved documents in .txt and .csv format, with sizes ranging from a few kilobytes to several gigabytes.
I am to present this work to my CEO, but I will not have access to my current computer, so ideally I would host my dashboard somewhere and simply load it in a browser.
However, the datasets contain sensitive customer information, so privacy or controlled access to all assets is essential, ruling out most third-party solutions.
Is there a simple method to achieve this that I'm missing, or will I have to just run it from a local machine?
Thanks
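
One low-infrastructure possibility, for reference: serve the dashboard from a machine on the same private network as the presentation room, so the data never leaves infrastructure you control. rmarkdown::run() can bind to all interfaces; the file name, host, and port below are illustrative assumptions:

    library(rmarkdown)

    # Serve the flexdashboard on the local network; open
    # http://<machine-ip>:3838 from any browser on that network.
    run("dashboard.Rmd",
        shiny_args = list(host = "0.0.0.0", port = 3838))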

Data Storage in R

So I wrote a Shiny app in R, and part of the script calls a web service to collect data. I run the script on a daily basis to build the Shiny app. I would like to show data collected from the web service over time; however, the only way I know how is to write the data to a csv file after every call to the web service and then load that csv file back into R to show how the data has changed over time. This may seem like a strange question, but is there a way to accomplish this solely in R, so that the data is stored every time the script is run and new values from the web service call are appended?
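
A minimal sketch of keeping the history entirely in R between runs, using an .rds file as the store; the file name and the web-service function are illustrative assumptions:

    history_file <- "webservice_history.rds"

    new_row <- fetch_from_webservice()  # hypothetical: returns a one-row data frame
    new_row$retrieved_at <- Sys.time()  # timestamp each daily observation

    # Read what has accumulated so far (if anything), append, save back.
    history <- if (file.exists(history_file)) readRDS(history_file) else NULL
    history <- rbind(history, new_row)  # rbind(NULL, df) is just df
    saveRDS(history, history_file)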

Deploy Shiny app that can read files from the user's local computer (no Data-folder solution possible)

I am trying to deploy a Shiny app on shinyapps, but it is not possible to read and write local files on the user's computer. The idea is that every user can upload some datasets and images given a local path where this input is stored, rather than having the same series of inputs for everyone stored in a "Data" folder.
I know there are other options for deploying Shiny apps (Amazon, Shiny Server), but I am afraid I will run into the same kind of problems. Before losing too much time trying other approaches, I would like to know whether there is any way to deploy a Shiny app that can read these inputs given a local path and, if not, whether there is an easy way to prepare an online app that can do this by translating from a Shiny-based one. If not, I guess I will have to leave my app as a normal R package.
Thank you in advance.
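
For the record, a deployed app cannot open arbitrary paths on the user's machine regardless of host: the browser sandbox forbids it. The supported route is fileInput(), which lets each user pick their own files and hands the server temporary copies. A minimal sketch:

    library(shiny)

    ui <- fluidPage(
      fileInput("files", "Choose your datasets/images", multiple = TRUE),
      tableOutput("manifest")
    )

    server <- function(input, output, session) {
      output$manifest <- renderTable({
        req(input$files)
        # input$files is a data frame with columns name, size, type, datapath;
        # datapath is a server-side temporary copy of each chosen file.
        input$files[, c("name", "size")]
      })
    }

    shinyApp(ui, server)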
