R Shiny invalidateLater and observe

I am attempting to refresh data from Excel CSV files as well as from web service calls in a Shiny app I created. I understand the concept of invalidateLater() and reactiveValues(). However, many data frames rely on the initial ones created from the CSV files and web service calls, and to update all of the downstream lists and data frames I would have to put them all within observe blocks, make them globally defined since they are not within the server block, and prepend values$ to each data frame name (assuming I initialized values <- reactiveValues()). Is there an easier way to make sure all downstream data frames are updated?
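One pattern that avoids observe blocks and the values$ prefix entirely is to make each source a self-invalidating reactive expression; every downstream data frame becomes a reactive that calls it, so the whole chain refreshes automatically. A minimal sketch, assuming a hypothetical source.csv with group and value columns and a one-minute refresh interval:

library(shiny)

ui <- fluidPage(tableOutput("table"))

server <- function(input, output, session) {
  # One self-refreshing reactive per source; the path is a placeholder.
  raw_csv <- reactive({
    invalidateLater(60 * 1000)  # re-read the file every minute
    read.csv("source.csv")
  })

  # Downstream data frames are reactives that call the source, so they
  # re-run whenever raw_csv() invalidates -- no observe() blocks,
  # globals, or values$ prefixes required.
  summarised <- reactive({
    aggregate(value ~ group, data = raw_csv(), FUN = mean)
  })

  output$table <- renderTable(summarised())
}

shinyApp(ui, server)

The same idea extends to the web service calls: each source gets its own reactive, and any data frame built from several sources simply calls each of them.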

Related

How to re-initialize reactive render values in Shiny?

I work on a Shiny app with multiple reactive variables and output objects scattered across numerous tabItems.
The idea behind this app is that on the first page you download a selected dataset, through an action button, which is then used by the entire app. After multiple reactive values and output objects have been created while the user explores the app, I would like to know if there is a simple way to re-initialize the application (reactive variables, output objects) as soon as another dataset is downloaded via the action button I mentioned earlier.
In effect, I want to clear the values of the reactive variables and output objects from memory every time I click this specific button and download a new dataset, so that the app is in the same state as when I launch my code for the first time.
Regards
Thibaut
I tried using reactive values a lot, but it has to be done for every variable. Perhaps there is a simpler way.
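One commonly suggested approach is session$reload(), which restarts the session exactly as if the user had refreshed the page, wiping every reactive value and output. A minimal sketch, assuming the download button is called load_data:

library(shiny)

ui <- fluidPage(
  actionButton("load_data", "Download new dataset")
)

server <- function(input, output, session) {
  observeEvent(input$load_data, {
    # Restart the session as if the browser page had been refreshed;
    # all reactive values and outputs are re-initialized.
    session$reload()
  })
}

shinyApp(ui, server)

Note that the reload also discards the user's new dataset selection, so you would need to persist that choice somewhere (a file, a query string) before calling session$reload().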

Save changes made to datatables after closing app (R)

I am making an R app that keeps track of employees' remaining vacation days. On first launch it creates new tables; afterwards all important tables and variables are stored in reactiveValues. Once I close my app and re-open it, I don't want to create the tables from scratch again; I want to use the data that was previously stored in my global variables. Is this possible?
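One way to do this is to serialize the reactiveValues to disk when the session ends and restore them at startup. A sketch under assumed names (app_state.rds as the state file, a vacation table as the data):

library(shiny)

state_file <- "app_state.rds"  # hypothetical path for the saved state

ui <- fluidPage(tableOutput("vacation"))

server <- function(input, output, session) {
  # Restore previously saved tables if they exist; otherwise create them.
  initial <- if (file.exists(state_file)) {
    readRDS(state_file)
  } else {
    list(vacation = data.frame(employee = character(), days = numeric()))
  }
  values <- do.call(reactiveValues, initial)

  output$vacation <- renderTable(values$vacation)

  # Persist the current state when the session closes.
  session$onSessionEnded(function() {
    saveRDS(isolate(reactiveValuesToList(values)), state_file)
  })
}

shinyApp(ui, server)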

Refresh data in shiny app using shared folder excel sheet

I've created a Shiny app that creates a graph based on data that's entered daily via a shared Excel sheet (.xlsx) in a shared folder (an L drive).
How would I format or upload the data so that it is able to be refreshed whenever a new daily line of data is entered?
Here is one possible approach, along with reference documentation:
1. Create a workflow to fetch the data using its URL: read in online xlsx - sheet in R
2. Make the data retrieval process reactive: https://shiny.rstudio.com/articles/reactivity-overview.html
3. Set a reactiveTimer to periodically check for updates: https://shiny.rstudio.com/reference/shiny/1.0.0/reactiveTimer.html
By doing so, your app will fetch the document on a regular basis to update your graph. If you want real time updates (i.e. every time there is a change in the document), you have to be able to trigger the application from outside, which is more complicated (especially via Excel).
Update:
Following up on your comment: you don't need the data to be online. You are fine as long as you can import it into R. Just make this process reactive and set a timer to refresh every day (see the documentation for examples). Alternatively, you can add an actionButton to refresh manually.
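A minimal sketch combining the daily timer and a manual refresh button, assuming the readxl package, a placeholder path on the L drive, and placeholder date/value column names (invalidateLater() is used here; reactiveTimer() from the linked docs works the same way):

library(shiny)
library(readxl)  # assumed package for reading the .xlsx file

ui <- fluidPage(
  actionButton("refresh", "Refresh now"),
  plotOutput("daily_plot")
)

server <- function(input, output, session) {
  sheet_data <- reactive({
    invalidateLater(24 * 60 * 60 * 1000)  # re-read once a day
    input$refresh                         # also re-read on button click
    read_excel("L:/shared/daily_data.xlsx")
  })

  output$daily_plot <- renderPlot({
    df <- sheet_data()
    plot(df$date, df$value, type = "l")   # column names are placeholders
  })
}

shinyApp(ui, server)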

Can shiny app assign values to objects in R global environment?

I'm wondering if it's possible for a Shiny app that's not run on the web (i.e. it's only run by a user launching it from their R session) to assign values to objects in the user's global environment. For example, suppose that as part of the app a data.frame is generated; instead of using a download button to save the data.frame to a file, is it possible to assign it to an object in the user's R session, so that when they close the app the data.frame is available to them?
What about automatically saving the entire environment (using the save function) to a temporary file? Then you can just load it (using the load function) and your data frame will be in the environment. This whole process can easily be automated, without needing any save button.
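A sketch of that idea, with a hypothetical results environment and temp file; the small data.frame here stands in for whatever the app actually generates:

library(shiny)

env_file <- tempfile(fileext = ".RData")  # temporary file for the state

ui <- fluidPage(actionButton("done", "Finish and keep results"))

server <- function(input, output, session) {
  results <- new.env()

  observeEvent(input$done, {
    results$my_df <- data.frame(x = 1:3)  # placeholder for the app's output
    # Save everything in `results` to the temp file, then close the app.
    save(list = ls(results), envir = results, file = env_file)
    stopApp()
  })
}

runApp(shinyApp(ui, server))
load(env_file)  # back in the user's session: my_df is now available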

How to keep the sequence file created by map in Hadoop

I'm using Hadoop and working with a map task that creates files I want to keep. Currently I pass these files through the collector to the reduce task, and the reduce task then passes them on to its own collector; this lets me retain the files.
My question is: how do I reliably and efficiently keep the files created by map?
I know I can turn off the automatic deletion of map's output, but that is frowned upon. Are there any better approaches?
You could split it up into two jobs.
First, create a map-only job that outputs the sequence files you want.
Then run your existing job, using the output of that map-only job as its input; its map phase no longer needs to do any real work (though you could still do some crunching there, depending on your implementation and use cases), and the reduce phase stays as it is now.
You can wrap this all up in one jar that runs the two jobs in sequence, passing the output path of the first job as an argument for the input path of the second.
