Save changes made to datatables after closing app (R)

I am making an R app that keeps track of the remaining vacation days of employees. On first launch it creates new tables, and afterwards all important tables and variables are stored in reactiveValues. Once I close my app and re-open it, I don't want to create the tables from scratch again; I want to reuse the data that was previously stored in my global variables. Is this possible?
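One common approach is to serialise the contents of the reactiveValues to a file when the session ends and read it back on the next launch. A minimal sketch, assuming a single .rds file; the file name, the vacation table, and its columns are hypothetical:

```r
library(shiny)

DATA_FILE <- "vacation_data.rds"   # hypothetical file name; use any writable path

server <- function(input, output, session) {
  # On startup, restore the previously saved tables if the file exists;
  # otherwise build them from scratch as the app already does.
  initial <- if (file.exists(DATA_FILE)) {
    readRDS(DATA_FILE)
  } else {
    list(vacation = data.frame(employee = character(), days_left = numeric()))
  }
  rv <- do.call(reactiveValues, initial)

  # Write the current state back to disk when the session ends.
  session$onSessionEnded(function() {
    saveRDS(isolate(reactiveValuesToList(rv)), DATA_FILE)
  })
}
```

A database (e.g. SQLite) would work just as well if several users need to share the data; the .rds route is simply the smallest change to an app that already keeps everything in reactiveValues.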

Related

Can I enter an existing DB into a Xamarin Forms application?

I have some data contained in a CSV file. I need to access that information efficiently and want to import it into my existing database.
I am wondering if I can make a pre-loaded database with the tables I need and then build the rest of the database on top of it (or make a second separate connection), or load the database from the CSV files on first startup.
What would be the preferred method, and either way, how can I achieve it efficiently?
P.S. Two of the files are about 1000 lines long and 2 columns wide, which seems fairly small to me... and the other ones really shouldn't be more than 10 lines long and 6-7 columns wide.
Edit: I realised I have a bunch of tables that need to be updated yearly, so any approach that risks the user's input data is unacceptable, so using the existing DB is not an option...
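The "load from CSV on first startup" pattern is language-agnostic; since the question targets Xamarin Forms, the sketch below only illustrates the idea, using R/RSQLite. The key point is to guard the import with an existence check so that data the user adds later is never overwritten. The table and file names are hypothetical:

```r
library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), "app.db")          # hypothetical database file

# Import the shipped CSV only once, on the very first startup.
if (!dbExistsTable(con, "lookup_codes")) {    # hypothetical table name
  lookup <- read.csv("lookup_codes.csv")      # ~1000 rows x 2 columns is tiny
  dbWriteTable(con, "lookup_codes", lookup)
}

dbDisconnect(con)
```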

Get last update time of a gsheet in R without registering

I am working on a Shiny app that creates a live report based on the data in a Google spreadsheet. I only want to re-read the whole gsheet if there are new values in it.
For this, it would be best if I could get the last update time of the gsheet and then decide based on that if I need to read in the data again.
I know that registering the gsheet gives the last update time, but registering takes more time than reading in the whole table if there are only a few values in the table.
[Picture of the result of the microbenchmark comparison]
Is there a way to get only the time of the last update without registering a gsheet again in R?
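One possible workaround, assuming the sheet is accessible through the googledrive package (a separate package from the one that does the registering): request only the file's Drive metadata, which includes a modifiedTime field, and compare it against the timestamp from the previous check. A sketch, where the sheet ID and the stored previous_update are placeholders:

```r
library(googledrive)

# Fetch only the file's metadata, not its cell contents.
meta <- drive_get(id = "your-sheet-id")                 # placeholder sheet ID
last_update <- meta$drive_resource[[1]]$modifiedTime    # RFC 3339 timestamp string

# Re-read the sheet only if it changed since the last check;
# 'previous_update' is assumed to be stored from the previous run.
if (is.null(previous_update) || last_update > previous_update) {
  # ... read the whole gsheet here ...
  previous_update <- last_update
}
```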

Creating SQLite tables while running a program

I have seen several members on this forum warn about creating tables in a database while the app is running. Instead, it is encouraged to create the tables while programming and only fill them with data during runtime.
E.g., while creating a note app it would be convenient to let the user specify a name for a single note, and let this note be created as a table in a database: the table is created at the time the user creates the note, and the name of the note becomes the name of the table. Why would this be bad practice? Have I misunderstood?
It would be highly inconvenient for both you and the user of such an app to create a table for every note the user might want to add. It's just not the way it works. A table should contain rows of information of the same type, such as a note, and each note should be added as a row/record in that table. The table could be called notes, for example, and if you want a name for each note, it can be a column in the notes table called name.
An analogy would be, if you are taking notes manually (without an electronic device that is), would you have one notebook with you and just add notes on different pages as you need to, or would you carry around a bag full of notebooks so that whenever you want to add a new note, you would add each note in a separate notebook?
The notebooks are equivalent to database tables in this analogy, and the pages of said notebook are equivalent to rows in a database table.
I can't think of a reason for creating tables during runtime really. The database structure should be "set in stone" so to speak, and during runtime you should only manipulate the data in the database, which is adding, deleting, or updating rows/records in already existing tables. Creating tables during runtime is a big no-no.
I probably don't understand your question correctly, but it seems to me that what you really want to do is create a new row in a table in the database?
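To make the answer concrete, here is a minimal sketch using R and RSQLite (column names and values are purely illustrative): one fixed notes table, and each user-created note becomes a new row rather than a new table.

```r
library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), "notes.db")   # hypothetical file name

# One fixed table, created once (e.g. at install time or on first run) --
# NOT one table per note.
dbExecute(con, "
  CREATE TABLE IF NOT EXISTS notes (
    id         INTEGER PRIMARY KEY AUTOINCREMENT,
    name       TEXT NOT NULL,
    body       TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
  )")

# Adding a note at runtime is just inserting a row; the user-supplied name
# is stored as a column value, not used as a table name.
dbExecute(con,
  "INSERT INTO notes (name, body) VALUES (?, ?)",
  params = list("Shopping list", "milk, eggs"))

dbDisconnect(con)
```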

Excel report using Access DB (~2 GB) as backend substitutable by R with Shiny or Markdown?

I currently have an Access database which pulls data from an Oracle database for various countries and is currently around 1.3 GB. However, more countries and dimensions are to be added, which will further increase its size. My estimate would be around 2 GB, hence the title.
Per country, there is one table. These tables are then linked in a second Access db, where the user can select a country through a form which pulls the data from the respective linked table in Access db1, aggregates it by month, and writes it into a table. This table is then queried from Excel and some graphs are displayed.
Also, there is another form where the user can select certain keys, such as business area, and split the data by this. This can be pulled into a second sheet in the same Excel file.
The users would not only want to be able to filter and group by more keys, but also be able to customize the period of time for which data is displayed more freely, such as from day xx to day yy aggregated by week or month (currently, only month-wise starting from the first of each month is supported).
Since the current Access-Access-Excel solution seems quite cumbersome to me, I was wondering whether I could implement the extensions required by the users of this report in R, using either Shiny or Markdown. I know that Shiny does not allow files larger than 30 MB to be uploaded, but I would plan on using it offline. I was just not able to find a file size limit for this - or do the same restrictions apply?
I know some R, and I think that the data aggregations needed could be done very quickly using dplyr. The problem is that the users do not, so the report needs to be highly customizable while requiring no technical knowledge. Since I have no pre-existing knowledge of Shiny or Markdown, I was wondering whether it is worth going through the trouble of learning one of them well enough to implement this report.
Would what I want to do be feasible in Shiny or R Markdown? If so, would it still load quickly enough to be usable?
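On feasibility: a Shiny app with a date-range input plus a couple of select inputs covers the kind of customisation described, with dplyr doing the aggregation. A rough sketch, assuming the country data has already been pulled into R (the columns and inputs below are made up for illustration):

```r
library(shiny)
library(dplyr)

# 'report_data' stands in for whatever is pulled from Oracle/Access
# (e.g. via DBI/odbc); the columns are made up for illustration.
report_data <- data.frame(
  date          = seq(as.Date("2023-01-01"), as.Date("2023-12-31"), by = "day"),
  business_area = sample(c("A", "B"), 365, replace = TRUE),
  value         = rnorm(365)
)

ui <- fluidPage(
  dateRangeInput("period", "Period",
                 start = min(report_data$date), end = max(report_data$date)),
  selectInput("unit",  "Aggregate by", c("week", "month")),
  selectInput("split", "Split by",     c("none", "business_area")),
  tableOutput("agg")
)

server <- function(input, output, session) {
  output$agg <- renderTable({
    d <- report_data %>%
      filter(date >= input$period[1], date <= input$period[2]) %>%
      mutate(bucket = as.character(cut(date, breaks = input$unit)))
    # Group by the chosen time bucket, optionally split by a key column.
    if (input$split != "none") d <- group_by(d, bucket, .data[[input$split]])
    else                       d <- group_by(d, bucket)
    summarise(d, total = sum(value), .groups = "drop")
  })
}

shinyApp(ui, server)
```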

Best format to store incremental data in regularly using R

I have a database that is used to store transactional records; these records are created, and another process picks them up and then removes them. Occasionally this process breaks down and the number of records builds up. I want to set up a (semi-)automated way to monitor things, and as my tool set is limited and I have an R-shaped hammer, this looks like an R-shaped nail problem.
My plan is to write a short R script that will query the database via ODBC, and then write a single record with the datetime, the number of records in the query, and the datetime of the oldest record. I'll then have a separate script that will process the data file and produce some reports.
What's the best way to create my data file? At the moment my options are:
Load a dataframe, add the record and then resave it
Append a row to a text file (i.e. a csv file)
Any alternatives, or a recommendation?
I would be tempted by the second option because, from a semantic point of view, you don't need the old entries when writing the new ones, so there is no reason to reload all the data each time. Doing so would consume more time and resources.
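With the append option, the whole monitoring step stays small. A sketch of what each run could look like, where the DSN, table, and column names are placeholders:

```r
library(DBI)
library(odbc)

con   <- dbConnect(odbc(), dsn = "my_dsn")   # placeholder DSN
stats <- dbGetQuery(con,
  "SELECT COUNT(*) AS n, MIN(created_at) AS oldest FROM queue")  # placeholder table/columns
dbDisconnect(con)

row <- data.frame(
  checked_at    = Sys.time(),
  n_records     = stats$n,
  oldest_record = stats$oldest
)

log_file <- "queue_monitor.csv"              # placeholder file name
# Append one row per run; write the header only when the file is first created.
write.table(row, log_file, sep = ",", row.names = FALSE,
            col.names = !file.exists(log_file), append = file.exists(log_file))
```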
