Every time I close and reopen RStudio, everything in the panes, including all the data frames, functions, and values, vanishes, and some very old objects that I deleted long ago reappear. I save the workspace when I close it, but this happens every time. Re-importing my large dataset and regenerating everything each time takes a lot of time. What can I do?
You can configure this under Tools -> Options -> General: check "Restore .RData into workspace at startup" and set "Save workspace to .RData on exit" to "Ask" or "Always".
In addition, you can use:
save.image(file='Session.RData')
And load it later:
load('Session.RData')
However, generally speaking, some consider it bad to keep/save your environment/workspace.
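If you would rather not save the whole workspace, an alternative is to save only the expensive objects explicitly; a minimal sketch, assuming your large dataset lives in an object called big_dataset (a placeholder name):
# Save just the slow-to-rebuild object instead of the whole workspace
saveRDS(big_dataset, file = "big_dataset.rds")
# In a later session, reload only that object
big_dataset <- readRDS("big_dataset.rds")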
I am currently looking to write a function in R that can keep track of the number of completed runs of an .R file within any particular day. The runs might be conducted at different times of the day. I did some research on this problem and came across this post (To show how many times user has run the script), but so far I have been unable to adapt the first commenter's code to R (the main obstacle is replicating Python's try...except). I also need the restriction that the count is measured only within a single day (exactly from 00:00:00 to 24:00:00 EST).
Can someone please offer some help on how to accomplish this goal?
Either I didn't get the problem, or it is a rather easy one: create a temporary file (use Sys.Date() to name it) and store the current run number there; at the beginning of your .R file, read the temporary file, increment the number, and write the file back.
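A minimal sketch of that idea in R, assuming the counter files live in the working directory (the file naming and the timezone handling are placeholders to adjust):
# Count completed runs per day by storing the tally in a date-named file;
# tz = "America/New_York" makes the day boundary midnight Eastern time.
count_run <- function(dir = ".") {
  today <- format(Sys.time(), "%Y-%m-%d", tz = "America/New_York")
  counter_file <- file.path(dir, paste0("run_count_", today, ".txt"))
  n <- if (file.exists(counter_file)) as.integer(readLines(counter_file)[1]) else 0L
  n <- n + 1L
  writeLines(as.character(n), counter_file)
  n
}

# At the top of your .R file:
runs_today <- count_run()
Because the file name changes with the date, the count resets automatically at midnight, and no try...except equivalent is needed.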
I'm currently working on an R Shiny app that uses googlesheets4 to pull in a large dataset from Google Sheets on app launch. Loading this dataset into my app takes ~2 minutes, which stalls the entire app's load time.
The only visual in my app is based on this Google Sheets data, so it is very dependent on this specific dataset. Once the dataset is pulled into my app, it is filtered and becomes much smaller (85,000 rows -> 1,000 rows). The Google Sheets data is updated every day, so I don't have the luxury of downloading it once and storing it as a .csv forever.
There are two different fixes I have tried, both unsuccessfully... curious if anyone has any thoughts.
Have a separate app running. My first idea was to create an entirely separate Shiny app whose sole purpose would be to pull the Google Sheets df once a day. Once pulled, it would do the necessary data cleaning to get it down to ~1,000 rows and then push the smaller df to a different Google Sheet. My original app with the visual would then always reference that new sheet (which takes much less time to load).
The problem I ran into here is that I couldn't figure out how to write a new Google Sheets doc using googlesheets4. If anyone has any idea how to do that, it would be much appreciated.
Temporarily delay the loading of the Google Sheets data, and let the visual populate first. My second idea was to delay the code that pulls in the Google Sheets df upon launch, letting my visual populate first (using old data), and then have the Google Sheets pull happen. Once the pull is complete, the visual would re-populate with the updated data.
I couldn't figure out the best/right way to make this happen. I tried messing around with Sys.sleep() and futures/promises but couldn't get things to work correctly.
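For illustration, the delayed-load idea could be sketched with promises/future roughly like this (the sheet id, cache file, and filter are placeholders, and googlesheets4 authentication would also need to be available to the background session):
library(shiny)
library(promises)
library(future)
plan(multisession)

ui <- fluidPage(plotOutput("visual"))

server <- function(input, output, session) {
  # Start with an old cached copy so the visual renders immediately
  dat <- reactiveVal(readRDS("old_data.rds"))   # hypothetical cache file

  # Pull the fresh sheet in a background R session, then swap it in
  future_promise({
    googlesheets4::read_sheet("big-sheet-id")   # placeholder sheet id
  }) %...>%
    (function(df) dat(df[df$keep_row, ]))       # placeholder filter step

  output$visual <- renderPlot(plot(dat()))
}

shinyApp(ui, server)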
Curious if anyone has any thoughts on my 2 different approaches, or if there's a better approach I'm just not considering...
Thanks!
There is a function called write_sheet that allows you to write data to a google sheet. Does that work for you?
googlesheets4::write_sheet(data = your_data,
ss = spread_sheet_identifier,
sheet = "name_of_sheet_to_write_in")
If you only want to add something without deleting everything in the sheet, the function is sheet_append:
googlesheets4::sheet_append(data = your_data,
ss = spread_sheet_identifier,
sheet = "name_of_sheet_to_write_in")
Not sure you can store the credentials in a safe way, but couldn't you use GitHub Actions? Or alternatively a cron job on your local computer?
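For the cron route, a sketch of a standalone daily refresh script might look like this (the sheet ids, the filter, and the key path are placeholders):
library(googlesheets4)

# Non-interactive auth, e.g. via a service-account key file
gs4_auth(path = "service-account.json")

big   <- read_sheet("big-sheet-id")    # the ~85,000-row source sheet
small <- big[big$keep_row, ]           # placeholder filter down to ~1,000 rows

# Overwrite the small sheet that the visual app reads from
write_sheet(small, ss = "small-sheet-id", sheet = "daily")
Scheduled once a day (e.g. a crontab entry like 0 6 * * * Rscript refresh_sheet.R), the main app then only ever loads the small sheet.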
In short, I need to reset the playhead back to frame 1 at the time the script runs.
I'm creating a script for exporting a handful of comps after updating some text.
One export will be a jpg, and then 2 short videos. I am using app.executeCommand(2104), which is Save Frame As, to add the current frame to the render queue for the jpg export. Otherwise, AE would try to export a jpg sequence even if it is only 1 frame long. This affects the output name and the settings of the export. I haven't found an easy way of avoiding the added formatting.
I will be giving this to inexperienced coworkers, who will surely forget to reset the playhead before hitting my export button. So I was trying to force the playhead back to the beginning within the script.
[Screenshot: Export Changes]
I have tried updating the output render settings with a Time Span Start and Time Span Duration, but that changed it back into a jpg sequence.
I thought I could trick it by creating a new comp and then deleting it, since doing that by hand moves the playhead to zero of the newly created comp. However, when changing focus back to the jpg comp, the playhead jumped back to where it was originally.
I have searched through both of the official Adobe scripting guides and the usual forums, but I haven't found a single command that moves the playhead other than by hand. I'm hoping I just missed something obvious.
The playhead-time-thingy-line is called the CTI (Current Time Indicator) in AE-land. So this should work:
app.project.activeItem.time = 0; // move the CTI of the active comp back to the start
(How) can I move the CTI from within a script?
I had a small but important R file that I have been working on for a few days.
I created and uploaded a list of about 1,000 IDs to SQL Server the other day, and today I was repeating the process with a different type of ID. I save the file frequently, and after adding a couple of lines and saving, I ran the sqlSave() statement to upload the new IDs.
RStudio promptly converted all of my code to gibberish and froze (see screenshot).
After letting it try to finish for several minutes, I closed RStudio and reopened it. It automatically re-opened my untitled text files, where I had a little working code, but it didn't open my main code file.
When I tried to open it, I was informed that the file is 55 megabytes and thus too large to open. Indeed, I confirmed that it really is 55 MB now, and when I open it in an external text editor I see the same gibberish as in the screenshot.
Is there any hope of recovering my code?
I suppose low memory must be to blame. The object and command I was executing at the time were not resource-intensive; however, a few minutes before that I did retrieve an overly large dataframe from SQL Server.
You overwrote your code with a binary representation of your objects with this line:
save.image('jive.R')
save.image saves the R objects in your workspace, not your R script file. To save your script, you can just click File -> Save. To save your objects, you would have to write them to a different file (conventionally one ending in .RData).
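To make the distinction concrete (the file names are just this question's examples):
save.image("jive.R")      # what happened: overwrites the script with a binary workspace image
save.image("jive.RData")  # what was intended: saves the objects to a separate .RData file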
I created a very small script (without saving it) in the R Commander top window, but I only saved the workspace.
When I reload this, I can't see anything that was in the top window originally. My mistake, I know, but is there a way to see any hint of the functions etc. I may have called, from the workspace file? I can see the objects, but not what created them.
If you open a new R session, try hitting the up-arrow key. The normally invisible .Rhistory file is usually loaded at the start of a new session if the prior session ended normally. If the session is open in a GUI, then you may be able to display the list of commands with a menu command. This may also display that file:
loadhistory(file = ".Rhistory")
The history is cumulative, so unless you had a really long session intervening you may still be able to get code going back several sessions. By default it keeps the last 512 entries. See:
?history
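A few related commands may also help (the path shown is the default; R_HISTSIZE is the environment variable that controls the limit):
savehistory(file = ".Rhistory")  # explicitly write out the current session's history
history(max.show = 512)          # display up to the default 512 entries
# To keep more lines across sessions, set R_HISTSIZE in your .Renviron, e.g.:
# R_HISTSIZE=10000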