I created a 2.5 GB matrix in R, and upon its completion I accidentally ran a View command on it in RStudio:
View(Matrix)
RStudio froze and I had to force quit. I lost all the data. Is there any chance that R stored some of the data somewhere? If so, where could I find it? I am using a Mac.
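For anyone hitting the same thing, a safer pattern is to view only a small slice, since View() tries to render the whole object (Matrix is the object name from the question):

```r
# Viewing only the first rows avoids sending the entire 2.5 GB object
# to the data viewer:
View(head(Matrix, 100))

# R only writes .RData when you save the workspace explicitly (or answer
# "yes" on quitting), so after a force quit it is still worth checking
# whether an earlier save exists in the working directory:
file.exists(".RData")
```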
I am running a loop that uploads a CSV file from my local machine, converts it to an h2o data frame, and then runs an h2o model. I then remove the h2o data frame from my R environment and the loop continues. These data frames are massive, so I can only have one loaded at a time (hence my removing the data frame from my environment).
My problem is that h2o creates temporary files which quickly max out my memory. I know I can restart my R session, but is there another way to flush these out in code so my loop can run happily? When I look at my Task Manager, all my memory is consumed by Java(TM) Platform SE Binary.
Removing the object from the R session using rm(h2o_df) will eventually trigger garbage collection in R, and the delete will be propagated to H2O. This isn't ideal, however.
The recommended way is to use h2o.rm, or, for your particular use case, h2o.removeAll seems best (it takes care of everything: models, data, etc.).
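A sketch of how the loop might look with an explicit h2o.rm after each iteration (the directory, target column, and model choice are placeholders, not from the original post):

```r
library(h2o)
h2o.init()

csv_files <- list.files("data", pattern = "\\.csv$", full.names = TRUE)

for (f in csv_files) {
  df <- h2o.importFile(f)                              # load one frame at a time
  model <- h2o.glm(y = "target", training_frame = df)  # placeholder model

  # ... use the model ...

  h2o.rm(df)     # free the frame on the H2O cluster immediately
  h2o.rm(model)  # and the model, once it is no longer needed
}

# Alternatively, wipe everything on the cluster between iterations:
# h2o.removeAll()
```

Calling h2o.rm directly frees the memory on the H2O side right away instead of waiting for R's garbage collector to get around to it.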
I am working on placing an rpivotTable inside a Shiny app. When I try it on test data (a data frame with 1,000 rows) I am able to run my app from the command line, and others can access the app at my ip:port as expected. However, when I increase the size of the data frame being fed into rpivotTable, the app 'greys' out and I'm not able to serve it to others.
I have also successfully tested this same app by spinning up an EC2 instance and upping the instance type, but the same thing would happen. I was getting an error similar to the one in the post "ERROR: [on_request_read] connection reset by peer in R shiny" and in this GitHub issue: https://github.com/rstudio/shiny/issues/1469 ("ERROR: [on_request_read] connection reset by peer").
My syntax is pretty straightforward in terms of calling and rendering the rpivotTable, but as the size of the data frame increases, my app stops working. My suspicion is that this is a timeout parameter in the JavaScript widget?
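For reference, the standard calling/rendering pattern looks roughly like this (the data frame is a placeholder; the real app uses a much larger one):

```r
library(shiny)
library(rpivotTable)

df <- mtcars  # placeholder for the large data frame

ui <- fluidPage(
  rpivotTableOutput("pivot")
)

server <- function(input, output) {
  output$pivot <- renderRpivotTable({
    # rpivotTable serializes the entire data frame into the page,
    # so the payload grows with nrow(df)
    rpivotTable(df)
  })
}

shinyApp(ui, server)
```

Because the whole data frame is shipped to the browser, pre-aggregating or subsetting before calling rpivotTable keeps the payload small regardless of instance size.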
I had the same problem and had to upgrade from t3a.medium to t3a.large. That's more than I wanted to pay, but it works now.
Every now and then I have to run a function that takes a long time, and I need to interrupt it before it completes. To do so, I click the red "stop" sign at the top of the console in RStudio, which quite often returns the message below:
R is not responding to your request to interrupt processing so to stop the current operation you may need to terminate R entirely.
Terminating R will cause your R session to immediately abort. Active computations will be interrupted and unsaved source file changes and workspace objects will be discarded.
Do you want to terminate R now?
The problem is that I click "No" and then RStudio seems to freeze completely. I would like to know if others face a similar issue and if there is any way to work around it.
Is there a way to stop a process in RStudio quickly without losing the objects in the workspace?
Unfortunately, RStudio is currently unable to interrupt R in a couple of situations:
R is executing an external program (e.g. you cannot interrupt system("sleep 10")),
R is executing (for example) a C / C++ library call that doesn't provide R an opportunity to check for interrupts.
In such a case, the only option is to forcefully kill the R process -- hopefully this is something that could change in a future iteration of RStudio.
EDIT: RStudio v1.2 should now better handle interrupts in many of these contexts.
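A sketch of the difference, and a partial workaround for external programs (Unix-style sleep assumed):

```r
# Sys.sleep() periodically checks for interrupts, so RStudio's Stop
# button works on it:
Sys.sleep(10)

# system() blocks inside the external process, and R never gets a chance
# to check for interrupts. Launching the command without waiting keeps
# the console responsive (at the cost of not capturing its result):
system("sleep 10", wait = FALSE)
```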
This can happen when R is busy outside of R itself, for example invoking an external library call. The only option is to close the project window. Fortunately, unsaved changes, including objects, are retained when RStudio is opened again.
Is it possible to run a command when exiting an R session, similar to the commands in the .Rprofile file, but on leaving the session instead?
I know, of course, that an .RData file can be stored automatically, but since I often switch machines, which might have different storage settings, it would be easier to execute a custom save.image() command per session.
The help page for q() gives some hints: you can either create a function called .Last or register a finalizer on an environment to run on exit.
> reg.finalizer(.GlobalEnv,function(e){message("Bye Bye")},onexit=TRUE)
> q()
Save workspace image? [y/n/c]: n
Bye Bye
You can register the finalizer in your R startup file (e.g. .Rprofile) if you want it to be fairly permanent.
[edit: previously I registered the finalizer on a new environment, but that meant keeping the object around and not removing it, because garbage collection would trigger the finalizer. As now written, the finalizer is hooked onto the global environment, which shouldn't get garbage collected during normal use.]
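The .Last route mentioned above can be sketched like this (the save path is a placeholder; adapt it per machine):

```r
# Put this in your .Rprofile (or define it interactively); R calls .Last
# automatically when the session ends.
.Last <- function() {
  backup <- file.path(Sys.getenv("HOME"), "r_session_backup.RData")  # placeholder path
  try(save.image(file = backup))
  message("Workspace saved to ", backup)
}
```

Unlike the finalizer approach, .Last is looked up by name at quit time, so there is no object whose garbage collection could fire the hook early.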
Out of the blue, I started getting an "R Session Disconnected" message when opening RStudio, and even after pressing the Reconnect button I get an "R Session Aborted" error message, followed by a restart of RStudio that re-enters this Session Disconnected loop.
In that first message there is an indication that "this browser was disconnected from the R session because another browser connected". I did not make any change in my environment that would explain such an error. I have already uninstalled and reinstalled R and RStudio, and the errors continue.
I'm using OneDrive to store some files. I don't know whether this may be associated with the problem, but I did not change anything in OneDrive that could be related either.