Recently I closed an R script and did NOT save the changes (i.e. I lost almost all my work...).
I would like to know if there is some option to recover my previous version of that script.
This script is not associated with any R project.
How can I recover unsaved R code?
I found a similar question, but given my limited knowledge of R I am not able to replicate the steps. Could someone explain, in as much detail as possible, how (if it is possible) to recover my previous R script?
I looked in the R folder but found nothing. I also tried in "Documents", with the same result (nothing).
I really appreciate the help.
This isn't a major issue, but I still thought I would ask.
I've been cleaning some data for a project at work, and there's a point in the process where I save each of the individual files I've cleaned as a CSV in long format. I noticed with some of the files that if I open them, some cells that SHOULD have data appear blank. If I use the "Clear All Formats" option, the data appears. The files read into R just fine and this hasn't caused any issues, but I still think it's weird.
Has anyone else run into this, and if so, was there a way to resolve it without going through each column? The files I'm cleaning start out with all sorts of formatting, so I'm curious if that could be the cause. I thought a CSV doesn't save formats, though, so I'm a little confused.
Again, not the biggest deal but slightly annoying and I'll get questions about it if my colleagues ever take a look at these files.
The data is proprietary, and I'm not exactly sure how I would share it, but I'm using a pretty straightforward write_csv(data, "path.csv").
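For reference, a minimal sketch of that write step with placeholder data and a placeholder path; reading the raw file back shows that the CSV itself stores only plain text, with no cell formats:

library(readr)

data <- data.frame(id = 1:3, value = c(0.1, 0.2, 0.3))  # stand-in for the cleaned data
write_csv(data, "path.csv")    # plain-text output; no Excel formatting is written
readLines("path.csv", n = 4)   # inspect the raw lines to confirm the values are there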
I think I figured out the solution to this issue, and I wanted to share in case anyone else runs into this.
I'm using a Windows computer, which needed an update. That got me thinking, and I realized I also needed to update my version of RStudio. I'm not sure what would have caused this issue, but when I re-run those files, the issue appears to be resolved.
When closing RStudio at the end of an R session, I am asked via a dialog box: "Save workspace image to [working directory]?"
What does that mean? If I choose to save the workspace image, where is it saved? I always choose not to save the workspace image; are there any disadvantages to saving it?
I looked on Stack Overflow but did not find posts explaining what the question means. I only found a question about how to disable the prompt (with no simple answers...): How to disable "Save workspace image?" prompt in R?
What does that mean?
It means that R saves a list of objects in your global environment (i.e. where your normal work happens) into a file. When R next loads, this list is by default restored (at least partially — there are cases where it won’t work).
A consequence is that restarting R does not give you a clean slate. Instead, your workspace is cluttered with existing stuff, which is generally not what you want. People then resort to all kinds of hacks to try to clean their workspace. But none of these hacks are reliable, and none are necessary if you simply don’t save/restore your workspace.
If I choose to save the workspace image, where is it saved?
R creates a (hidden) file called .RData in your current working directory.
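For illustration, here is a minimal base-R sketch of what that prompt effectively does (the object names are placeholders; ".RData" and the working directory are the defaults):

x <- 1:10
fit <- lm(x ~ 1)
save.image(file = ".RData")   # roughly what answering "Save" does
rm(list = ls())               # simulate starting over with an empty workspace
load(".RData")                # roughly what R does on startup if .RData exists
ls()                          # "fit" and "x" are back in the global environment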
I always choose not to save the workspace image, are there any disadvantages to save it?
The advantage is that, under some circumstances, you avoid recomputing results when you continue your work later. However, there are other, better ways of achieving this. On the flip side, starting R without a clean slate has many disadvantages: Any new analysis you now start won’t be in a clean room, and it won’t be reproducible when executed again.
So you are doing the right thing by not saving the workspace! It's one of the rules of creating reproducible R code. For more information, I recommend Jenny Bryan's article on using R with a Project-oriented workflow.
But having to manually reject saving the workspace every time is annoying and error-prone. You can disable the dialog box in the RStudio options.
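If you prefer to change that setting from code rather than through Tools > Global Options, the usethis package ships a helper for it; a hedged one-liner, assuming usethis is installed:

usethis::use_blank_slate(scope = "user")   # tell RStudio to never save/restore the workspace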
The workspace will include any of your saved objects, e.g. data frames, matrices, functions, etc.
Saving it to your working directory will allow you to load it back in the next time you open RStudio, so you can continue exactly where you left off. There is no real disadvantage if you can recreate everything from your script next time and your script doesn't take a long time to run.
The only thing I have to add here is that you should seriously consider that some people may be working on ongoing projects, i.e. things that aren't accomplished in one day, and thus may want to save their workspace image so as to not start from the beginning again.
I think best practice is: it's OK to save your workspace, but your code only really works if you can clear your entire workspace and then rerun it completely with no errors!
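One hedged way to check that your code passes that test is to run the script in a completely fresh session; a minimal sketch, where "analysis.R" is a placeholder for your own script:

# --vanilla skips restoring any .RData, so this only succeeds if the script
# really recreates everything it needs from scratch.
system("Rscript --vanilla analysis.R")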
I have a rather simple question that I can't seem to find an answer for.
I essentially want to know if there is code that provides a status for the code we are running. As an example, I am running code that usually takes a while, and I would like to know whether the job is 25%-50%-75% complete. This is the kind of feature we get with software from large companies ($$).
Does anyone use anything when working with R?
I use RStudio, if this is pertinent.
Thanks
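For what it's worth, base R ships a simple text progress bar that can show the 25/50/75% marks described above; a minimal sketch, where the loop body is a placeholder for the real work:

n <- 200
pb <- txtProgressBar(min = 0, max = n, style = 3)  # style 3 prints a percentage
for (i in seq_len(n)) {
  Sys.sleep(0.01)             # placeholder for the actual long-running step
  setTxtProgressBar(pb, i)    # update the bar after each iteration
}
close(pb)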
Say I enter the following commands into the console in RStudio:
x=seq(0,1e11, by=.01)
plot(x,sin(x))
Clearly this is a very silly thing to do. But my question is how do you terminate this process? I couldn't find this answered anywhere.
Attempted solutions: pressing Ctrl+Q, pressing Esc, going to Session -> Interrupt R, going to Session -> Terminate R. Nothing seems to work. This seems to be specific to plotting; for example, if you run a stupidly large loop, most of the above options seem to work as expected.
Ideally I'd like a solution that doesn't lose the R script I have been working on (as I haven't saved it in a while).
Re-posting my comment as an answer since it seems to have solved your problem.
Save early, save often, cry less.
Try clicking the little red stop icon above the console panel (unlikely to work if you've done all that you've done already).
Try copy/pasting the script out to a text editor.
Try killing just the rsession process through your OS (might leave RStudio open). The good news is that RStudio is often pretty smart about backing up working copies of scripts, so you might find it's still there even if you have to kill the whole program.
Entering dev.off() in the console will kill any open plots, and I find it's less likely to crash RStudio than some of the options given by Ari B. Friedman.
That said, save early save often is always sound advice.
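For completeness, closing the graphics device(s) from the console looks like this in base R (only run it if you actually want to discard the open plots):

dev.off()                               # close the current graphics device
while (!is.null(dev.list())) dev.off()  # or close every open device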
Possible Duplicate:
Workflow for statistical analysis and report writing
I have not been programming with R for very long, but I am running into a project-organization question that I was hoping somebody could give me some tips on. I find that a lot of the analysis I do is ad hoc: that is, I run something, think about the results, tweak it, and run some more. This is conceptually different from a language like C++, where you think through the entire thing you want to run before coding. That is a huge benefit of interpreted languages. However, the issue that comes up is that I end up with a lot of .RData files that I save so I don't have to source my script every time. Does anyone have any good ideas about how to organize my project so I can return to it a month later and have a good idea of what each file is associated with?
This is sort of a documentation question, I guess. Should I document my entire project at each leg and be rigorous about cleaning up files that are no longer necessary but were a byproduct of the research? This is my current system, but it is a bit cumbersome. Does anyone else have any other suggestions?
Per the comment below: one of the key things that I am trying to avoid is the proliferation of .R analysis files and the .RData sets that go along with them.
Some thoughts on research project organisation here:
http://software-carpentry.org/4_0/data/mgmt/
the take-home message being:
Use Version Control for your programs
Use sensible directory names
Use Version Control for your metadata
Really, Version Control is a good thing.
My analysis is a knitr document, with some external .R files which are called from it.
All data is in a database, but during my analysis the processed data are saved as .RData files. They are recreated from the database only when I delete the .RData files and run the analysis again. It works like a cache and saves database access and data-processing time when I rerun (parts of) my analysis.
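A minimal sketch of that caching pattern, with hypothetical file and object names and a placeholder in place of the real database query:

cache_file <- "processed_data.RData"

if (file.exists(cache_file)) {
  load(cache_file)                       # reuse the cached result, skip the slow step
} else {
  processed <- data.frame(id = 1:5, value = rnorm(5))  # placeholder for the query + processing
  save(processed, file = cache_file)     # cache it for the next run of the analysis
}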
Using a knitr (Sweave, etc.) document for the analysis enables you to easily write a documented workflow with the results included. And knitr caches the results of the analysis, so small changes usually do not result in a full rerun of all the R code, but only of a small section. That saves quite a bit of running time for a bigger analysis.
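If you want that behaviour for every chunk, knitr's cache can be switched on globally; a hedged one-liner for the document's setup chunk, assuming knitr is installed:

knitr::opts_chunk$set(cache = TRUE)   # rerun only chunks whose code has changed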
(Ah, and as said before: use version control. Another tip: working with knitr and version control is very easy with RStudio.)