I generate a report in R Markdown, where I use the option
cache=TRUE
in order to save the simulated data frames generated in different chunks. But when I go to the html folder where the data is stored, even though I see the .RData files and their corresponding sizes, which make sense (about 3.1 kB), when I load the files in RStudio there is nothing: the global environment remains empty.
I really have no idea why I cannot see the data frames; any hint is appreciated.
Related
Sir, I am a student learning R. I have a question about how to store data in R, and how to retrieve data that has been erased.
Sir,
Using RStudio is not that different from using, say, Word or Notepad, though there are a few differences.
First, the similarities:
If you do not save your R script or data, it might not be available after you restart RStudio, or if you overwrite/erase your data.
The advantage of using R and RStudio is that you can script how you load and manipulate your data, and hence recreate it, provided you use a script and do not rely only on the console (interactive) part.
As for the differences: RStudio can be set to save your current workspace, which is where all loaded data and variables reside. To change this setting, go to "Tools" --> "Global Options", where you will find the relevant options.
However, if you erase your data, either by overwriting it with other values or by removing it with rm(), the data is lost. Your only recourse is to retrace how it was loaded/modified, using either your script or the "History" pane.
For saving data, see e.g. http://www.sthda.com/english/wiki/saving-data-into-r-data-format-rds-and-rdata. Note the difference between save() and saveRDS(): the former saves objects together with their variable names, whereas saveRDS() saves a single object without its name, so it must be assigned to a variable when read back.
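A minimal sketch of that difference, using a made-up data frame:

```r
# Hypothetical example: 'df' stands in for your own data.
df <- data.frame(x = 1:3, y = c("a", "b", "c"))

# save() stores the object together with its name; load() restores it
# into the current environment under that same name ("df").
save(df, file = "df.RData")
load("df.RData")            # recreates 'df'

# saveRDS() stores only the object; readRDS() returns it, so you assign
# it to whatever name you like.
saveRDS(df, file = "df.rds")
my_df <- readRDS("df.rds")
```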
I have a list whose elements are several data frames, which looks like this:
Because it is hard for another user to use these data by re-running my original code, I would like to export the list. As the screenshot shows, the data frames in that list have different numbers of rows. I am wondering if there is any method to export it to a file without losing any information, so that it can be used again in RStudio. I have tried to save it as RData, but I don't know how to keep all the information.
Thanks a lot
To output objects in R, here are 4 common methods:
dput() writes a text representation of an R object
This is very convenient if you want to allow someone to get your object by copying and pasting text (for instance on this site), without having to email or upload and download a file. The downside, however, is that the output is long, and re-reading the object into R (simply by assigning the copied text to an object) can hang R for large objects. This works best for creating reproducible examples. For a list of data frames, this would not be a very good option.
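A rough illustration with a small built-in data frame:

```r
# Print a text representation of a small data frame to the console;
# copying that output and assigning it to a name recreates the object.
dput(head(iris))
# The copied text would then be used like:
# iris_head <- structure(list(Sepal.Length = c(5.1, 4.9, ...)), ...)
```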
You can write an object to a .csv, .xlsx, etc. file with write.table(), write.csv(), readr::write_csv(), xlsx::write.xlsx(), etc.
While the file can then be used by other software (and re-imported into R with read.csv(), readr::read_csv(), readxl::read_excel(), etc.), the data can be transformed in the process, and some objects cannot be written to a single file without prior modification. So this is not ideal in your case either.
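A small example of the kind of silent change a round trip can introduce (using the built-in iris data):

```r
# Write a data frame to CSV and read it back; column types such as
# factors (and dates, row names, ...) may not survive unchanged.
write.csv(iris, "iris.csv", row.names = FALSE)
iris2 <- read.csv("iris.csv")
str(iris2)  # on R >= 4.0, Species comes back as character, not factor
```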
save.image() saves your entire workspace (objects + environment)
The workspace can then be recreated with load(). This can be useful, but here you are only interested in saving one object. In that case, it is preferable to use:
saveRDS(), which writes a single object to a file
The object can then be re-created with readRDS(). This is the best option for saving an R object to a file without any modification and re-creating it later.
In your situation, this is definitely the best solution.
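For a list of data frames like yours, a minimal sketch could look like this ('my_list' is a placeholder for your own object):

```r
# Hypothetical list of data frames with different numbers of rows.
my_list <- list(a = data.frame(x = 1:3),
                b = data.frame(y = 1:5, z = 6:10))

# Write the whole list to a single file without losing any structure.
saveRDS(my_list, file = "my_list.rds")

# Another user (or a later session) recreates it under any name.
my_list_again <- readRDS("my_list.rds")
identical(my_list, my_list_again)  # TRUE
```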
I am trying to find out how I can see the data within a dataset with a .RData extension.
I tried view(); it showed me one object present in the dataset, but I know that this is a large dataset, over 300 MB in size, consisting of a very large list of names. I need to view all of its contents and have been unsuccessful so far.
Should I convert it into a CSV instead in order to view all of the contents? If so, how can I do that using RStudio?
The cross-platform function is View(); capitalization matters in R. If you do:
obj <- load("filename.Rdata") # assuming the file exists in your working directory
Then type:
obj
You should see a printed character vector with the names of the objects created (or possibly overwritten) in your global environment. The RStudio aspect of this question does not affect the result.
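If you then want to inspect or export everything, a sketch along these lines (the file name is the one assumed above) loads the objects into a separate environment and writes each data frame out as a CSV:

```r
e <- new.env()
objs <- load("filename.Rdata", envir = e)  # returns the object names

for (nm in objs) {
  x <- get(nm, envir = e)
  # View(x) would open each object in the RStudio viewer; here we
  # export the data frames instead.
  if (is.data.frame(x)) {
    write.csv(x, paste0(nm, ".csv"), row.names = FALSE)
  }
}
```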
I have many projects where I'm required to produce PDF figures, and these go into git and svn repositories.
However, when a PDF is generated in R, it has a different checksum every time. The same happens when creating Excel sheets with write.xlsx(). So the repositories become cluttered with "changes" that are not real changes.
I imagine that some metadata is added (maybe a timestamp?). Is there a way to strip this from the PDF so that the checksum remains the same every time I re-generate it?
Found a solution for PDFs: instead of using pdf(), use cairo_pdf(). The md5sum is then stable.
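A minimal sketch of the swap (the plot itself is just an example):

```r
# pdf() embeds metadata such as a creation timestamp, so the checksum
# differs on every run; cairo_pdf() produced a stable md5sum here.
cairo_pdf("figure.pdf", width = 7, height = 5)
plot(mtcars$wt, mtcars$mpg, xlab = "Weight", ylab = "MPG")
dev.off()
```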
So I am having an R nightmare. I've returned to a project I built under the previous iteration (or perhaps the one before that) of RStudio. I produced a workable report that I was asked to update, and my current bugbear wasn't around then. Here is what happens:
My report file is "ISS Time Series.Rmd". It calls three other files:
"mystyles.sty", which updates the LaTeX preamble to use some additional packages.
"functions.R" and "load.R". The former contains frequently used functions I've written, and the latter loads the data I'm using.
I source the two .R files in the .Rmd file. When I try to knit the report, whether I get an error or am successful, my two .R files and my one .sty file are deleted. And not just deleted -- gone for good.
I do not know what is up. I have ruined my previous work simply by returning to examine the original file.
Please, somebody has to help me here. My workflow is shot to hell if I have to write every last bit of code over and over again in each report.
UPDATE: Even copying the files to another directory doesn't help.
Here is the code block that calls the "load.R" file:
```{r loaddata}
#
# ------- Load Data
#
# This section loads the ISS survey files one at a time and saves them as
# read.SPSS objects within a list. It names these eleven objects as "ISS 2002",
# "ISS 2003", etc... until "ISS 2012". This file may be prohibitively large.
#
source("load.R") # Loads the ISS Survey files
```
Rename your file to ISS_Time_Series.Rmd and try again.
It is the spaces in the document name that make rmarkdown::render() delete the files that have been loaded or sourced.
An issue has already been filed; see https://github.com/rstudio/rmarkdown/issues/580