I don't know how to deal with save.image() and saveRDS() with raster data in R. I understand that the raster package opens a connection to the image file via the raster() function, so it doesn't really load the file into the R workspace.
I want to save my workspace (data.frame, list, raster, etc.) with the save.image() function (or similar) and open it on a different computer. If I try to plot or process a raster object saved this way on a different computer, I always get the same error:
Error in .local(.Object, ...) :
`C:\path\to\file.tif' does not exist in the file system,
and is not recognised as a supported dataset name.
Is there a way to save a raster object (opened as an external file) in an R format? I don't mean a raster format such as GeoTIFF, grid or others.
At your own risk, you can use the readAll function to load the raster into memory before saving, e.g.:
r <- raster(system.file("external/test.grd", package="raster"))
r <- readAll(r) # force data into memory
save(r, file = 'r.RData')
It can then be loaded on a different machine as mentioned:
load('r.RData')
Beware: this will be problematic for very large rasters on memory-limited systems.
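If you want to confirm the save will be self-contained, a quick check before saving (inMemory() is from the raster package, object.size() from base R):
inMemory(r)                            # TRUE means the values were read into RAM
format(object.size(r), units = "Mb")   # rough size of the in-memory object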
You can save rasters, like other R objects, using the save command.
save(r, file="r.RData")
On a different computer, you can load that file using
load("r.Rdata")
which will bring back the raster r in your workspace.
I have tried this across Windows and Linux and it has never given me problems.
Related
I am running a species distribution model using R on a server. I am saving my entire R environment; however, when I try to visualize and plot the saved objects I get an error:
Error in file(fn, "rb") : cannot open the connection
In addition: Warning message:
In file(fn, "rb") : cannot open file '/localscratch/anandam.9761522.0/RtmpnCnH0y/raster/r_tmp_2020-07-13_195526_260024_04625.gri': No such file or directory
It seems that some important information was saved in a temporary directory and I cannot access it after the analysis is done. Is that right?
One possible solution seems to be raster::readAll (read all values from a raster file associated with a Raster* object into memory). However, when I use it I get "Error: cannot allocate vector of size 12.5Gb". I tried to extend the memory allocated to R on the cluster, but the limit does not seem to change. When I do the same on my MacBook Pro it works after raising the limit with ulimit. Is there another way to increase the memory available to R on Linux? Or is there another way to save all my R objects without relying on a temporary directory, so that I can recover everything after the analysis has finished?
# List the GeoTIFF files with the selected predictors
predictors1 <- list.files(path="/home/.../predictors_test", pattern=".tif", full.names = TRUE)
# Create a raster object from each file in the list
predictors2 <- lapply(predictors1, raster)
# Stack the predictors
Predictors <- stack(predictors2)
# Read all values of the Predictors stack into memory
Predictors <- readAll(Predictors)
You say you are "saving all my R environment" --- that is a convenient shortcut that generally works, but I think it is almost always a bad idea. Things are much clearer if instead you explicitly save the data you need to keep to files, and read them back as needed. In most cases you can use saveRDS/readRDS.
You cannot use saveRDS for Raster* objects that point to a file in the temp folder. All you would be saving is an object that points to a file that disappears when your session ends (and that is why saving sessions does not work either).
The best approach is to avoid these temp files by using the filename= argument that most functions in the raster package provide, at least for the last processing step. Alternatively, you can use writeRaster to save what you want to keep after you are done processing.
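A minimal sketch of both options (the mask layer study_area and the file names are made up for illustration):
library(raster)
# Write the result of a processing step straight to a file you control,
# instead of letting raster spill it into the session temp folder
pred_masked <- mask(Predictors, study_area, filename = "predictors_masked.grd", overwrite = TRUE)
# Or, once processing is done, save an existing Raster* object explicitly
writeRaster(Predictors, filename = "predictors.grd", overwrite = TRUE)
# Later, or on another machine, rebuild the object from the saved file
Predictors <- stack("predictors.grd")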
I would like to create a RasterLayer from GeoTIFF data provided by a server. I query the server for this data using an httr::GET call (the data is provided on demand, so in the real application there won't be a URL ending in .tif but a query URL).
After writing the result of this call to disk as a GeoTIFF file, it is easy enough to create the RasterLayer from the resulting file:
library(httr)
library(raster)
url <- 'http://download.osgeo.org/geotiff/samples/gdal_eg/cea.tif'
geotiff_file <- tempfile(fileext='.tif')
httr::GET(url,httr::write_disk(path=geotiff_file))
my_raster <- raster(geotiff_file)
my_raster
However, I would like to skip the write-to-disk step and create the raster straight from the in-memory server response:
response <- httr::GET(url,httr::write_memory())
response
The content of the response is a raw vector which I would need to interpret as GeoTIFF data.
str(httr::content(response))
However, I can only find raster and rgdal functions that read from a file. Any suggestions on turning this raw vector into a raster?
Thanks!
GDAL has some cool virtual file system drivers, one of which is /vsicurl, which
allows on-the-fly random reading of files available through HTTP/FTP web protocols, without prior download of the entire file. It requires GDAL to be built against libcurl.
Since the raster package builds on rgdal, you can simply do this:
library(raster)
r <- raster('/vsicurl/http://download.osgeo.org/geotiff/samples/gdal_eg/cea.tif')
plot(r)
For my processing in R I want to read in a 20 gigabyte file. It is an XML file.
In R I cannot load it with readOGR since it is too big; it gives me the error "cannot allocate vector of size 99.8 Mb".
Since my file is too big, the logical next step in my mind would be to split it. But since I cannot open it in R, or in any other GIS package at hand, I cannot split the file before loading it. I am already using the best PC available to me.
Is there a solution?
UPDATE BECAUSE OF COMMENT
If I use head(), my line looks like the one below. Unfortunately it does not work.
headfive <- head(readOGR('file.xml', layer = 'layername'),5)
I'm new to R in QGIS. I could write a simple script, and I want to obtain the resulting table, i.e. the table that R uses to create the plot graphics.
How can I do that?
This is the script:
##Point pattern analysis=group
##Layer=vector
##Titulo=string
##showplots
library("maptools")
library("spatstat")
K <- Kest(as.ppp(Layer))
plot(K, main=Titulo)
Can anyone help me?
The QGIS Processing module runs each R script in a separate R session. If you want to keep anything created, you need to save it to a file inside your script, for example:
save(K,file="K.RData")
Then in another R session you can do:
load("K.RData")
library(spatstat)
and now K is restored.
You might want to pass the save file name as another parameter to your Processing script, or you may prefer not to do this further work in QGIS at all...
If you want to save this as a DBF file, there's a problem caused by the fact that K is a special kind of data frame - use write.dbf(as.data.frame(K),"/path/to/K.dbf") to convert it to a plain data frame for writing. This will lose some of the information, such as labels and names of the various components, but you can't store irregular data in a DBF.
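For completeness, a small sketch of that DBF route (write.dbf comes from the foreign package; the output path is just an example):
library(foreign)
load("K.RData")                # restores K saved from the Processing run
K_plain <- as.data.frame(K)    # drop the fv-specific attributes
write.dbf(K_plain, "/path/to/K.dbf")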
I've got a netCDF data file. The file can be downloaded at:
http://www.nodc.noaa.gov/OC5/WOA09/pr_woa09.html
It's the one called WOA09 netCDF version.
I want to use only the climatology data (variable 4) and the first depth range [ , , 1], so I've used the code below so far in R. I now want to have the subset called MyData in RData format.
I want to convert it to RData to be able to play around with it in R. I haven't found anything on the internet about doing this; is it even possible? How?
Thank you so much if you can help! And let me know if I haven't given enough info.
library(ncdf)
MyFile <- open.ncdf("/home/et1211/wspd.mean.nc")
var4 <- MyFile$var[[4]]
MyData <- get.var.ncdf(MyFile, var4)
MyData <- MyData[,,1]
It's simple: you just save the object from R, using .RData as the extension:
save(MyData, file="myNCDFdata.RData")
Or else you can read the ncdf data into an empty workspace, do whatever transformations you need, then quit R and click OK to save the workspace.
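In a later session (or on another machine) you can get the subset back with load(); a minimal example, assuming the file sits in the working directory:
load("myNCDFdata.RData")   # restores the object under its original name, MyData
str(MyData)                # MyData is now a plain numeric array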