Export khrud object from kernelUD to raster (GeoTIFF) in R

In R, how can I export a khrud object from function kernelUD in package adehabitat to a raster file (geoTiff)?
I tried following this thread (R: how to create raster layer from an estUDm object) using the code here:
writeRaster(raster(as(udbis1,"SpatialPixelsDataFrame")), "udbis1.tif")
where udbis1 is a khrud object, but I get: Error in as(udbis1, "SpatialPixelsDataFrame") : no method or default for coercing "khrud" to "SpatialPixelsDataFrame".
I think the issue may be that the old thread predates an update to the adehabitat package that changed the object class from estUD to khrud. Maybe?

You do not provide a reproducible example. The following works for me:
library(adehabitatHR)
library(raster)
data(puechabonsp)
loc <- puechabonsp$relocs
ud <- kernelUD(loc[, 1])
r <- raster(as(ud[[1]], "SpatialPixelsDataFrame"))
writeRaster(r, filename = file.path(tempdir(), "ud1.tif"))
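If ud contains estimates for several animals (an estUDm object, as in this example), the same coercion can be applied to each element in turn. A minimal sketch along those lines, not part of the original answer, with illustrative file names:
# Continues from the example above (adehabitatHR and raster already loaded);
# write one GeoTIFF per animal, using names(ud) as the animal identifiers
for (i in seq_along(ud)) {
  r_i <- raster(as(ud[[i]], "SpatialPixelsDataFrame"))
  writeRaster(r_i, filename = file.path(tempdir(), paste0("ud_", names(ud)[i], ".tif")), overwrite = TRUE)
}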

The adehabitatHR solutions work well when the data are already in the required format or when you have multiple animals, but creating a KDE from data organized differently, or for only one animal, can be frustrating. For some reason @johaness' answer doesn't work for my case, so here is an alternative solution that avoids the headaches of digging into adehabitatHR's innards.
library(adehabitatHR)
library(raster)
# Recreating an example for only one animal
# with a basic xy dataset like one would get from tracking
data(puechabonsp)
loc <- as.data.frame(puechabonsp$relocs)
loc <- loc[loc$Name == "Brock", ]
coordinates(loc) <- ~X+Y
ud <- kernelUD(loc)
# Extract the UD values and coordinates into a data frame
udval <- data.frame("value" = ud$ud, "lon" = ud@coords[,1], "lat" = ud@coords[,2])
coordinates(udval) <- ~lon+lat
# coerce to SpatialPixelsDataFrame
gridded(udval) <- TRUE
# coerce to raster
udr <- raster(udval)
plot(udr)
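To get back to the original question (a GeoTIFF on disk), the resulting RasterLayer can then be written out just as in the first answer; the file name here is only an example:
# Write the single-animal UD raster to a GeoTIFF
writeRaster(udr, filename = "ud_brock.tif", format = "GTiff", overwrite = TRUE)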

Related

How to streamline and speed up a loop using getData (raster package) in R

I am trying to download high-resolution climate data for a bunch of lat/long coordinates, and combine them into a single dataframe. I've come up with a solution (below), but it will take forever with the large list of coordinates I have. I asked a related question on the GIS StackExchange to see if anyone knew of a better approach for downloading and merging the data, but I'm wondering if I could somehow just speed up the operation of the loop? Does anyone have any suggestions on how I might do that? Here is a reproducible example:
# Download and merge 0.5 minute MAT/MAP data from WorldClim for a list of lon/lat coordinates
# This is based on https://emilypiche.github.io/BIO381/raster.html
# Make a dataframe with coordinates
coords <- data.frame(Lon = c(-83.63, 149.12), Lat=c(10.39,-35.31))
# Load package
library(raster)
# Make an empty dataframe for dumping data into
coords3 <- data.frame(Lon=numeric(), Lat=numeric(), MAT_10=numeric(), MAP_mm=numeric())
# Get WorldClim data for all the coordinates, and dump into coords3
for(i in seq_along(coords$Lon)) {
r <- getData("worldclim", var="bio", res=0.5, lon=coords[i,1], lat=coords[i,2]) # Download the tile containing the lat/lon
r <- r[[c(1,12)]] # Reduce the layers in the RasterStack to just the variables we want to look at (MAT*10 and MAP_mm)
names(r) <- c("MAT_10", "MAP_mm") # Rename the columns to something intelligible
points <- SpatialPoints(na.omit(coords[i,1:2]), proj4string = r@crs) # give lon,lat to SpatialPoints
values <- extract(r,points)
coords2 <- cbind.data.frame(coords[i,1:2],values)
coords3 <- rbind(coords3, coords2)
}
# Convert MAT*10 from WorldClim into MAT in Celcius
coords3$MAT_C <- coords3$MAT_10/10
Edit: Thanks to advice from Dave2e, I now first make a list, put the intermediate results in the list, and bind them together at the end. I haven't timed this yet to see how much faster it is than my original solution. If anyone has further suggestions on how to improve the speed, I'm all ears! Here is the new version:
coordsList <- list()
for(i in seq_along(coordinates$lon_stm)) {
r <- getData("worldclim", var="bio", res=0.5, lon=coordinates[i,7], lat=coordinates[i,6]) # Download the tile containing the lat/lon
r <- r[[c(1,12)]] # Reduce the layers in the RasterStack to just the variables we want to look at (MAT*10 and MAP_mm)
names(r) <- c("MAT_10", "MAP_mm") # Rename the columns to something intelligible
points <- SpatialPoints(na.omit(coordinates[i,7:6]), proj4string = r@crs) # give lon,lat to SpatialPoints
values <- extract(r,points)
coordsList[[i]] <- cbind.data.frame(coordinates[i,7:6],values)
}
coords_new <- dplyr::bind_rows(coordsList) # bind_rows() comes from the dplyr package
Edit2: I used system.time() to time the execution of both of the above approaches. When I did the timing, I had already downloaded all of the data, so the download time isn't included in my time estimates. My first approach took 45.01 minutes, and the revised approach took 44.15 minutes, so I'm not really seeing a substantial time savings by doing it the latter way. Still open to advice on how to revise the code so I can improve the speed of the operations!
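For reference, a minimal sketch of the timing pattern described above, wrapped around the list-based loop and using the coords data frame from the reproducible example (the revised loop itself is unchanged):
# Time the whole download/extract loop; "elapsed" is wall-clock seconds
timing <- system.time({
  coordsList <- list()
  for (i in seq_along(coords$Lon)) {
    r <- getData("worldclim", var = "bio", res = 0.5, lon = coords[i, 1], lat = coords[i, 2])
    r <- r[[c(1, 12)]]
    names(r) <- c("MAT_10", "MAP_mm")
    points <- SpatialPoints(na.omit(coords[i, 1:2]), proj4string = r@crs)
    coordsList[[i]] <- cbind.data.frame(coords[i, 1:2], extract(r, points))
  }
  coords_new <- do.call(rbind, coordsList)
})
timing["elapsed"] / 60 # minutes, comparable to the 44-45 minute figures above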

Convert SpatialCollections to SpatialPolygonsDataFrame in R

I am struggling to convert an object of class SpatialCollections to a SpatialPolygonsDataFrame object.
My input files are both shapefiles and SpatialPolygonsDataFrame objects. They can be accessed here.
I do an intersection of both objects:
SPDF_A <- shapefile("SPDF_A")
SPDF_B <- shapefile("SPDF_B")
intersection <- gIntersection(gBuffer(SPDF_A, width=0), gBuffer(SPDF_B, width=0))
The result is:
> intersection
class : SpatialCollections
Setting gBuffer(... , byid=T) or gBuffer(... , byid=F) seems to make no difference.
I use gIntersection and gBuffer(..., width=0) instead of intersect in order to avoid geometry problems (self-intersections).
This is part of a larger loop. I need to get the intersection as SpatialPolygonsDataFrame because it will be saved as shp file in a following step.
writeOGR(intersection, ".", layer=paste0("Int_SPDF_A-SPDF_B"), driver="ESRI Shapefile")
This is not possible from a SpatialCollections object. In order to convert this to a SpatialPolygonsDataFrame I tried:
intersection <- as(intersection ,"SpatialPolygonsDataFrame")
intersection <- SpatialPolygonsDataFrame(intersection)
intersection <- readOGR(intersection, layer = "intersection")
Nothing works. Does anybody have a solution? Thanks a lot!
First of all, according to the documentation SpatialCollections is kind of a container format that can "hold SpatialPoints, SpatialLines, SpatialRings, and SpatialPolygons (without attributes)". If you need the data frame part of your SpatialPolygonsDataFrame ("attribute table" in GIS language), you'll have to work around that somehow. If, on the other hand, you're only interested in the spatial information (the polygons without the data attached to them) try the following:
str(intersection, max.level = 3)
suggests that the @polyobj slot is nothing but a SpatialPolygons object. Hence
mySpoly <- intersection@polyobj
should do the trick and
class(mySpoly)
suggests that we indeed now have a SpatialPolygons.
You need to convert that to a SpatialPolygonsDataFrame before exporting:
mySpolyData <- as(mySpoly, "SpatialPolygonsDataFrame")
writeOGR(mySpolyData, ".", layer=paste0("Int_SPDF_A-SPDF_B"), driver="ESRI Shapefile")
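If the as() coercion to SpatialPolygonsDataFrame is not available in your version of sp, a sketch of an alternative is to attach a minimal attribute table explicitly (the single id column is just a placeholder):
# Attach a one-column attribute table so the geometry can be written as a shapefile;
# match.ID = FALSE because the placeholder data frame has no meaningful row names
mySpolyData <- SpatialPolygonsDataFrame(mySpoly, data = data.frame(id = seq_along(mySpoly@polygons)), match.ID = FALSE)
writeOGR(mySpolyData, ".", layer = "Int_SPDF_A-SPDF_B", driver = "ESRI Shapefile")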

Spatial Polygon sampling error in R

I have a shapefile of 200 counties. How should I sample in order to subset counties from the existing 200? I have tried the R code below:
library(maptools)
TXcounties <- readShapePoly("C:/Users/Rvg296/Downloads/TXCountiesShapeFiles/TXCounties.shp")
idx <- sample(1:250, 25, replace = FALSE)
df.TXcounties <- as.data.frame(TXcounties)
SpatialPolygonsDataFrame(idx, df.TXcounties)
But this is throwing an error like:
Error in SpatialPolygonsDataFrame(idx, df.TXcounties) : trying to get slot "polygons" from an object of a basic class ("integer") with no slots
The problem is that you are using idx, an integer vector, as the first argument for SpatialPolygonsDataFrame(), but this function needs a spatial polygons object as its first argument. In any case, you should be able to do the whole thing a lot more easily with something like this:
result <- TXcounties[idx,]
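One small caution, not part of the original answer: the shapefile holds 200 counties but the question samples from 1:250, so some indices may not exist. Drawing the indices from the actual number of features avoids that, e.g.:
# Sample 25 counties from however many features the shapefile actually contains
idx <- sample(seq_len(nrow(TXcounties)), 25, replace = FALSE)
result <- TXcounties[idx, ]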

Saving R objects, not workspaces

I am completely new to R.
I am trying to save a spatial data frame and a normal data frame within the same object. When I apply the code below, it saves the object as an R workspace; is that normal? I wanted to obtain an .rda file instead. What I specifically want is a single R data file containing those two objects, with the spatial data frame keeping its spatial characteristics. Can someone help me?
##import a text table
mcvfinal<-read.csv("dataCPWithAge.csv",header=TRUE,sep=",",dec=".")
##reading the shapefile
library(rgdal)
polypc1 <- readOGR(".", "CP3poly_Matchingshp")
##saving the two frames into the same object
save(mcvfinal,polypc1,file="polypc.Rdata")
Try:
saveRDS(list(mcvfinal,polypc1),file="polypc.rds")
Load:
foo = readRDS("polypc.rds")
# mcvfinal is foo[[1]]
# polypc1 is foo[[2]]
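Naming the list elements makes the two objects easier to tell apart when you read the file back; a small variation on the same idea:
# Save a named list; the spatial object keeps its class and slots intact
saveRDS(list(mcvfinal = mcvfinal, polypc1 = polypc1), file = "polypc.rds")
foo <- readRDS("polypc.rds")
foo$mcvfinal # the plain data frame
foo$polypc1 # the spatial data frame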

How to convert .shp file into .csv in R?

I want to convert two .shp files into one database that would allow me to draw the maps together.
Also, is there a way to convert .shp files into .csv files? I want to be able to personalize and add some data, which is easier for me to do in .csv format. What I have in mind is to overlay yield data and precipitation data on the maps.
Here are the shapefiles for Morocco, and Western Sahara.
Code to plot the two files:
# This is code for mapping of CGE_Morocco results
# Loading administrative coordinates for Morocco maps
library(sp)
library(maptools)
library(mapdata)
# Loading shape files
Mor <- readShapeSpatial("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/MAR_adm1.shp")
Sah <- readShapeSpatial("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/ESH_adm1.shp")
# Plotting the maps (raw)
png("Morocco.png")
Morocco <- readShapePoly("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/MAR_adm1.shp")
plot(Morocco)
dev.off()
png("WesternSahara.png")
WesternSahara <- readShapePoly("F:/Purdue University/RA_Position/PhD_ResearchandDissert/PhD_Draft/Country-CGE/ESH_adm1.shp")
plot(WesternSahara)
dev.off()
After looking into suggestions from @AriBFriedman and @PaulHiemstra, and subsequently figuring out how to merge .shp files, I have managed to produce the following map using the code and data below (for the .shp data, see the links above).
code:
# Merging Mor and Sah .shp files into one .shp file
library(ggplot2) # needed below for fortify() and ggplot()
MoroccoData <- rbind(Mor@data, Sah@data) # First, 'stack' the attribute list rows using rbind()
MoroccoPolys <- c(Mor@polygons, Sah@polygons) # Next, combine the two polygon lists into a single list using c()
summary(MoroccoData)
summary(MoroccoPolys)
offset <- length(MoroccoPolys) # Next, generate a new polygon ID for the new SpatialPolygonsDataFrame object
for (i in 1: offset)
{
sNew = as.character(i)
MoroccoPolys[[i]]@ID <- sNew
}
ID <- c(as.character(1:length(MoroccoPolys))) # Create an identical ID field and append it to the merged Data component
MoroccoDataWithID <- cbind(ID,MoroccoData)
MoroccoPolysSP <- SpatialPolygons(MoroccoPolys,proj4string=CRS(proj4string(Sah))) # Promote the merged list to a SpatialPolygons data object
Morocco <- SpatialPolygonsDataFrame(MoroccoPolysSP,data = MoroccoDataWithID,match.ID = FALSE) # Combine the merged Data and Polygon components into a new SpatialPolygonsDataFrame.
Morocco@data$id <- rownames(Morocco@data)
Morocco.fort <- fortify(Morocco, region='id')
Morocco.fort <- Morocco.fort[order(Morocco.fort$order), ]
MoroccoMap <- ggplot(data=Morocco.fort, aes(long, lat, group=group)) +
geom_polygon(colour='black',fill='white') +
theme_bw()
Results:
New questions:
1- How do I eliminate the boundary data that cuts through the middle of the map?
2- How do I combine different regions within a .shp file?
Thank you all.
P.S.: The community on stackoverflow.com is wonderful and very helpful, especially toward beginners like me :) Just thought I'd emphasize that.
Once you have loaded your shapefiles into Spatial{Lines/Polygons}DataFrames (classes from the sp-package), you can use the fortify generic function to transform them to flat data.frame format. The specific functions for the fortify generic are included in the ggplot2 package, so you'll need to load that first. A code example:
library(ggplot2)
polygon_dataframe = fortify(polygon_spdf)
where polygon_spdf is a SpatialPolygonsDataFrame. A similar approach works for SpatialLinesDataFrame's.
The difference between my solution and that of @AriBFriedman is that mine includes the x and y coordinates of the polygons/lines, in addition to the data associated with those polygons/lines. I really like visualising my spatial data with the ggplot2 package.
Once you have your data in a normal data.frame you can simply use write.csv to generate a csv file on disk.
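For example (the output file name is only an illustration):
# Write the flattened polygon coordinates and attributes to a CSV on disk
write.csv(polygon_dataframe, file = "polygon_dataframe.csv", row.names = FALSE)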
I think you mean you want the associated data.frame from each?
If so, it can be accessed with the @ slot-access operator. The slot is called data:
write.csv(WesternSahara@data, file = "/home/wherever/myWesternSahara.csv")
Then when you read it back in with read.csv, you can try assigning:
myEdits <- read.csv("/home/wherever/myWesternSahara_modified.csv")
WesternSahara@data <- myEdits
You may need to do some massaging of row names and so forth to get it to accept the new data.frame as valid. I'd probably try to merge the existing data.frame with a csv you read into R, rather than making edits destructively....
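A non-destructive variant along those lines could look like the sketch below; the join column NAME_1 and the yield/precipitation column names are only assumptions about what the attribute table and the edited CSV share:
# Merge the edited CSV into the existing attribute table instead of replacing it;
# match() keeps the added columns aligned with the polygons they describe
myEdits <- read.csv("/home/wherever/myWesternSahara_modified.csv")
idx <- match(WesternSahara@data$NAME_1, myEdits$NAME_1)
WesternSahara@data <- cbind(WesternSahara@data, myEdits[idx, c("yield", "precipitation"), drop = FALSE])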
