Convert SpatialCollections to SpatialPolygonsDataFrame in R

I am struggling to convert an object of class SpatialCollections to a SpatialPolygonsDataFrame object.
My input files are both shapefiles and SpatialPolygonsDataFrame objects. They can be accessed here.
I do an intersection of both objects:
library(raster) # for shapefile()
library(rgeos)  # for gBuffer() and gIntersection()
SPDF_A <- shapefile("SPDF_A")
SPDF_B <- shapefile("SPDF_B")
intersection <- gIntersection(gBuffer(SPDF_A, width=0), gBuffer(SPDF_B, width=0))
The result is:
> intersection
class : SpatialCollections
Setting gBuffer(... , byid=T) or gBuffer(... , byid=F) seems to make no difference.
I use gIntersection and gBuffer(..., width=0) instead of intersect in order to avoid geometry problems (self-intersections).
This is part of a larger loop. I need to get the intersection as SpatialPolygonsDataFrame because it will be saved as shp file in a following step.
writeOGR(intersection, ".", layer=paste0("Int_SPDF_A-SPDF_B"), driver="ESRI Shapefile")
This is not possible from a SpatialCollections object. In order to convert this to a SpatialPolygonsDataFrame I tried:
intersection <- as(intersection ,"SpatialPolygonsDataFrame")
intersection <- SpatialPolygonsDataFrame(intersection)
intersection <- readOGR(intersection, layer = "intersection")
Nothing works. Does anybody have a solution? Thanks a lot!

First of all, according to the documentation SpatialCollections is kind of a container format that can "hold SpatialPoints, SpatialLines, SpatialRings, and SpatialPolygons (without attributes)". If you need the data frame part of your SpatialPolygonsDataFrame (the "attribute table" in GIS language), you'll have to work around that somehow. If, on the other hand, you're only interested in the spatial information (the polygons without the data attached to them), try the following:
str(intersection, max.level = 3)
suggests that the @polyobj slot is nothing but a SpatialPolygons object. Hence
mySpoly <- intersection@polyobj
should do the trick and
class(mySpoly)
suggests that we indeed now have a SpatialPolygons.
You need to convert that to a SpatialPolygonsDataFrame before exporting:
mySpolyData <- as(mySpoly, "SpatialPolygonsDataFrame")
writeOGR(mySpolyData, ".", layer=paste0("Int_SPDF_A-SPDF_B"), driver="ESRI Shapefile")
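If the direct as() coercion is not available in your version of sp, a minimal sketch of building the data slot by hand (the ID column is purely illustrative):
# build a data frame whose row names match the polygon IDs,
# as required by SpatialPolygonsDataFrame()
ids <- sapply(slot(mySpoly, "polygons"), slot, "ID")
mySpolyData <- SpatialPolygonsDataFrame(mySpoly, data = data.frame(ID = ids, row.names = ids))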

Related

Points not added to the map plot

I need to add some points to the map using the simple points() function. The issue is that the points are not added to the map. It's a simple command; I followed a tutorial where adding points to the map works this way, but not in my case. The plot() call draws the Texas choropleth properly, but the next line (points()) doesn't add points to the map at all:
library(rgdal)
library(rgeos)
library(sp)
companies <- read.csv('geoloc_data_comp.csv', header = T, dec = ',', sep = ';')
states <- readOGR('.', 'states')
plot(states[states@data$stat_name == 'texas',])
points(companies$coords.x1, companies$coords.x2, pch = 21)
First, you should avoid rgeos/rgdal because they will stop being maintained. See: https://github.com/r-spatial/evolution
sf is replacing them:
library(sp)
library(sf)
library(spData) #used because I wanted US states
# list of data in spData you have one with US states
data(package = "spData")
If you want to read a shapefile or another GIS format, check sf::st_read() (instead of readOGR()).
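For example, a minimal sketch, assuming the shapefile from the question sits in the working directory as states.shp:
states <- sf::st_read("states.shp") # sf equivalent of readOGR('.', 'states')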
# one way with sf
plot(us_states$geometry[us_states$NAME == "Texas"])
# if you want do use the sp way
us_sp <- as(us_states, "Spatial") # convert to sp
plot(us_sp[us_sp@data$NAME == "Texas",])
With sf you have the geometry in one column (see "geometry") instead of an S4 object with nested lists (see @data and @polygons).
Before adding points we need to check which CRS our data are in. If you are not sure about coordinate reference systems, I like this website: https://ihatecoordinatesystems.com/
You also have information in the us_states documentation: https://www.rdocumentation.org/packages/spData/versions/2.0.1/topics/us_states
Then you can use:
sp::proj4string(us_sp)
sf::st_crs(us_states)
# This is EPSG 4269 or NAD83
If you want to use points(), the points need to be in the same coordinate system as the map (I suspect this explains your trouble, i.e. different CRSs).
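For the companies data frame from the question, a hedged sketch of getting the points into the same CRS (this assumes the CSV stores plain long/lat in WGS84, EPSG 4326, and keeps the coords.x1/coords.x2 column names from the question):
companies_sf <- sf::st_as_sf(companies,
                             coords = c("coords.x1", "coords.x2"),
                             crs = 4326)              # assumed input CRS
companies_nad83 <- sf::st_transform(companies_sf, 4269) # match the states layer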
You didn't provide data points so I produced some:
library(osmdata)
# this will just download nodes matching the key/value place=city
some_city_in_texas <- osmdata::opq(osmdata::getbb("Texas US"),
                                   nodes_only = TRUE) %>%
  osmdata::add_osm_feature(key = "place", value = "city") %>%
  osmdata::osmdata_sf() # keep them in sf format
# osmdata_sp() also exists
The osmdata class is a bit complicated, but here you just need to know that some_city_in_texas$osm_points provides us with points (to test points()). Now we can check their CRS:
sf::st_crs(some_city_in_texas$osm_points)
As you can see, we are in another CRS, so we need to transform it (you will probably need to do the same with your points).
city_in_texas <- sf::st_transform(some_city_in_texas$osm_points, 4269)
sf uses the simple features standard to store locations, while points() wants two vectors, x and y. You should also check the axis order (a common cause of errors): R uses x/y (long/lat), not lat/long.
Here we convert city_in_texas to plain coordinates (if you need to do the reverse, i.e. convert a data frame with X/Y columns into an sf object, look at sf::st_as_sf()).
coords_city <- sf::st_coordinates(city_in_texas)
Finally this works fine now:
plot(us_states$geometry[us_states$NAME == "Texas"])
points(coords_city, pch = 21)
Good resources are https://r-spatial.org/ and https://geocompr.robinlovelace.net/

Create a vector footprint of a multi-part raster [R]

I've read a raster into my R session, using this code:
raster <- stack("raster.tif")
and now I'd like to make a simple feature (sf) object representing the outline of that raster. I can't use a bounding box because the raster is multi-part, so the bounding box would be much larger than the raster. So the footprint also needs to be a multi-part feature (an sf multipolygon).
I'd appreciate any help with this. Thanks!
Mark
If you want each layer in the stack, you need to loop over them with lapply(). This will return a list of polygon layers. You then need to convert each component of the list to an sf multipolygon. Lastly, you need to concatenate the features (note that c is the c() function). shp should be your multipolygon. You may not want to dissolve the polygons; you didn't really make it clear what you wanted.
library(sf) # for st_as_sf()
a <- lapply(as.list(raster), rasterToPolygons, dissolve=TRUE)
b <- lapply(a, st_as_sf) # convert each to sf
shp <- Reduce(c, b) # combine all polygons to one
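If the goal is a single multipart footprint (as in the question) rather than one feature per cell value, a hedged follow-up, reusing the list b from above, is to union all geometries into one MULTIPOLYGON:
# collect all geometries into one sfc and dissolve them
geoms <- do.call(c, lapply(b, st_geometry))
footprint <- st_cast(st_union(geoms), "MULTIPOLYGON")
st_write(footprint, "raster_footprint.gpkg") # output name is illustrative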
As a side note, it's probably not great to use raster as a variable name because the raster package has a function called raster.

How to subset a large SpatialPolygonsDataFrame

I want to calculate the area of a wildfire. I tried this by subtracting the NDVI calculated from a Landsat image before the fire from the NDVI of an image after the fire and seeing where the NDVI was reduced. However, the NDVI changed not only in the burned areas; there are also many random differences. I used rasterToPolygons to create a large SpatialPolygonsDataFrame containing all areas where NDVI after - NDVI before < 0.
Now I want to remove all the polygons with an area below a certain threshold value. However, I cannot find a way to subset the large SpatialPolygonsDataFrame.
I found an example on how to get a list of the polygons with an area above the threshold (where burned_poly is the large SpatialPolygonsDataFrame):
pols <- lapply(burned_poly@polygons, slot, "Polygons")
pols_areas <- lapply(pols[[2]], function(x) slot(x, "area"))
However, accessing the large SpatialPolygonsDataFrame like this
bp <- burned_poly@polygons[[1]]@Polygons[pols_areas >= 9000]
gives me a list which I am currently unable to coerce into a SpatialPolygonsDataFrame.
Can someone tell me how to do this last step (I have trouble with the Sr argument of the SpatialPolygonsDataFrame() function, which I don't understand), or maybe there is a different and better approach to extract the fire extent as a polygon?
Alright, I think I have found a way thanks to Orlando's suggestion to use sf.
I transformed my large SpatialPolygonsDataFrame object to an sf object via st_as_sf(), which gave me a MULTIPOLYGON. This sf MULTIPOLYGON object can be split into single polygons using st_cast(), and the resulting object can be subset like a data.frame.
bp_sf <- st_as_sf(burned_poly)
bps_sf <- st_cast(bp_sf, "POLYGON")
BpSf <- bps_sf[as.numeric(st_area(bps_sf))>=10000,]
If you are using the simple features sf library you can use functions from the tidyverse. Filtering data is a matter of using the filter() function. Notice that you can convert your objects to sf using st_as_sf(). See: https://r-spatial.github.io/sf/reference/st_as_sf.html and How to filter an R simple features collection using sf methods like st_intersects()?
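A minimal sketch of that tidyverse route, reusing burned_poly and the 10000 m² threshold from the answer above (the area_m2 column name is illustrative):
library(sf)
library(dplyr)
bps_sf <- st_cast(st_as_sf(burned_poly), "POLYGON")
bps_sf$area_m2 <- as.numeric(st_area(bps_sf))
big_polys <- filter(bps_sf, area_m2 >= 10000)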

Extract raster by a list of SpatialPolygonsDataFrame objects in R

I am trying to extract summed raster cell values from a single big file for various SpatialPolygonsDataFrame (SPDF) objects stored in a list in R, and then add the extracted values to the SPDF objects' attribute tables. I would like to iterate this process, and have no idea how to do so. I have found an efficient solution for multiple polygons stored in a single SPDF object (see: https://gis.stackexchange.com/questions/130522/increasing-speed-of-crop-mask-extract-raster-by-many-polygons-in-r), but do not know how to apply the crop > mask > extract procedure to a LIST of SPDF objects, each containing multiple polygons. Here is a reproducible example:
library(maptools) ## For wrld_simpl
library(raster)
## Example SpatialPolygonsDataFrame
data(wrld_simpl) #polygon of world countries
bound1 <- wrld_simpl[1:25,] #country subset 1
bound2 <- wrld_simpl[26:36,] #subset 2
## Example RasterLayer
c <- raster(nrow=2e3, ncol=2e3, crs=proj4string(wrld_simpl), xmn=-180,
            xmx=180, ymn=-90, ymx=90)
c[] <- 1:length(c)
#plot, so you can see it
plot(c)
plot(bound, add=TRUE)
plot(bound2, add=TRUE, col=3)
#make list of two SPDF objects
boundl<-list()
boundl[[1]]<-bound1
boundl[[2]]<-bound2
#confirm creation of SPDF list
boundl
The following is what I would like to run for the entire list, in a for-loop format. For a single SPDF from the list, the following series of functions seems to work:
clip1 <- crop(c, extent(boundl[[1]])) #crops the raster to the extent of the polygon, I do this first because it speeds the mask up
clip2 <- mask(clip1, boundl[[1]]) #crops the raster to the polygon boundary
extract_clip <- extract(clip2, boundl[[1]], fun=sum)
#add column + extracted raster values to polygon dataframe
boundl[[1]]@data["newcolumn"] <- extract_clip
But when I try to isolate the first function for the SPDF list (raster::crop), it does not return a raster object:
crop1 <- crop(c, extent(boundl[[1]])) #correctly returns object class 'RasterLayer'
cropl <- lapply(boundl, crop, c, extent(boundl)) #incorrectly returns objects of class 'SpatialPolygonsDataFrame'
When I try to isolate the mask function for the SPDF list (raster::mask), it returns an error:
maskl <- lapply(boundl, mask, c)
#Error in (function (classes, fdef, mtable) : unable to find an inherited method for function ‘mask’ for signature ‘"SpatialPolygonsDataFrame", "RasterLayer"’
I would like to correct these errors, and efficiently iterate the entire procedure within a single loop (i.e., crop > mask > extract > add extracted values to SPDF attribute tables). I am really new to R and don't know where to go from here. Please help!
One approach is to take what is working and simply put the desired "crop -> mask -> extract -> add" into a for loop:
for (i in seq_along(boundl)) {
  clip1 <- crop(c, extent(boundl[[i]]))
  clip2 <- mask(clip1, boundl[[i]])
  extract_clip <- extract(clip2, boundl[[i]], fun=sum)
  boundl[[i]]@data["newcolumn"] <- extract_clip
}
One can speed up the loop with parallel execution, e.g., with the R package foreach. In contrast, the speed gain from using lapply() instead of the for loop will be small.
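A minimal sketch of that foreach route, assuming the doParallel backend and the objects from the loop above:
library(foreach)
library(doParallel)
registerDoParallel(cores = 2) # number of workers is illustrative
extracted <- foreach(i = seq_along(boundl)) %dopar% {
  clip1 <- crop(c, extent(boundl[[i]]))
  clip2 <- mask(clip1, boundl[[i]])
  extract(clip2, boundl[[i]], fun = sum)
}
for (i in seq_along(boundl)) boundl[[i]]@data["newcolumn"] <- extracted[[i]]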
Why the error occurs:
cropl <- lapply(boundl, crop, c, extent(boundl))
applies the function crop() to each element of the list boundl, passing each list element as crop()'s first argument. So the operation actually performed is
tmp <- crop(boundl[[1]], c)
## test if equal to first element
all.equal(cropl[[1]], tmp)
[1] TRUE
To get the desired result use
cropl <- lapply(boundl, function(x, c) crop(c, extent(x)), c=c)
## test if the first element is as expected
all.equal(cropl[[1]], crop(c, extent(boundl[[1]])))
[1] TRUE
Note:
Using c to denote an R object is a bad choice, because it can easily be confused with c().

export khrud object from kernelUD to raster

In R, how can I export a khrud object from function kernelUD in package adehabitat to a raster file (geoTiff)?
I tried following this thread (R: how to create raster layer from an estUDm object) using the code here:
writeRaster(raster(as(udbis1,"SpatialPixelsDataFrame")), "udbis1.tif")
where udbis1 is a khrud object, but I get "Error in as(udbis1, "SpatialPixelsDataFrame") : no method or default for coercing “khrud” to “SpatialPixelsDataFrame."
I think the issue may be that the old thread predates an update to the adehabitat package that changed the data format from estUD to khrud. Maybe?
You do not provide a reproducible example. The following works for me:
library(adehabitatHR)
library(raster)
data(puechabonsp)
loc <- puechabonsp$relocs
ud <- kernelUD(loc[, 1])
r <- raster(as(ud[[1]], "SpatialPixelsDataFrame"))
writeRaster(r, filename = file.path(tempdir(), "ud1.tif"))
AdehabitatHR solutions work well for data that are in the required format or when using multiple animals. But when you want to create a KDE from data organized differently, or for only one animal, it can be frustrating. For some reason @johaness' answer doesn't work for my case, so here is an alternative solution that avoids the headache of digging into adehabitatHR's innards.
library(adehabitatHR)
library(raster)
# Recreating an example for only one animal
# with a basic xy dataset like one would get from tracking
data(puechabonsp)
loc <- puechabonsp$relocs
loc <- as.data.frame(loc)
loc <- loc[loc$Name == "Brock", ]
coordinates(loc) <- ~X+Y
ud <- kernelUD(loc)
# Extract the UD values and coordinates into a data frame
udval <- data.frame("value" = ud$ud, "lon" = ud@coords[, 1], "lat" = ud@coords[, 2])
coordinates(udval) <- ~lon+lat
# coerce to SpatialPixelsDataFrame
gridded(udval) <- TRUE
# coerce to raster
udr <- raster(udval)
plot(udr)
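To get the GeoTIFF asked for in the question, a hedged final step (the file name is illustrative; the format is inferred from the .tif extension):
writeRaster(udr, filename = "ud_brock.tif", overwrite = TRUE)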
