Raster calculation in R

I have two files from this website:
https://sedac.ciesin.columbia.edu/data/set/gpw-v4-population-count-rev11/data-download
And a shapefile of China from this website:
https://gadm.org/download_country_v3.html
I would like to compute the difference between the raster population layers, so that I can show a map where each pixel represents the change in population in China.
I used this code:
library(raster)
library(sf)
library(tmap)
p_15 <- terra::rast("gpw-v4-population-count-rev11_2015_2pt5_min_tif/gpw_v4_population_count_rev11_2015_2pt5_min.tif")
p_20 <- terra::rast("gpw-v4-population-count-rev11_2020_2pt5_min_tif/gpw_v4_population_count_rev11_2020_2pt5_min.tif")
CHN <- sf::read_sf("gadm36_CHN_shp/gadm36_CHN_1.shp")
CHN <- sf::st_transform(CHN, crs = "epsg:4490") |> terra::vect()
p_15 <- terra::project(p_15, "EPSG:4490")
p_20 <- terra::project(p_20, "EPSG:4490")
p_15_crop <- terra::crop(p_15, CHN)
p_20_crop <- terra::crop(p_20, CHN)
p_15_mask <- mask(p_15_crop, CHN)
p_20_mask <- mask(p_20_crop, CHN)
The code above works fine.
Now I used overlay() from the raster package to calculate the difference between the population layers, to show the change in each pixel.
I used this code:
diff1520 <- overlay(p_15_mask, p_20_mask, fun=function(x,y){return(y-x)})
But I got the error message "method not applicable". What is wrong with the code?
By the way, I also tried the geodata package, but it did not solve my problem.

Simply subtracting the objects will work. But if you still want to apply a function to a SpatRaster, you can use terra::lapp, which is equivalent to raster::overlay. The main difference is that you have to combine the layers first.
library(terra)
# combine the two layers into one multi-layer SpatRaster
p_mask <- c(p_15_mask, p_20_mask)
# lapp() applies the function cell-wise across the layers
diff1520 <- lapp(p_mask, fun = function(x, y) { return(y - x) })
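For a quick check of the result (a sketch, not part of the answer above; global() and plot() are standard terra functions):
diff1520                                       # print extent, resolution and value range
terra::global(diff1520, "sum", na.rm = TRUE)   # total population change inside the mask
terra::plot(diff1520)                          # quick base plot of the change raster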

It's probably because you created your masks with terra. So the masks are SpatRaster objects, and you tried to use the overlay() function from raster, which only works with Raster* objects from the raster package.
You can do what you want with
diff1520 <- p_20_mask - p_15_mask
That's the basic terra way.
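If you then want to map the per-pixel change with tmap, as in your original goal, something like the following should work (a sketch assuming tmap 3.3 or later, which accepts SpatRaster and SpatVector objects; the style and palette choices are just illustrative):
library(tmap)
tm_shape(diff1520) +
  tm_raster(style = "cont", palette = "-RdBu", title = "Population change 2015-2020") +
  tm_shape(CHN) +
  tm_borders()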

Related

tmap is plotting a different legend (range of values?) for a cropped rasterlayer compared to original raster

I am extremely new to working with spatial data and so most of what I'm about to say is me trying to speak a foreign language. Right now I am trying to learn how to do this all in R (I am slightly more capable with this data in QGIS but for this solution, I am looking for R only).
My research involves ecological data in Pennsylvania (PA) and so I am playing around with cropping the US NLCD dataset to PA. I have a raster layer for the NLCD and a shapefile for the boundary of Pennsylvania. I am able to successfully crop the larger US raster down to PA as follows:
library(raster)
library(rgdal)
pabound <- readOGR(dsn = "...",
                   layer = "PAbound")
nlcdRast <- raster(".../NLCD_2016_Land_Cover_L48_20190424.img")
pabound <- spTransform(pabound,CRS(proj4string(nlcdRast)))
PAnlcd <- raster::crop(nlcdRast,pabound)
If I run the simple plot command for both nlcdRast and PAnlcd (i.e., plot(nlcdRast)), they maintain the same color scheme. But when I run it through tmap, it seems to treat the cropped data differently, and I am not exactly sure how to figure this out. Please see the plots below:
library(tmap)
tm_shape(nlcdRast) +
  tm_raster()
And then when I plot the cropped version in tmap:
tm_shape(PAnlcd) +
  tm_raster()
As you can see, it is not simply the color palette that is changing (I am confident I could figure that out) but the real problem is I'm losing the important information as seen in the legend. Whereas the full plot actually shows the categorical values for the raster NLCD, the cropped version now seems to show just some unknown numerical range. Even though it looks bad at the moment, I'd like to have the same legend/information as seen in the full US map.
I apologize for not having a more reproducible example but I am completely lost on what is happening here so I can't quite replicate it. I suppose right now I'm just looking for where to look to try and figure out what changed. Thank you in advance.
Cropping is changing the way the pixels are represented. To maintain your values use the stars package (also note I'm using the sf package for the shapefile):
library(stars)
library(sf)
# load in NLCD
nlcdRast <- read_stars(".../NLCD_2016_Land_Cover_L48_20190424.img")
# read in study area
pabound <- st_read(dsn="...", layer="PAbound")
# reproject pabound to match the NLCD CRS
pabound <- st_transform(pabound, crs = st_crs(nlcdRast))
# now crop
panlcd <- st_crop(nlcdRast, pabound)
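The cropped stars object can then be passed straight to tmap, which should keep the NLCD land-cover classes in the legend (a sketch assuming tmap 3 or later, which accepts stars objects):
library(tmap)
tm_shape(panlcd) +
  tm_raster()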

unwanted subgeometries when converting raster to polygons

I am converting many rasters to polygons. But in quite a few cases, I am seeing unexpected subgeometries, and I can't seem to get rid of them.
This is with R v3.3.3 and raster package v2.5-8.
Here is an example that should reproduce the problem I am having.
You can download the raster that I use here.
# first, read in raster and coarsen to something more manageable
library(raster)
library(rgeos)
env <- raster('adefi.tif')
env2 <-aggregate(env, 8)
# Reclassify such that cells are either 1 or NA
env2[!is.na(env2)] <- 1
# this is what the raster now looks like:
plot(env2)
# Now I convert to polygon, choosing to dissolve
p <- rasterToPolygons(env2, dissolve=T)
plot(p)
# I find that I can't get rid of these subgeometries
p <- gUnaryUnion(p) # identical result
gIsValid(p) # returns TRUE
I'm not sure where the problem is... Is it in how the raster package converts to cell polygons? Or is it how the rgeos package dissolves those cell polygons together?
Is there a work-around?
It looks like a projection issue. This works for me:
library(raster)
library(rgeos)
env <- raster(file.path(fp, "adefi.tif"))  # fp is the folder containing adefi.tif
env2 <- aggregate(env, 8)
env2[!is.na(env2)] <- 1
# Project Raster
proj_env2 <- projectRaster(env2, crs = CRS("+init=epsg:3577"))
p <- rasterToPolygons(proj_env2, dissolve = T)
plot(p)
Not sure why the reprojection is needed, since EPSG:3577 looks to be the same as the original projection, but I usually confirm the projection with proj4string() or spTransform() to make sure everything lines up.
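Since rgeos (and rgdal) are now retired, a hedged alternative is to do the dissolve with sf instead. A minimal sketch of the same workflow, assuming the same adefi.tif file:
library(raster)
library(sf)
env <- raster("adefi.tif")
env2 <- aggregate(env, 8)
env2[!is.na(env2)] <- 1
# convert the cell polygons, repair any invalid geometry, then dissolve with st_union()
p_sf <- st_as_sf(rasterToPolygons(env2, dissolve = FALSE))
p_dis <- st_union(st_make_valid(p_sf))
plot(p_dis)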

R: Gstat universal cokriging resolution

I am trying to do universal cokriging in R with the gstat package. I have a script that I was helped with, but now I'm stuck and can't ask for assistance from the original source.
The problem is that I can't change the output resolution of the cokriged data. I would like to import the interpolated map into ArcMap, but point-to-raster leaves me with a very low resolution.
My script is as follows:
library(raster)
library(gstat)
library(sp)
library(rgdal)
library(FitAR)
Loading my dataset, which contains coordinates and sampled values:
kova<-read.table("katvus_point_modif3.txt",sep=" ",header=T)
coordinates(kova)=~POINT_X+POINT_Y
Loading depth values at the same coordinates as above; this is my covariate:
Sygavus<-read.table("sygavus_point_cokrig.txt",sep=" ",header=T)
coordinates(Sygavus)=~POINT_X+POINT_Y
overlay <- over(kova,Sygavus)
kova$Sygavus <- overlay$Sygavus
This is supposed to set the boundary for my interpolation; the file is a shapefile exported from ArcMap:
border <- shapefile("area_2014.shp")
projection(kova)=projection(border)
This is supposed to create a grid for cokriging, and res= should let me specify the output resolution, but no matter what number I use, the output does not change.
grid <- spsample(border,type="regular",res=25)
I remove overlapping points:
zero <- zerodist(kova)
kova <- kova[-zero[,2],]
I load in the depth covariate raster file. This is a depth raster exported from ArcMap in ASCII form:
depth <- raster("htp_depth_covar.asc")
projection(depth)=projection(border)
overlay <- extract(depth,kova)
kova$depth <- overlay
I remove NA values from the overlain depth values (these values should be the same as the previously loaded depth covariate table at the respective coordinates, but if I leave that part out, the script stops working):
kova <- kova[!is.na(kova$depth),]
kova.gstat <- gstat(id="Kova",formula=kova~depth,data=kova)
kova.gstat <- gstat(kova.gstat,id="Sygavus",formula=Sygavus~depth,data=kova)
var.kova <- variogram(kova.gstat)
plot(var.kova)
kova.gstat <- gstat(kova.gstat,id=c("Kova","Sygavus"),model=vgm(psill=cov(kova$kova,kova$Sygavus),model="Mat",range=12000,nugget=0))
kova.gstat <- fit.lmc(var.kova,kova.gstat,model=vgm(psill=cov(kova$kova,kova$Sygavus),model="Mat",range=12000,nugget=0))
plot(var.kova,kova.gstat$model)
overlay <- extract(depth,grid)
grid <- as.data.frame(grid)
grid$depth <- overlay
coordinates(grid)=~x1+x2
projection(grid)=projection(border)
krige <- predict.gstat(kova.gstat,grid)
spplot(krige,c("Kova.pred"))
write.table(krige, "kova.raster1.ck.csv", sep=";", dec=",", row.names=F)
Any help in understanding the gstat cokriging and the script overall would be greatly appreciated!
Because you don't provide a reproducible example, I can only guess, but I think that spsample ignores the res=25 argument. Try n=1000 instead and then increase that value to get higher resolution.
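If you want to control the grid spacing directly, spsample also takes a cellsize argument (in map units), which is probably what res= was intended to be. A hedged sketch (you would still extract the depth covariate onto this grid as in your script):
grid <- spsample(border, n = 1000, type = "regular", cellsize = 25)  # for regular grids the spacing should come from cellsize
gridded(grid) <- TRUE   # promote to SpatialPixels so the kriging output is a regular grid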

How to calculate geo-distance from a polygon?

I have a shapefile with 50+ different polygonal shapes (representing 50+ different regions) and 10,000+ data points that are supposed to be present in one of the regions. The thing is, the 10,000+ points are already coded with a region they are supposed to be in, and I want to figure out how far they are from this coded region in geo-spatial distance.
My current approach (code below), which involves converting the shapefiles to owin objects from the spatstat package and using distfun, gets me distances in lat/long Euclidean space. But I would like geospatial distances (eventually converted to km). Where should I go next?
#basically cribbed from http://cran.r-project.org/web/packages/spatstat/vignettes/shapefiles.pdf (page 9)
shp <- readShapeSpatial("myShapeFile.shp", proj4string=CRS("+proj=longlat +datum=WGS84"))
regions <- lapply(slot(shp, "polygons"), function(x) SpatialPolygons(list(x)))
windows <- lapply(regions, as.owin)
# need to convert this to geo distance
distance_from_region <- function(regionData, regionName) {
  w <- windows[[regionName]]
  regionData$dists <- distfun(w)(regionData$lat, regionData$long)
  regionData
}
I'd project the data to a Euclidean (or near-Euclidean) coordinate system; this is feasible unless you are spanning a large chunk of the globe. Use spTransform from maptools or sp or rgdal (I forget which) and convert to a UTM zone near your data.
You also might do better with package rgeos and the gDistance function:
gDistance by default returns the cartesian minimum distance between the two geometries in the units of the current projection.
If your data is over a large chunk of globe then... tricky... 42...
Barry
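A hedged, more recent alternative is sf: with unprojected (EPSG:4326) data, st_distance() computes geodesic distances in metres, so no projection step is needed. A minimal sketch, where the regionName, long and lat columns and the NAME field are assumptions about your data:
library(sf)
regions <- st_read("myShapeFile.shp")                              # polygons in WGS84
pts <- st_as_sf(regionData, coords = c("long", "lat"), crs = 4326) # point data frame from the question
# pair each point with the polygon it is coded to ('NAME' is an assumed field name)
target <- regions[match(regionData$regionName, regions$NAME), ]
dist_km <- as.numeric(st_distance(pts, target, by_element = TRUE)) / 1000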

Intersecting Points and Polygons in R

I am working with shapefiles in R; one is point.shp, the other is polygon.shp.
Now I would like to intersect the points with the polygon, meaning that all the values from the polygon should be attached to the attribute table of point.shp.
I tried overlay() and spRbind in package sp, but nothing did what I expected them to do.
Could anyone give me a hint?
With the new sf package this is now fast and easy:
library(sf)
out <- st_intersection(points, poly)
Additional options
If you do not want all fields from the polygon added to the point feature, just call dplyr::select() on the polygon feature before:
library(magrittr)
library(dplyr)
library(sf)
poly %>%
  select(column-name1, column-name2, etc.) -> poly
out <- st_intersection(points, poly)
If you encounter issues, make sure that your polygon is valid:
st_is_valid(poly)
If you see some FALSE outputs here, try to make it valid:
poly <- st_make_valid(poly)
Note that these 'valid' functions depend on a sf installation compiled with liblwgeom.
If you do overlay(pts, polys) where pts is a SpatialPointsDataFrame object and polys is a SpatialPolygonsDataFrame object then you get back a vector the same length as the points giving the row of the polygons data frame. So all you then need to do to combine the polygon data onto the points data frame is:
o = overlay(pts, polys)
pts@data = cbind(pts@data, polys[o,])
HOWEVER! If any of your points fall outside all your polygons, then overlay returns an NA, which will cause polys[o,] to fail, so either make sure all your points are inside polygons or you'll have to think of another way to assign values for points outside the polygon...
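For that last case, a hedged sf-based sketch: st_join() with a left join keeps every point and fills NA in the polygon fields for points that fall outside all polygons:
library(sf)
out <- st_join(points, poly, join = st_intersects, left = TRUE)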
You can do this in one line with point.in.poly from the spatialEco package.
library(spatialEco)
new_shape <- point.in.poly(pts, polys)
From the documentation: point.in.poly "intersects point and polygon feature classes and adds polygon attributes to points".
