I am trying to mask a raster to a shapefile boundary, but I am getting an error. How can I correctly perform this mask?
The raw data can be found here, in the file "data_for_question.txt." It is formatted so that the text can be copied and pasted (from the web app) directly into an R session to generate a data frame. Alternatively, if one doesn't want to generate the data, the output raster (example_raster.tif) and the shapefile (field_boundary.shp) can also be found at the same link.
Here is what I have tried:
#Import necessary libraries
library(pacman)
p_load(sf, spatstat, maptools, tidyverse, ggplot2, gstat, sp, rgdal, raster, spdep)
#Read shapefile
shp <- st_read("field_boundary.shp")
#Build a SpatialPointsDataFrame from the generated data ('data' and its coordinates
#'coords' come from the pasted "data_for_question.txt" step) and set the lon/lat CRS
data_sp <- SpatialPointsDataFrame(coords,
                                  data[, c("OM", "data2")],
                                  proj4string = CRS('+proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs'))
#Perform an IDW interpolation:
grd <- SpatialPixels(SpatialPoints(makegrid(data_sp, n=10000)), proj4string = proj4string(data_sp)) #Generate grid for interpolation
plot(grd)
interp <- idw(formula = OM ~ 1, data_sp, grd, idp = 0.5, nmax = 12)
plot(interp) #Makes for a very pretty picture!
#Convert to raster
rast <- raster(interp)
plot(rast)
shp <- st_transform(shp, crs(rast))
#Crop and mask the raster
crop_rast <- crop(rast, shp)
crop_om <- mask(crop_rast, mask = shp)
The error occurs here:
Error in h(simpleError(msg, call)) :
error in evaluating the argument 'x' in selecting a method for function 'addAttrToGeom': sp supports Z dimension only for POINT and MULTIPOINT.
use `st_zm(...)` to coerce to XY dimensions
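For reference, the error message itself points at a likely fix: dropping the Z dimension from the polygon layer with st_zm() before cropping and masking. A minimal sketch, reusing the objects above:
#Drop the Z/M dimension that sp cannot handle for polygons, then redo the crop and mask
shp <- st_zm(shp)
crop_rast <- crop(rast, shp)
crop_om <- mask(crop_rast, mask = shp)
plot(crop_om)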
I need to calculate the magnitude-per-unit area of polylines that fall within a radius around each cell. Essentially I need to calculate a km/km2 road density within a 500m pixel search radius. ArcMap has a quick and easy tool that handles this, but I need a pure R solution.
Here is a link on how line density works: http://desktop.arcgis.com/en/arcmap/10.3/tools/spatial-analyst-toolbox/how-line-density-works.htm
And this is how to use it in a python (arcpy) script: http://desktop.arcgis.com/en/arcmap/10.3/tools/spatial-analyst-toolbox/line-density.htm
I currently use a roundabout approach with the raster::focal function: I calculate a density of burned-in road features and then convert the km2/km2 output to km/km2.
#Import libraries
library(raster)
library(rgdal)
library(gdalUtils)
#Read-in an already created raster mask (cells are all set to 0)
mask <- raster("x://path to raster mask...")
#Make a copy of the mask file to burn features into, keeping the original untouched
#(file.copy expects file paths, so pass the mask's path rather than the RasterLayer)
roads_mask <- file.copy("x://path to raster mask...", "x://output path ...//roads.tif")
#Read-in road features (shapefile format)
roads_sldf <- readOGR("x://path to shapefile" , "roads")
#Rasterize spatial lines data frame ie. burn road features into mask
#Where road features get a value of 1, mask extent gets a value of 0
roads_raster <- gdalUtils::gdal_rasterize(src_datasource = roads_sldf,
dst_filename = "x://output path ...//roads.tif", b = 1,
burn = 1, l = "roads", output_Raster = TRUE)
#Run a 1km circular radius density function (be mindful of edge effects)
weight <- raster::focalWeight(roads_raster,1000,type = "circle")
rdDensity_1km <- raster::focal(roads_raster, weight, fun=sum, filename = '',
                               na.rm=TRUE, pad=TRUE, NAonly=FALSE, overwrite=TRUE)
#Convert km2/km2 road density to km/km2
#Set up the moving window
weight <- raster::focalWeight(roads_raster,1000,type = "circle")
#Count how many records in each column of the moving window are > 0
columnCount <- apply(weight,2,function(x) sum(x > 0))
#Get the sum of the column count
number_of_cells <- sum(columnCount)
#Multiply the km2/km2 density by the number of cells in the moving window
step1 <- rdDensity_1km * number_of_cells
#Rescale with respect to the cell size (30 m) and the area of the 1 km circle
final_rdDensity <- (step1*0.03)/3.14159265
#Write out final km/km2 road density raster
writeRaster(final_rdDensity,"X://path to output...", datatype = 'FLT4S', overwrite = TRUE)
After some more research I think I may be able to use a kernel density function; however, I don't want the smoothing it applies, and the output is an 'im' object that I would still need to write out as a 'tif'.
#Import libraries
library(spatstat)
library(maptools)  #needed for as.psp() on SpatialLines objects
library(rgdal)
#Read-in road features (shapefile format)
roads_sldf <- readOGR("x://path to shapefile" , "roads")
#Convert roads spatial lines data frame to psp object
psp_roads <- as.psp(roads_sldf)
#Apply kernel density, however this is where I am unsure of the arguments
road_density <- spatstat::density.psp(psp_roads, sigma = 0.01, eps = 500)
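For the second concern (writing the 'im' output out as a 'tif'), a hedged sketch; it relies on raster() accepting 'im' objects, as the pixellate answer further down reports, and the output path is a placeholder:
#Convert the 'im' density surface to a RasterLayer and write it out as a GeoTIFF
library(raster)
road_density_rast <- raster(road_density)
writeRaster(road_density_rast, "x://output path ...//road_density.tif",
            datatype = 'FLT4S', overwrite = TRUE)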
Cheers.
See this question: https://gis.stackexchange.com/questions/138861/calculating-road-density-in-r-using-kernel-density
I tried to mark this as a duplicate, but that doesn't work because the other question is on GIS Stack Exchange.
The short answer is to use spatstat.geom::pixellate().
I also needed spatstat.geom::as.psp(sf::st_geometry(x)) to convert an sf lines object to the required format, and maptools::as.im.RasterLayer(r) to convert a raster. I was able to convert the result back to a RasterLayer with raster::raster(pix_res).
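Putting those pieces together, a hedged sketch of that workflow (the paths and the object names roads_sf and r are illustrative, and passing the converted raster to pixellate() via as.owin() is my assumption):
library(sf)
library(raster)
library(maptools)
library(spatstat.geom)
#Illustrative inputs: a line shapefile and a template raster in the same CRS
roads_sf <- st_read("x://path to shapefile//roads.shp")
r <- raster("x://path to raster mask...")
#Convert the sf lines to a psp and the raster to a spatstat im
roads_psp <- spatstat.geom::as.psp(sf::st_geometry(roads_sf))
rim <- maptools::as.im.RasterLayer(r)
#pixellate() sums the length of line falling inside each pixel (no smoothing)
pix_res <- spatstat.geom::pixellate(roads_psp, W = as.owin(rim))
#Back to a RasterLayer; dividing by the pixel area would turn length into density
road_length <- raster::raster(pix_res)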
Perhaps you can use terra::rasterizeGeom, which is available in the development version that you can install with install.packages('terra', repos='https://rspatial.r-universe.dev').
Example data
library(terra)
f <- system.file("ex/lux.shp", package="terra")
v <- vect(f) |> as.lines()
r <- rast(v, res=.1)
Solution
x <- rasterizeGeom(v, r, fun="length", "km")
And then use focal sum, but you would not have a perfect circle.
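A hedged sketch of that focal-sum step (focalMat's distance is in the map units of r, which are degrees here, so the 0.25 is purely illustrative):
#Build a circular moving window and sum the per-cell road lengths inside it
w <- focalMat(r, 0.25, type="circle")
w[w > 0] <- 1   #use raw sums of length rather than a weighted mean
dens_sum <- focal(x, w, fun=sum, na.rm=TRUE)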
What you could do instead, if your dataset is not too large, is create a circle for each grid cell and use intersect. Something like this:
# one point at the centre of every grid cell
p <- xyFromCell(r, 1:ncell(r)) |> vect(crs="+proj=longlat")
p$id <- 1:ncell(r)
# 10 km buffer (the "circle") around each cell centre
b <- buffer(p, 10000)
# drop the attributes of v so intersect() only carries the buffer id
values(v) <- NULL
i <- intersect(v, b)
# total road length per buffer, written back into the raster
x <- aggregate(perim(i), list(id=i$id), sum)
r[x$id] <- x[,2]
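To express the result as km/km2 rather than total length in metres, a small follow-up step (assuming the 10 km buffer used above):
#perim() returns metres, so convert to km and divide by the buffer area in km2
dens <- (r / 1000) / (pi * 10^2)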
I would like to sample a big raster by creating n small rasters of 100x100 cells.
I don't know how to do that, so any ideas are welcome.
My current lead:
library(raster)
library(spatstat)
library(polyCub)
r <- raster(ncol=1000,nrow=1000) # create empty raster
r[] <- 1:(1000*1000) # Raster for testing
e <- extent(r) # get extent
# coerce to a SpatialPolygons object
p <- as(e, 'SpatialPolygons')
nc <- as.owin.SpatialPolygons(p) #polyCub
pts <- rpoint(50, win = nc)
plot(pts)
Now I need to generate a 100x100-cell square around each of my 50 points, crop r using those squares, and stack each small raster individually; a rough sketch of the cropping step is below ...
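A hedged sketch of that cropping step with raster itself (cropOne is a hypothetical helper; crop() snaps to cell boundaries, so each piece ends up roughly 100x100 cells, and points near the edge of r yield smaller pieces):
#Hypothetical helper: crop a window of about 100x100 cells centred on a point
rs <- res(r)
cropOne <- function(px, py) {
  crop(r, extent(px - 50*rs[1], px + 50*rs[1],
                 py - 50*rs[2], py + 50*rs[2]))
}
smallRasters <- lapply(seq_len(npoints(pts)),
                       function(i) cropOne(pts$x[i], pts$y[i]))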
The answer by @adrian-baddeley basically has the ingredients to do what you want. If you want a list of small im objects containing the 100x100 boxes, you can simply subset im objects by owin objects to extract the relevant region. Here is an example (with fewer points to avoid overplotting).
library(raster)
library(spatstat)
library(maptools)
r <- raster(ncol=1000,nrow=1000) # create empty raster
r[] <- 1:(1000*1000) # Raster for testing
e <- extent(r) # get extent
# coerce to a SpatialPolygons object
p <- as(e, 'SpatialPolygons')
nc <- as.owin.SpatialPolygons(p)
set.seed(42)
pts <- rpoint(7, win = nc)
rim <- as.im.RasterLayer(r)
Box <- owin(c(-50,50) * rim$xstep, c(-50,50) * rim$ystep)
The following is a list of im objects of size 100x100
imlist <- solapply(seq_len(npoints(pts)),
function(i) rim[shift(Box, pts[i])])
Here is a plot of the im objects in the region and the points on top
plot(pts)
for(i in imlist) plot(i, add = TRUE)
plot(pts, pch = 19, add = TRUE)
You can convert to a list of raster layers with
rasterList <- lapply(imlist, as, Class = "RasterLayer")
PS: The following is a list of im objects of the original size with
NA outside the 100x100 box if you need that format instead
imlist <- solapply(seq_len(npoints(pts)),
function(i) rim[shift(Box, pts[i]), drop = FALSE])
If you want to use spatstat then you need to convert the raster object r into an object of class im supported by spatstat. You can do this conversion with the maptools package; call the resulting image object rim. Then you can do as follows:
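#For concreteness, the conversion step mentioned above (also used in the other answer):
library(maptools)
rim <- as.im.RasterLayer(r)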
Box <- owin(c(-50,50) * rim$xstep, c(-50,50) * rim$ystep)
BoxesUnion <- MinkowskiSum(pts, Box)
W <- intersect.owin(as.mask(rim), BoxesUnion)
This would give you the subset of the raster that is covered by the squares.
If you want to keep the squares separate, do something like
M <- as.mask(rim)
BoxList <- solapply(seq_len(npoints(pts)),
function(i) intersect.owin(M, shift(Box, pts[i])))
Then BoxList is a list of the individual sub-rasters.
I have some gridded data of sea surface temperature values in the Mediterranean to which I've applied clustering. I have 420 files with a three-column structure (long, lat, value). The data for a particular file look like this map.
Now I want to extract the cluster areas as shapefiles for post-processing. I have found this post (https://gis.stackexchange.com/a/187800/9227) and tried to use its code like this:
# Packages
library(sp)
library(rgdal)
library(raster)
# Paths
ruta_datos<-"/home/meteo/PROJECTES/VERSUS/OUTPUT/DATA/CLUSTER_MED/"
setwd("~/PROJECTES/VERSUS/temp")
# File list
files <- list.files(path = ruta_datos, pattern = "SST-cluster-mitja-mensual")
for (i in 1:length(files)){
  datos <- read.csv(paste0(ruta_datos, files[i]), header=TRUE)
  nclusters <- max(datos$cluster)
  for (j in 1:nclusters){
    clust.dat <- subset(datos, cluster == j)
    coordinates(clust.dat) = ~longitud+latitud
    proj4string(clust.dat) = CRS("+init=epsg:4326")
    pts = spTransform(clust.dat, CRS("+init=epsg:4326"))
    gridded(pts) = TRUE
    r = raster(pts)
    projection(r) = CRS("+init=epsg:4326")
    # make all values the same. Either do
    s <- r > -Inf
    # convert to polygons
    pp <- rasterToPolygons(s, dissolve=TRUE)
    # save shapefile
    shname <- paste("SST-shape-", substr(files[i],27,32), "-", j, sep="")
    writeOGR(pp, dsn = '.', layer = shname, driver = "ESRI Shapefile")
  }
}
But the code stops with this error message:
gridded(pts) = TRUE
suggested tolerance minimum: 1
Error in points2grid(points, tolerance, round) :
  dimension 2 : coordinate intervals are not constant
Warning message:
In points2grid(points, tolerance, round) :
  grid has empty column/rows in dimension 1
I don't understand why, for a certain file, it says that the coordinate intervals are not constant when they indeed are; the original SST data from which the clustering was derived are on a regular grid over the whole globe. All cluster data files have the same size, 4248 points. A sample data file is available here.
What does the tolerance suggestion mean? I've been looking for a solution and found some suggestions to use SpatialPixelsDataFrame, but I couldn't figure out how to apply it.
Any help would be appreciated. Thanks.
I am not an expert in geospatial data, but as I see it, once you filter on a cluster the data are indeed no longer on a grid. As far as I understand, you start from a grid (a regular set of evenly spaced points), and subsetting a single cluster leaves holes in it.
I tried the following modifications to your code; some files are generated, but I can't test whether they are correct or not.
The principle is to build the grid on all the data and filter on the cluster only just before calling raster().
This gives:
files <- list.files(path = ruta_datos, pattern = "SST-cluster-mitja-mensual")
for (i in 1:length(files)){
  datos <- read.csv(paste0(ruta_datos, files[i]), header=TRUE)
  nclusters <- max(datos$cluster)
  for (j in 1:nclusters){
    ## clust.dat <- subset(datos, cluster == j)
    clust.dat <- datos
    coordinates(clust.dat) = ~longitud+latitud
    proj4string(clust.dat) = CRS("+init=epsg:4326")
    pts = spTransform(clust.dat, CRS("+init=epsg:4326"))
    gridded(pts) = TRUE
    ## r = raster(pts)
    r = raster(pts[pts$cluster==j,])
    projection(r) = CRS("+init=epsg:4326")
    # make all values the same. Either do
    s <- r > -Inf
    # convert to polygons
    pp <- rasterToPolygons(s, dissolve=TRUE)
    # save shapefile
    shname <- paste("SST-shape-", substr(files[i],27,32), "-", j, sep="")
    writeOGR(pp, dsn = '.', layer = shname, driver = "ESRI Shapefile")
  }
}
So, two lines are commented out, and only the line just below each one is updated.
My problem is simple. I have found a very good R package called adehabitat. To use it I need to transform my data into a specifically structured object containing the raster map data and the coordinates of an animal. To see it, please type:
# example data in the adehabitat package
data(bauges)
bauges
str(bauges)
How do I convert my data (below) into such a structure? I figured out how to convert $locs into SpatialPoints, but I don't know how to convert the map (in my example the raster values are categorical codes for individual habitat types, i.e. not a continuous variable).
# My example data:
library(raster)
library(adehabitatHS)
# map
habitat_type_temp <- matrix(c(1,1,1,1,1,1,1,1,2,2,
1,1,2,2,1,1,1,2,2,2,
1,2,2,2,3,3,3,2,2,2,
2,2,2,1,1,1,3,2,2,1,
2,2,1,1,1,1,3,2,1,1,
2,1,1,1,1,1,3,3,1,1,
2,1,1,1,1,3,3,3,3,1,
1,1,1,1,1,1,1,3,3,3), 10)
habitat_type <- t(habitat_type_temp)
# coordinates
animal_coords <- data.frame(x = c(2,4,5,5,6,9),
y = c(2,8,3,2,4,3))
# see the situation
plot(raster(habitat_type, xmn=1, xmx=10, ymn=1, ymx=8))
points(animal_coords$x, animal_coords$y)
# creating object which could be manipulated in adehabitat package
my.hab <- list()
my.hab$map <- SpatialPixelsDataFrame(...)
my.hab$locs <- SpatialPoints(animal_coords)
Is it even possible to insert such manually fabricated data into this specific type of object, or do I need an original tiff with a specific CRS?
You could just drop the grid at an arbitrary location to produce the SpatialPixelsDataFrame; I think this is roughly Iowa:
# coordinates for an 8 x 10 grid with 0.01 spacing (placed "roughly in Iowa")
x <- 93 + rep(1:8, each=10)/100
y <- rep(seq(42.01, 42.1, by=0.01), 8)
# habitat codes, one per cell
z <- c(1,1,1,1,1,1,1,1,2,2,
       1,1,2,2,1,1,1,2,2,2,
       1,2,2,2,3,3,3,2,2,2,
       2,2,2,1,1,1,3,2,2,1,
       2,2,1,1,1,1,3,2,1,1,
       2,1,1,1,1,1,3,3,1,1,
       2,1,1,1,1,3,3,3,3,1,
       1,1,1,1,1,1,1,3,3,3)
# build the pixel grid from the coordinates
xy.df <- data.frame(x, y)
xy.coords <- SpatialPixels(SpatialPoints(xy.df))
llCRS <- CRS("+proj=utm +zone=15 +ellps=WGS84")
xy.sp <- SpatialPoints(xy.coords, proj4string = llCRS)
# attach the habitat codes to the grid cells
xyz <- as.data.frame(cbind(x, y, z))
xyz.spdf <- SpatialPixelsDataFrame(xy.coords, xyz)
plot(xyz.spdf)
Your SpatialPoints would have to be changed similarly.
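A hedged sketch of what that could look like; the 1/100-degree offsets are purely illustrative and must match wherever you placed the grid:
# shift the animal coordinates into the same (made-up) block as the grid above
animal_ll <- data.frame(x = 93 + animal_coords$x/100,
                        y = 42 + animal_coords$y/100)
my.hab$map <- xyz.spdf
my.hab$locs <- SpatialPoints(animal_ll, proj4string = llCRS)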