I'm attempting to figure out why I'm getting error messages with some simple raster algebra after changing the extent. To demonstrate this, I thought I'd create a toy example following some code from another Stack Overflow question.
library(raster)
## Create a matrix with random data
xy <- matrix(rnorm(400),20,20)
# generate two extents to apply
globeExtent <- extent(c(-180, 180, -90, 90))
smallerExtent <- extent(c(-180, 180, -59.5, 83.5))
# Turn the matrix into a raster
rast.smallextent <- raster(xy)
extent(rast.smallextent) <- smallerExtent
rast.globeExtent <- setExtent(rast.smallextent, ext = globeExtent, keepres = TRUE)
mathtest <- rast.globeExtent - rast.smallextent
The mathtest line fails because rast.globeExtent has no values, so I can't actually use this to test for the errors I was seeing elsewhere. How do I expand the extent of this raster without losing all its data?
If I interpret the question correctly, what you need is not to change the extent of rast.smallextent, but to extend the raster, using the function extend(). Something like this:
library(raster)
#> Loading required package: sp
library(tmap)
## Create a matrix with random data
xy <- matrix(rnorm(400),20,20)
# generate two extents to apply
globeExtent <- extent(c(-180, 180, -90, 90))
smallerExtent <- extent(c(-180, 180, -20, 20))
# Turn the matrix into a raster
rast.smallextent <- raster(xy)
extent(rast.smallextent) <- smallerExtent
tmap::tm_shape(rast.smallextent) + tmap::tm_raster() + tmap::tm_grid()
# extend the raster over a wider area, while keeping the values
#
rast.globeExtent <- extend(rast.smallextent, globeExtent)
# Now rast.globeExtent is "expanded", but values are still there:
rast.globeExtent
#> class : RasterLayer
#> dimensions : 90, 20, 1800 (nrow, ncol, ncell)
#> resolution : 18, 2 (x, y)
#> extent : -180, 180, -90, 90 (xmin, xmax, ymin, ymax)
#> crs : NA
#> source : memory
#> names : layer
#> values : -3.606916, 2.795636 (min, max)
tmap::tm_shape(rast.globeExtent) + tmap::tm_raster() + tmap::tm_grid()
# Math now works, although the result is "cropped" to
# the intersecting area
rast.globeExtent <- rast.globeExtent + 1 #add 1 to check math is correct
mathtest <- rast.globeExtent - rast.smallextent
#> Warning in rast.globeExtent - rast.smallextent: Raster objects have different
#> extents. Result for their intersection is returned
mathtest
#> class : RasterLayer
#> dimensions : 20, 20, 400 (nrow, ncol, ncell)
#> resolution : 18, 2 (x, y)
#> extent : -180, 180, -20, 20 (xmin, xmax, ymin, ymax)
#> crs : NA
#> source : memory
#> names : layer
#> values : 1, 1 (min, max)
tmap::tm_shape(mathtest) + tmap::tm_raster() + tmap::tm_grid()
HTH!
Created on 2019-12-13 by the reprex package (v0.3.0)
I have a raster in R, and I need to select the highest cell values until 30% of the raster area is selected.
The way I've tried to accomplish this is by calculating the average cell area and then calculating how many cells I need to meet this 30% target (I know this is not entirely accurate). Then I sort the raster values in descending order. Here is where I'm stuck: of these sorted values, I need to set all cells beyond #12,678 to NA. I can't figure out how to set values to NA based on their place in an order. Does anyone know how to do this, or have a better idea for the entire process?
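To illustrate what I mean, here is a minimal toy sketch of the "set everything past rank n to NA" idea, with a made-up raster and cutoff n standing in for my real data and the 12,678 figure:
library(raster)
# toy stand-in for my real raster
r <- raster(matrix(runif(100), 10, 10))
n <- 30  # stand-in for the computed cell count (12,678 for my data)
# value of the n-th highest cell, then drop everything below it
cutoff <- sort(values(r), decreasing = TRUE)[n]
r[r < cutoff] <- NA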
You can use tidyterra for this.
Once you have identified your threshold (12,678), use tidyterra::filter() on your raster.
I don't know whether you are using raster or terra; I use terra here and show how to convert back to a Raster* object at the end.
In this example I present a full workflow with an example raster file; just adapt it to your data.
library(terra)
#> terra 1.5.21
library(tidyterra)
#> -- Attaching packages --------------------------------------- tidyterra 0.1.0 --
#>
#> Suppress this startup message by setting Sys.setenv(tidyterra.quiet = TRUE)
#> v tibble 3.1.7 v dplyr 1.0.9
#> v tidyr 1.2.0
# Create a SpatRaster from a file
f <- system.file("ex/elev.tif", package = "terra")
r <- rast(f)
# If you are using the raster package, use this line to switch it to terra:
# r <- rast(your_data)
r
#> class : SpatRaster
#> dimensions : 90, 95, 1 (nrow, ncol, nlyr)
#> resolution : 0.008333333, 0.008333333 (x, y)
#> extent : 5.741667, 6.533333, 49.44167, 50.19167 (xmin, xmax, ymin, ymax)
#> coord. ref. : lon/lat WGS 84 (EPSG:4326)
#> source : elev.tif
#> name : elevation
#> min value : 141
#> max value : 547
totarea_km <- expanse(r, unit = "km")
prettyNum(totarea_km, big.mark = ",")
#> [1] "2,563.61"
plot(r)
# There is a bug in slice_max, so use filter instead
# Get the threshold for filtering by working on the values of the raster
# as if it were a data frame
min_value <- as_tibble(r, na.rm = TRUE) %>%
  slice_max(order_by = elevation, prop = .3) %>%
  min()
# Here it goes!!
top30perc <- r %>% filter(elevation > min_value)
# Check
area_30perc <- expanse(top30perc, unit = "km")
prettyNum(area_30perc, big.mark = ",")
#> [1] "761.0991"
# Check
area_30perc / totarea_km
#> [1] 0.2968857
# Seems ok
plot(top30perc)
# If you need to convert to Raster* object
raster::stack(top30perc)
#> class : RasterStack
#> dimensions : 90, 95, 8550, 1 (nrow, ncol, ncell, nlayers)
#> resolution : 0.008333333, 0.008333333 (x, y)
#> extent : 5.741667, 6.533333, 49.44167, 50.19167 (xmin, xmax, ymin, ymax)
#> crs : +proj=longlat +datum=WGS84 +no_defs
#> names : elevation
#> min values : 387
#> max values : 547
Created on 2022-05-31 by the reprex package (v2.0.1)
If you assume a constant cell size, then you might as well ignore it and you can do something like this (find the quantile of interest and use that as a threshold):
library(terra)
r <- rast(system.file("ex/elev.tif", package = "terra"))
q <- global(r, \(i) quantile(i, 0.7, na.rm=T))
x <- ifel(r <= q[[1]], NA, r)
plot(x)
Check
g <- global(c(x,r), "notNA")
g[1,1]/g[2,1]
# 0.2979601
If variation in cell size is important, you could do something like the below (sort the values, and find the cells below the cumulative cell size threshold), but note that this approach is not memory-safe (it would fail with very large rasters):
s <- c(r, cellSize(r, unit="km"))
d <- as.data.frame(s, cell=T, na.rm=T)
d <- d[order(d$elevation, decreasing=TRUE), ]
d$sumarea <- cumsum(d$area) / sum(d$area)
cells <- d[d$sumarea <= 0.3, "cell"]
msk <- rast(r)
msk[cells] <- 1
x <- mask(r, msk)
plot(x)
I am trying to calculate the spatial correlation between two rasters. I have two large rasters with the same extent, resolution, etc.:
class : RasterLayer
dimensions : 45598, 53241, 2427683118 (nrow, ncol, ncell)
resolution : 30, 30 (x, y)
extent : 273366.8, 1870597, 367780.7, 1735721 (xmin, xmax, ymin, ymax)
These layers have a massive number of NA cells.
I tried to use terra::focalCor with the stack of those layers.
corr=focalCor(layerstack, w=9, cor)
But I have this issue
Error in v[[j - 1]] <- t(sapply(1:nrow(Y), function(i, ...) fun(X[i, ], :
more elements supplied than there are to replace
Any ideas or suggestions?
Cheers
It would have been easier to give a specific answer if you had provided actual data to reproduce your issue. In this case, though, it seems like you imported your gridded data using raster::raster(), creating a RasterLayer object, whereas according to ?focalCor, x clearly has to be a SpatRaster with at least two layers.
So, try terra::rast(c("grid_1.tif", "grid_2.tif")) |> terra::focalCor(w = 9, cor) instead.
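If the two grids are already loaded in memory as RasterLayer objects, here is a minimal sketch of the same idea (rlayer1 and rlayer2 are placeholder names for your objects):
library(terra)
# convert each RasterLayer to a SpatRaster and combine into one two-layer object
layerstack <- c(rast(rlayer1), rast(rlayer2))
corr <- focalCor(layerstack, w = 9, cor)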
Edit:
Thanks for your reprex. I took the liberty of reducing the dimensions and modifying the extent a little to cut processing time:
library(terra)
r <- rast(ncols = 100, nrows = 100,
          xmin = 0, xmax = 25, ymin = 0, ymax = 25,
          crs = "epsg:4326")
r1 <- init(r, fun = runif)
r2 <- init(r, fun = runif)
r_stack <- c(r1, r2)
r_stack_cor_5 <- focalCor(r_stack, w = 5, cor)
r_stack_cor_5
#> class : SpatRaster
#> dimensions : 100, 100, 1 (nrow, ncol, nlyr)
#> resolution : 0.25, 0.25 (x, y)
#> extent : 0, 25, 0, 25 (xmin, xmax, ymin, ymax)
#> coord. ref. : lon/lat WGS 84 (EPSG:4326)
#> source : memory
#> name : lyr1
#> min value : -0.6476946
#> max value : 0.6948594
r_stack_cor_25 <- focalCor(r_stack, w = 25, cor)
r_stack_cor_25
#> class : SpatRaster
#> dimensions : 100, 100, 1 (nrow, ncol, nlyr)
#> resolution : 0.25, 0.25 (x, y)
#> extent : 0, 25, 0, 25 (xmin, xmax, ymin, ymax)
#> coord. ref. : lon/lat WGS 84 (EPSG:4326)
#> source : memory
#> name : lyr1
#> min value : -0.1020998
#> max value : 0.1045798
I used fun = cor instead of function(x, y) cor(x, y), but the result is the same according to all.equal(). However, your example seems to work, and I'm failing to spot the issue at the moment.
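For reference, a minimal sketch of that comparison on the reprex objects above:
# same correlation with an anonymous function instead of passing cor directly
r_stack_cor_5b <- focalCor(r_stack, w = 5, function(x, y) cor(x, y))
all.equal(values(r_stack_cor_5), values(r_stack_cor_5b))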
I have a list of polygons that I want to use to subset a terra::rast brick. To do this, I use terra::crop together with lapply, as shown below, and it's fairly slow. Is there a vectorised way of subsetting with polygons rather than lapplying through the polygons?
Example
First, I load the libraries and create a rast object.
# Load library
library(terra)
library(geohashTools)
# Create raster
r <- rast(matrix(runif(360 * 2 * 180 *2), ncol = 360 * 2))
# Set extent
ext(r) <- c(-180, 180, -90, 90)
# Examine raster
r
## class : SpatRaster
## dimensions : 360, 720, 1 (nrow, ncol, nlyr)
## resolution : 0.5, 0.5 (x, y)
## extent : -180, 180, -90, 90 (xmin, xmax, ymin, ymax)
## coord. ref. :
## source : memory
## name : lyr.1
## min value : 7.853378e-07
## max value : 0.9999981
Next, I create a brick of these objects.
# Create a brick
b <- c(r, r, r, r, r)
b
## class : SpatRaster
## dimensions : 360, 720, 5 (nrow, ncol, nlyr)
## resolution : 0.5, 0.5 (x, y)
## extent : -180, 180, -90, 90 (xmin, xmax, ymin, ymax)
## coord. ref. :
## sources : memory
## memory
## memory
## ... and 2 more source(s)
## names : lyr.1, lyr.1, lyr.1, lyr.1, lyr.1
## min values : 7.853378e-07, 7.853378e-07, 7.853378e-07, 7.853378e-07, 7.853378e-07
## max values : 0.9999981, 0.9999981, 0.9999981, 0.9999981, 0.9999981
Here, I'm just creating a load of spatial polygons.
# All possible coordinates
coords <- expand.grid(-180:180, -90:89)
# Get all unique geohashes for raster
geohashes <- unique(gh_encode(coords$Var2, coords$Var1, precision = 4L))
# Convert to spatial polygons
sp <- geohashTools::gh_to_sp(geohashes)
Finally, I go through each polygon and use it to crop my brick.
# Crop raster using geohash polygon
b_cropped <- lapply(seq_along(sp), function(x) terra::crop(b, sp[x]))
Q: Is there a faster way to do this last step?
I have a data frame in which values (l) are specified for Cartesian coordinates (x, y) as in the following minimal working example.
set.seed(2013)
df <- data.frame( x = rep( 0:1, each=2 ),
                  y = rep( 0:1, 2),
                  l = rnorm( 4 ))
df
# x y l
# 1 0 0 -0.09202453
# 2 0 1 0.78901912
# 3 1 0 -0.66744232
# 4 1 1 1.36061149
I want to create a raster using the raster package, but my reading of the documentation has not revealed a simple method for loading data in this form into the raster cells. I've come up with a couple of ways to do it using for loops, but I suspect there's a much more direct approach that I'm missing.
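For example, here is a minimal sketch of the kind of loop I mean (the 2 x 2 grid dimensions are hard-coded to match df above):
library(raster)
# empty raster whose cells are centred on the coordinates in df
r <- raster(nrows = 2, ncols = 2, xmn = -0.5, xmx = 1.5, ymn = -0.5, ymx = 1.5)
# assign each value to the cell containing its (x, y) coordinate
for (i in seq_len(nrow(df))) {
  r[cellFromXY(r, cbind(df$x[i], df$y[i]))] <- df$l[i]
}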
An easier solution exists:
library(raster)
dfr <- rasterFromXYZ(df)  # convert the first two columns to lon-lat coordinates and the third to cell values
plot(dfr)
dfr
class : RasterLayer
dimensions : 2, 2, 4 (nrow, ncol, ncell)
resolution : 1, 1 (x, y)
extent : -0.5, 1.5, -0.5, 1.5 (xmin, xmax, ymin, ymax)
coord. ref. : NA
data source : in memory
names : l
values : -2.311813, 0.921186 (min, max)
Further, you may specify the CRS string. Detailed discussion is available here.
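For instance, a minimal sketch of passing a CRS directly (this assumes your coordinates really are lon/lat on WGS84):
dfr <- rasterFromXYZ(df, crs = "+proj=longlat +datum=WGS84")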
Here is one approach, via SpatialPixelsDataFrame
library(raster)
# create spatial points data frame
spg <- df
coordinates(spg) <- ~ x + y
# coerce to SpatialPixelsDataFrame
gridded(spg) <- TRUE
# coerce to raster
rasterDF <- raster(spg)
rasterDF
# class : RasterLayer
# dimensions : 2, 2, 4 (nrow, ncol, ncell)
# resolution : 1, 1 (x, y)
# extent : -0.5, 1.5, -0.5, 1.5 (xmin, xmax, ymin, ymax)
# coord. ref. : NA
# data source : in memory
# names : l
# values : -0.6674423, 1.360611 (min, max)
help('raster') describes a number of methods to create a raster from objects of different classes.
Updating this in response to @zubergu's comment about rasterizing irregular data.
The answer, adapted from the answer linked below, which possibly makes it even simpler to understand, is:
library(raster)
# rasterize() is provided by the raster package
# Suppose you have a dataframe like this
lon <- runif(20, -180, 180)
lat <- runif(20, -90, 90)
vals <- rnorm(20)
df <- data.frame(lon, lat, vals)
# will need to rename colnames for raster
colnames(df) <- c('x', 'y', 'vals')
# create a raster object
r_obj <- raster(xmn=-180, xmx=180, ymn=-90, ymx=90, resolution=c(5,5))
# use rasterize to create desired raster
r_data <- rasterize(x=df[, 1:2], # lon-lat data
                    y=r_obj, # raster object
                    field=df[, 3], # vals to fill raster with
                    fun=mean) # aggregate function
plot(r_data)
Original response:
For those of you, like @yuliaUU, looking to convert irregular data to a raster, please see @RobertH's answer here.
I have two SpatialPolygonsDataFrame files: dat1, dat2
extent(dat1)
class : Extent
xmin : -180
xmax : 180
ymin : -90
ymax : 90
extent(dat2)
class : Extent
xmin : -120.0014
xmax : -109.9997
ymin : 48.99944
ymax : 60
I want to crop the file dat1 using the extent of dat2, but I don't know how to do it. I have only handled raster files with the crop function before.
When I use this function for my current data, the following error occurs:
> r1 <- crop(BiomassCarbon.shp,alberta.shp)
Error in function (classes, fdef, mtable) :
unable to find an inherited method for function ‘crop’ for signature ‘"SpatialPolygonsDataFrame"’
Streamlined method added 2014-10-9:
raster::crop() can be used to crop Spatial* (as well as Raster*) objects.
For example, here's how you might use it to crop a SpatialPolygons* object:
## Load raster package and an example SpatialPolygonsDataFrame
library(raster)
data("wrld_simpl", package="maptools")
## Crop to the desired extent, then plot
out <- crop(wrld_simpl, extent(130, 180, 40, 70))
plot(out, col="khaki", bg="azure2")
Original (and still functional) answer:
The rgeos function gIntersection() makes this pretty straightforward.
Using mnel's nifty example as a jumping off point:
library(maptools)
library(raster) ## To convert an "Extent" object to a "SpatialPolygons" object.
library(rgeos)
data(wrld_simpl)
## Create the clipping polygon
CP <- as(extent(130, 180, 40, 70), "SpatialPolygons")
proj4string(CP) <- CRS(proj4string(wrld_simpl))
## Clip the map
out <- gIntersection(wrld_simpl, CP, byid=TRUE)
## Plot the output
plot(out, col="khaki", bg="azure2")
Here is an example of how to do this with rgeos using the world map as an example
This comes from Roger Bivand on R-sig-Geo mailing list. Roger is one of the authors of the sp package.
Using the world map as an example
library(maptools)
data(wrld_simpl)
# interested in the area bounded by the following rectangle
# rect(130, 40, 180, 70)
library(rgeos)
# create a polygon that defines the boundary
bnds <- cbind(x=c(130, 130, 180, 180, 130), y=c(40, 70, 70, 40, 40))
# convert to a spatial polygons object with the same CRS
SP <- SpatialPolygons(list(Polygons(list(Polygon(bnds)), "1")),
                      proj4string=CRS(proj4string(wrld_simpl)))
# find the intersection with the original SPDF
gI <- gIntersects(wrld_simpl, SP, byid=TRUE)
# create the new spatial polygons object.
out <- vector(mode="list", length=length(which(gI)))
ii <- 1
for (i in seq(along=gI)) if (gI[i]) {
  out[[ii]] <- gIntersection(wrld_simpl[i,], SP)
  row.names(out[[ii]]) <- row.names(wrld_simpl)[i]; ii <- ii+1
}
# use rbind.SpatialPolygons method to combine into a new object.
out1 <- do.call("rbind", out)
# look here is Eastern Russia and a bit of Japan and China.
plot(out1, col = "khaki", bg = "azure2")
You cannot use crop on sp polygon objects. You will need to create a polygon representing the bbox coordinates of dat2 and then use gIntersects()/gIntersection() from the rgeos library.
Edit: This comment referred to the version of raster available in 2012; this is no longer the case.
see ?crop
crop(x, y, filename="", snap='near', datatype=NULL, ...)
x   Raster* object
y   Extent object, or any object from which an Extent object can be
    extracted (see Details)
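With a current version of raster, a minimal sketch of this applied to the question's objects (assuming dat1 and dat2 share the same CRS):
library(raster)
# crop dat1 to the rectangular extent of dat2
r1 <- crop(dat1, extent(dat2))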
You need to rasterize the first SpatialPolygons object, using the rasterize() function from the raster package.
I create some data to show how to use rasterize:
n <- 1000
x <- runif(n) * 360 - 180
y <- runif(n) * 180 - 90
xy <- cbind(x, y)
vals <- 1:n
p1 <- data.frame(xy, name=vals)
p2 <- data.frame(xy, name=vals)
coordinates(p1) <- ~x+y
coordinates(p2) <- ~x+y
If I try:
crop(p1,p2)
unable to find an inherited method for function ‘crop’ for signature ‘"SpatialPointsDataFrame"’
Now using rasterize (we first need an empty template raster to rasterize onto):
r <- raster(ncols=36, nrows=18)
r <- rasterize(p1, r, 'name', fun=min)
crop(r,p2)
class : RasterLayer
dimensions : 18, 36, 648 (nrow, ncol, ncell)
resolution : 10, 10 (x, y)
extent : -180, 180, -90, 90 (xmin, xmax, ymin, ymax)
coord. ref. : +proj=longlat +datum=WGS84
data source : in memory
names : layer
values : 1, 997 (min, max)