Function to return values (velox raster)

I'm using the new velox extract function to speed up raster extraction by shapefiles.
The old raster package's extract function by default returned a list of cell values, for example when called in the following form:
val.list <- raster::extract(raster, shapefile)
The new velox package requires a fun= argument and I can't for the life of me get it to return the values:
vx.raster <- velox(raster)
vx.vals <- vx.raster$extract(shapefile, fun=??????)
I have tried:
fun=values (returns the error: Error during wrapup: unable to find an inherited method for function 'values' for signature 'numeric')
fun=function(x){values(x)} (same error as above)
I get fun=sum and fun=mean to work just fine. What's up with values? Am I just missing something obvious about a numeric vector and returning a list of values (which I feel is the most likely case)?
Thank you!
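For reference, here is a minimal self-contained sketch (with made-up data standing in for the asker's raster and shapefile) of the list-of-values behaviour the question describes:
library(raster)
r <- raster(ncols=10, nrows=10)
r[] <- 1:100
## a single rectangular polygon standing in for the shapefile
p <- as(extent(-90, 0, 0, 45), "SpatialPolygons")
str(raster::extract(r, p))  # a list with one numeric vector per polygon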

The development version of velox (on GitHub) now allows returning 'raw' raster values from a VeloxRaster_extract query. Just set the fun argument to NULL.
Here's an example:
library(devtools)
install_github('hunzikp/velox')
library(velox)
## Make VeloxRaster with two bands
set.seed(0)
mat1 <- matrix(rnorm(100), 10, 10)
mat2 <- matrix(rnorm(100), 10, 10)
vx <- velox(list(mat1, mat2), extent=c(0,1,0,1), res=c(0.1,0.1),
crs="+proj=longlat +datum=WGS84 +no_defs")
## Make SpatialPolygons
library(sp)
library(rgeos)
coord <- cbind(0.5, 0.5)
spoint <- SpatialPoints(coords=coord)
spols <- gBuffer(spgeom=spoint, width=0.5)
## Extract
vx$extract(sp=spols, fun=NULL)$buffer
# [,1] [,2]
# [1,] 1.27242932 0.04658030
# [2,] 0.41464143 -1.13038578
# [3,] -1.53995004 0.57671878
# etc....

Simply try this snippet:
vx.raster$crop(shapefile)
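Presumably the idea is to crop first, then extract; a hedged sketch of how that might fit together (VeloxRaster methods such as $crop() modify the object in place):
vx.raster$crop(shapefile)  # crop to the polygons' extent, in place
vx.vals <- vx.raster$extract(shapefile, fun=NULL)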

Zonal statistics to get majority pixel value per polygon in R?

I'm trying to calculate the majority pixel value per polygon from a raster, using a SpatialPolygonsDataFrame. Here is some code I found that might lead in the right direction:
library(raster)
# Create integer class raster
r <- raster(ncol=36, nrow=18)
r[] <- round(runif(ncell(r), 1, 10), digits=0)
r[] <- as.integer(r[])
# Create two polygons
cds1 <- rbind(c(-180,-20), c(-160,5), c(-60, 0), c(-160,-60), c(-180,-20))
cds2 <- rbind(c(80,0), c(100,60), c(120,0), c(120,-55), c(80,0))
polys <- SpatialPolygonsDataFrame(SpatialPolygons(list(Polygons(list(Polygon(cds1)), 1),
Polygons(list(Polygon(cds2)),2))),data.frame(ID=c(1,2)))
# Extract raster values to polygons
( v <- extract(r, polys) )
# Get class counts for each polygon
v.counts <- lapply(v, table)
So far everything is fine, but I'm really stuck on extracting the name of the column with the highest count.
I tried things like:
v.max<- lapply(v.counts,max)
But there the column information gets lost. And after:
v.max<- lapply(v.counts, max.col)
I just get "1" as the result.
I'd appreciate it if somebody could give me a hint about what I'm doing wrong. Is there another way to extract the majority pixel value within a polygon?
which.max() is your friend. Since you just want the names, use names().
sapply(v.counts, function(x) names(x)[which.max(x)])
# [1] "9" "5"
Note: set.seed(42)
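If you need numbers rather than names, a small follow-up (same v.counts as above):
as.numeric(sapply(v.counts, function(x) names(x)[which.max(x)]))
# [1] 9 5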
The exactextractr package can do this trick. In some cases it computes zonal statistics even faster than terra.
library(exactextractr)
exact_extract(r, polys, 'majority')
#> Warning in .exact_extract(x, sf::st_as_sf(y), ...): No CRS specified for
#> polygons; assuming they have the same CRS as the raster.
#> |======================================================================| 100%
#> [1] 4 2
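exact_extract() also accepts several summary operations in one call and returns a data frame; a usage sketch assuming the same r and polys:
exact_extract(r, polys, c('majority', 'mean'))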
You can use the modal function
v <- extract(r, polys, modal)
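The same call with the argument named, plus na.rm=TRUE to guard against NA cells (a sketch assuming the r and polys from the question):
v <- extract(r, polys, fun=modal, na.rm=TRUE)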

Error when attempting distance() with raster

I have been trying to get a graph from using distance() in the raster package. The raster dimensions are inherited from a SpatialPointsDataFrame. The raster works fine until I try distance(raster) and get the following warning:
Warning message:
In matrix(v, ncol = tr$nrows[1] + 3) :
data length [8837790] is not a sub-multiple or multiple of the number of rows [4384]
The bizarre thing is that the raster works at a smaller size but not a larger one. The error can be replicated below:
Fails:
library(raster)
r <- raster(ncol=4386,nrow=6039)
r[] <- NA
r[500] <- 1
dist <- distance(r)
plot(dist / 1000)
Works:
r <- raster(ncol=438.6,nrow=603.9)
r[] <- NA
r[500] <- 1
dist <- distance(r)
plot(dist / 1000)
Why? Have I missed something really obvious?
An update to raster_2.4-20 solved the problem. Thanks Pascal and RobertH for pointing me in the right direction.
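For anyone hitting the same warning, a quick sanity check of the installed version before and after updating (plain base R, nothing specific to this problem):
packageVersion("raster")
# install.packages("raster")  # update if older than 2.4-20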

Raster grid position/coordinates of pixel(s) matching a value in R

Is there a way to extract the grid position or (preferably, for rasters with an explicit extent) the point/centroid coordinates of the pixels that match a particular value? I currently have a pretty inefficient workflow: converting to a matrix and using which(mtrx == max(mtrx), arr.ind = TRUE) to get the matrix position(s). This (a) loses the geospatial information and (b) rotates the data 90 degrees during the matrix conversion, both of which require extra code to work around and slow the computation significantly. Is there an equivalent raster workflow anyone is aware of?
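For illustration, a hypothetical sketch of the matrix detour described above; matrix() fills column-wise, so the grid comes out transposed relative to the raster layout, and the x/y coordinates are lost:
library(raster)
set.seed(0)
r <- raster(ncols=10, nrows=10)
r[] <- sample(50, 100, replace=TRUE)
mtrx <- matrix(values(r), nrow=nrow(r))   # transposed unless byrow=TRUE
which(mtrx == max(mtrx), arr.ind = TRUE)  # row/col indices only, no x/y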
Example data:
library(raster)
set.seed(0)
r <- raster(ncols=10, nrows=10)
r[] <- sample(50, 100, replace=T)
Now do:
p <- rasterToPoints(r, function(x) x == 11)
To get
x y layer
[1,] 18 81 11
[2,] -126 63 11
[3,] -90 45 11
[4,] 54 -63 11
If you want the cell(s) with the maximum value:
vmax <- maxValue(r)
p <- rasterToPoints(r, function(x) x == vmax)
(do not use @data@max)
I do not understand why you would coerce to a matrix. Perhaps I am misunderstanding your question but, if I get you correctly, you can just query the raster values and then coerce to points to get the geographic position(s).
require(raster)
r <- raster(ncols=100, nrows=100)
r[] <- runif(ncell(r), 0,1)
# Coerce < max to NA and coerce result to points
rMax <- r
m = maxValue(r)
rMax[rMax != m] <- NA
( r.pts <- rasterToPoints (rMax) )
# You could also use the raster specific Which or which.max functions.
i <- which.max(r)
xy.max <- xyFromCell(r, i)
plot(r)
points(xy.max, pch=19, col="black")
# Or for a more general application of Which
i <- Which(r >= 0.85, cells=TRUE)
xy.max <- xyFromCell(r, i)
plot(r)
points(xy.max, pch=19, col="black")
# If you prefer a raster object set cells=FALSE
i <- Which(r >= 0.85, cells=FALSE)
plot(i)
There are multiple raster functions that allow you to pass custom or base functions to them. You may want to take a look at focal(), which is a moving-window operator, or calc(). It is also worth reading through the raster package help.
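Minimal sketches of those two, assuming a RasterLayer r like the one above:
# calc() applies a function across the values of each cell
r10 <- calc(r, function(x) x * 10)
# focal() applies a function over a moving window; here a 3x3 mean
rsm <- focal(r, w=matrix(1, 3, 3), fun=mean)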
To extend Jeffrey's answer, you can select the last instance of the lowest raster value with the following:
r <- raster(ncols=12, nrows=12)
set.seed(0)
r[] <- round(runif(ncell(r)) * 0.7)
rc <- clump(r)
rc[12, 8] <- 1
plot(rc)
xy.min <- data.frame(xyFromCell(rc, max(which.min(rc))))
xy.min$dat <- 1
coordinates(xy.min) <- ~x+y
points(xy.min, lwd=2)

Set single raster to NA where values of raster stack are NA

I have two 30m x 30m raster files from which I would like to sample points. Prior to sampling, I would like to remove the clouded areas from the images. I turned to R and Hijmans' raster package for the task.
Using the drawPoly(sp=TRUE) command, I drew 18 different polygons. The function did not seem to allow 18 polygons as one sp object, so I drew them all separately. I then gave the polygons a proj4string matching the rasters', and put them into a list. I ran the list through lapply to convert them to rasters (the rasterize function in Hijmans' package), with the polygon areas set to NA and the rest of the image set to 1.
My end goal is one raster layer with the 18 areas set to NA. I have tried stacking the list of rasterized polygons and subsetting it to set a new raster to NA in the same areas. My reproducible code is below.
library(raster)
r1 <- raster(nrow=50, ncol = 50)
r1[] <- 1
r1[4:10,] <- NA
r2 <- raster(nrow=50, ncol = 50)
r2[] <- 1
r2[9:15,] <- NA
r3 <- raster(nrow=50, ncol = 50)
r3[] <- 1
r3[24:39,] <- NA
r4 <- raster(nrow=50, ncol = 50)
r4[] <- 1
s <- stack(r1, r2, r3)
test.a.cool <- calc(s, function(x){r4[is.na(x)==1] <- NA})
For whatever reason, the darn test.a.cool is a blank plot, where I'm aiming for a raster with all values equal to 1 except for the NAs in the stack s.
Any tips?
Thanks.
Doing sum(s) will work, as sum() returns NA for any grid cell with even one NA value in the stack.
To see that it works, compare the figures produced by the following:
plot(s)
plot(sum(s))
I posted this question on the R-Sig-Geo forum, as well, and received a response from the package author. The two simplest solutions:
Use the sp package to rbind my polygons into one, then rasterize the polygon.
p <- rbind(p1, p2, p3...etc., makeUniqueIDs = TRUE)
r4 <- raster(nrow=50, ncol = 50)
r4[] <- 1
mask <- rasterize(p, r4)
mask[mask %in% 1:18] <- 1
#The above code produces a single raster file with
#my polygons as unique values, ready for masking.
And the second simple solution, as just pointed out by Josh O'Brien:
m <- sum(s)
test <- mask(r4, m)
The R community rocks. Problem solved (twice) within an hour. Thanks.
I'm not familiar with the package you are using; however, looking at the final line of your code, I think the issue might be here:
function(x){r4[is.na(x)==1] <- NA}
It doesn't look like calc will do much with that: it sets the values of r4 indexed by the NAs of x to NA, but the function never returns anything.
What then? If anything, maybe:
function(x){r4[is.na(x)==1] <- NA; return(r4)}
Although, it's not clear if that is even what you are after.
You were on the right track. The [ operator is defined for rasters and raster stacks, so you could just use the single line:
r4[any(is.na(s))] <- NA
plot(r4)
If you wanted to use calc you could have used it like this:
r4 <- calc(s, function(x){ !any(is.na(x)) })
r4[r4 == 0] <- NA
plot(r4)

Function for resizing matrices in R

I was wondering if there is a function in R that scales down matrices exactly like image resizing does. The imresize() function in MATLAB is exactly what I'm looking for (I believe it takes the average of the surrounding points, but I am not sure of this), and I am wondering if there is an R equivalent.
This question has been posted before on this forum, but with reference to MATLAB, not R:
Matlab "Scale Down" a Vector with Averages
The post starting with "Any reason why you can't use the imresize() function?" is exactly what I am looking for, but in R, not MATLAB.
Say I have a latitude-longitude grid of temperatures around the world, and let's say this is represented by a 64*128 matrix of temperatures. Now let's say I would like to have the same data contained in a new matrix, but I would like to rescale my grid to make it a 71*114 matrix of temperatures around the world. A function that would allow me to do so is what I'm looking for (again, the imresize() function, but in R, not MATLAB)
Thank you.
Steve
One way to do this is by using the function resample(), from the raster package.
I'll first show how you could use it to rescale your grid, and then give an easier-to-inspect example of its application to smaller raster objects.
Use resample() to resize matrices
library(raster)
m <- matrix(seq_len(64*128), nrow=64, ncol=128, byrow=TRUE)
## Convert matrix to a raster with geographical coordinates
r <- raster(m)
extent(r) <- extent(c(-180, 180, -90, 90))
## Create a raster with the desired dimensions, and resample into it
s <- raster(nrow=71, ncol=114)
s <- resample(r,s)
## Convert resampled raster back to a matrix
m2 <- as.matrix(s)
Visually confirm that resample() does what you'd like:
library(raster)
## Original data (4x4)
rr <- raster(ncol=4, nrow=4)
rr[] <- 1:16
## Resize to 5x5
ss <- raster(ncol=5, nrow=5)
ss <- resample(rr, ss)
## Resize to 3x3
tt <- raster(ncol=3, nrow=3)
tt <- resample(rr, tt)
## Plot for comparison
par(mfcol=c(2,2))
plot(rr, main="original data")
plot(ss, main="resampled to 5-by-5")
plot(tt, main="resampled to 3-by-3")
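One design note: resample() uses bilinear interpolation by default, which suits continuous data like temperatures; for categorical rasters, nearest-neighbour is usually safer:
uu <- resample(rr, ss, method="ngb")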
The answer posted by Josh O'Brien is OK, and it helped me as a starting point, but the approach was too slow for my huge list of data. The method below is a good alternative: it uses the fields package and works much faster.
Functions
rescale <- function(x, newrange=range(x)){
xrange <- range(x)
mfac <- (newrange[2]-newrange[1])/(xrange[2]-xrange[1])
newrange[1]+(x-xrange[1])*mfac
}
ResizeMat <- function(mat, ndim=dim(mat)){
if(!require(fields)) stop("`fields` required.")
# input object
odim <- dim(mat)
obj <- list(x= 1:odim[1], y=1:odim[2], z= mat)
# output object
ans <- matrix(NA, nrow=ndim[1], ncol=ndim[2])
ndim <- dim(ans)
# rescaling
ncord <- as.matrix(expand.grid(seq_len(ndim[1]), seq_len(ndim[2])))
loc <- ncord
loc[,1] = rescale(ncord[,1], c(1,odim[1]))
loc[,2] = rescale(ncord[,2], c(1,odim[2]))
# interpolation
ans[ncord] <- interp.surface(obj, loc)
ans
}
Let's look at how it works:
## Original data (4x4)
rr <- matrix(1:16, ncol=4, nrow=4)
ss <- ResizeMat(rr, c(5,5))
tt <- ResizeMat(rr, c(3,3))
## Plot for comparison
par(mfcol=c(2,2), mar=c(1,1,2,1))
image(rr, main="original data", axes=FALSE)
image(ss, main="resampled to 5-by-5", axes=FALSE)
image(tt, main="resampled to 3-by-3", axes=FALSE)
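Applied to the question's use case (assuming the ResizeMat() defined above and random stand-in temperatures), a 64-by-128 grid rescales to 71-by-114 like this:
temps <- matrix(rnorm(64*128), nrow=64, ncol=128)
temps71 <- ResizeMat(temps, c(71, 114))
dim(temps71)
# [1] 71 114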
