Calculate minimum distance between multiple polygons with R

I'm still somewhat new to R and the sf package...
I have two sets of multipolygon data that I am trying to analyze. My first set of polygons (fires) contains hundreds of wildfire perimeters. The second set (towns) contains hundreds of urban area boundaries.
For each fire, I would like to calculate the distance to the closest town (fire polygon edge to closest town polygon edge) and add that as a field to each fire.
So far I have mostly been using the sf package for spatial data. In my searches, I can only find minimum-distance methods for polygons to points, points to points, lines to points, etc., but cannot seem to find polygon-to-polygon examples. Any help to send me in the right direction would be much appreciated! Thank you.

@TimSalabim Thank you for sending me in the right direction. I was able to accomplish what I was after. Maybe not the most elegant solution, but it worked.
# load required packages
library(sf)
library(dplyr)
# create an index of the nearest feature in poly2 (towns) for each feature in poly1 (fires)
index <- st_nearest_feature(x = poly1, y = poly2)
# slice based on the index so each row of poly2 lines up with its fire
poly2 <- poly2 %>% slice(index)
# calculate the pairwise distance between polygons
poly_dist <- st_distance(x = poly1, y = poly2, by_element = TRUE)
# add the distance calculations to the fire polygons
poly1$distance <- poly_dist
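For reference, here is the same recipe applied directly to the fires and towns layers from the question. This is a minimal sketch assuming both are sf multipolygon objects in a common projected CRS; the column name distance_to_town is just illustrative.
library(sf)
library(dplyr)
# index of the nearest town for each fire
nearest_town <- st_nearest_feature(fires, towns)
# edge-to-edge distance from each fire to its nearest town, added as a field
fires$distance_to_town <- st_distance(fires, towns %>% slice(nearest_town), by_element = TRUE)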

Related

Assigning a covariate associated with spatial points to a bigger set of spatial points in R

I have two datasets with spatial points (in .csv format): data1 with 220 spatial points (latitude and longitude) and data2 with 80 spatial points (latitude and longitude). For data2 I have one covariate indicating the genetic origin of each point. The spatial points in the two datasets are not exactly the same.
I would like to assign a genetic origin to the spatial points in data1. It seems that I need to define a square (or other neighborhood) around each point in data2 to be able to associate a genetic origin with each point in data1.
I am using R and I think packages such as raster or sp may be useful.
Thanks for your help.
Best,
Marie.
You need to make up your mind about how you want to assign "genetic origin". One approach that you seem to be hinting at is assigning each point the origin of its nearest neighbor.
When asking a question you should always include some example data.
library(raster)
d1 <- data.frame(lon=c(1,5,55,31), lat=c(3,7,20,22))
d2 <- data.frame(lon=c(4,2,8,65,5,4), lat=c(50,-90,20,32,10,10), origin=LETTERS[1:6], stringsAsFactors=FALSE)
Here is how you can assign origin based on the nearest known origin:
# make sure your data are (x, y), i.e. (longitude, latitude), not the reverse
# compute the distance from every point in d1 to every point in d2
pd <- pointDistance(d1, d2[,1:2], lonlat=TRUE)
# for each d1 point, find the index of the nearest d2 point
nd <- apply(pd, 1, which.min)
# assign the origin of that nearest d2 point
d1$origin <- d2$origin[nd]
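A hedged alternative sketch with the sf package, assuming the same d1 and d2 data frames as above (with recent sf versions and s2 enabled, st_nearest_feature handles longitude/latitude points on the sphere):
library(sf)
# convert both data frames to sf point layers in WGS84 (EPSG:4326)
p1 <- st_as_sf(d1, coords = c("lon", "lat"), crs = 4326)
p2 <- st_as_sf(d2, coords = c("lon", "lat"), crs = 4326)
# for each d1 point, the index of the nearest d2 point
nearest <- st_nearest_feature(p1, p2)
# carry over the genetic origin of that nearest neighbor
p1$origin <- p2$origin[nearest]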

Create Grid in R for kriging in gstat

lat long
7.16 124.21
8.6 123.35
8.43 124.28
8.15 125.08
Consider these coordinates; they correspond to weather stations that measure rainfall data.
The intro to the gstat package in R uses the meuse dataset. At some point in this tutorial, https://rpubs.com/nabilabd/118172, the author makes use of a "meuse.grid" in this line of code:
data("meuse.grid")
I do not have such a file and I do not know how to create it. Can I create one using these coordinates? Or could you at least point me to material that discusses how to create a custom grid for a custom area (i.e. not using administrative boundaries from GADM)?
Probably wording this wrong; I don't even know if this question makes sense to R-savvy people. Still, I would love to hear some direction, or at least tips. Thanks a lot!
Total noob at R and statistics.
EDIT: See the sample grid in the tutorial I posted; that's the thing I want to make.
EDIT 2: Would this method be viable? https://rstudio-pubs-static.s3.amazonaws.com/46259_d328295794034414944deea60552a942.html
I am going to share my approach to create a grid for kriging. There are probably more efficient or elegant ways to achieve the same task, but I hope this will be a start to facilitate some discussions.
The original poster was thinking about 1 km for every 10 pixels, but that is probably too much. I am going to create a grid with a cell size of 1 km * 1 km. In addition, the original poster did not specify an origin for the grid, so I will spend some time determining a good starting point. I also assume that the Spherical Mercator coordinate system is the appropriate choice for the projection. This is a common projection for Google Maps or OpenStreetMap.
1. Load Packages
I am going to use the following packages. sp, rgdal, and raster are packages that provide many useful functions for spatial analysis. leaflet and mapview are packages for quick exploratory visualization of spatial data.
# Load packages
library(sp)
library(rgdal)
library(raster)
library(leaflet)
library(mapview)
2. Exploratory Visualization of the station locations
I created an interactive map to inspect the location of the four stations. Because the original poster provided the latitude and longitude of these four stations, I can create a SpatialPointsDataFrame with Latitude/Longitude projection. Notice the EPSG code for Latitude/Longitude projection is 4326. To learn more about EPSG code, please see this tutorial (https://www.nceas.ucsb.edu/~frazier/RSpatialGuides/OverviewCoordinateReferenceSystems.pdf).
# Create a data frame with the latitude/longitude of the stations
station <- data.frame(lat = c(7.16, 8.6, 8.43, 8.15),
                      long = c(124.21, 123.35, 124.28, 125.08),
                      station = 1:4)
# Convert to SpatialPointsDataFrame
coordinates(station) <- ~long + lat
# Set the projection. They were latitude and longitude, so use WGS84 long-lat projection
proj4string(station) <- CRS("+init=epsg:4326")
# View the station location using the mapview function
mapview(station)
The mapview function will create an interactive map. We can use this map to determine a suitable location for the origin of the grid.
3. Determine the origin
After inspecting the map, I decided that the origin could be around longitude 123 and latitude 7. This origin will be on the lower left of the grid. Now I need to find the coordinate representing the same point under Spherical Mercator projection.
# Set the origin
ori <- SpatialPoints(cbind(123, 7), proj4string = CRS("+init=epsg:4326"))
# Convert the projection of ori
# Use EPSG: 3857 (Spherical Mercator)
ori_t <- spTransform(ori, CRSobj = CRS("+init=epsg:3857"))
I first created a SpatialPoints object based on the latitude and longitude of the origin. After that, I used spTransform to perform the projection transformation. The object ori_t is now the origin in the Spherical Mercator projection. Notice that the EPSG code for Spherical Mercator is 3857.
To see the value of coordinates, we can use the coordinates function as follows.
coordinates(ori_t)
coords.x1 coords.x2
[1,] 13692297 781182.2
4. Determine the extent of the grid
Now I need to decide the extent of the grid that can cover all the four points and the desired area for kriging, which depends on the cell size and the number of cells. The following code sets up the extent based on the information. I have decided that the cell size is 1 km * 1 km, but I need to experiment on what would be a good cell number for both x- and y-direction.
# Round the origin to the nearest 100 m
x_ori <- round(coordinates(ori_t)[1, 1]/100) * 100
y_ori <- round(coordinates(ori_t)[1, 2]/100) * 100
# Define how many cells for x and y axis
x_cell <- 250
y_cell <- 200
# Define the resolution to be 1000 meters
cell_size <- 1000
# Create the extent
ext <- extent(x_ori, x_ori + (x_cell * cell_size), y_ori, y_ori + (y_cell * cell_size))
Based on the extent I created, I can create a raster layer with all values equal to 0. Then I can use the mapview function again to see whether the raster and the four stations match up well.
# Initialize a raster layer
ras <- raster(ext)
# Set the resolution to 1000 m * 1000 m
res(ras) <- c(cell_size, cell_size)
ras[] <- 0
# Project the raster
projection(ras) <- CRS("+init=epsg:3857")
# Create interactive map
mapview(station) + mapview(ras)
I repeated this process several times. Finally I decided that the number of cells is 250 and 200 for x- and y-direction, respectively.
5. Create spatial grid
Now I have created a raster layer with proper extent. I can first save this raster as a GeoTiff for future use.
# Save the raster layer
writeRaster(ras, filename = "ras.tif", format="GTiff")
Finally, to use the kriging functions from the package gstat, I need to convert the raster to SpatialPixels.
# Convert to spatial pixel
st_grid <- rasterToPoints(ras, spatial = TRUE)
gridded(st_grid) <- TRUE
st_grid <- as(st_grid, "SpatialPixels")
The st_grid object is a SpatialPixels object that can be used in kriging.
Determining a suitable grid is an iterative process. Throughout the process, users can change the projection, origin, cell size, or cell number depending on the needs of their analysis.
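To show where the grid fits in, here is a minimal, hedged sketch of ordinary kriging with gstat using st_grid as the prediction locations; rain_obs, its rain column, and the variogram model are hypothetical placeholders, and the observations must be in the same EPSG:3857 projection as the grid.
library(gstat)
library(sp)
# rain_obs: a hypothetical SpatialPointsDataFrame of rainfall observations,
# already transformed to EPSG:3857 so it matches st_grid
v_emp <- variogram(rain ~ 1, rain_obs)       # empirical variogram
v_fit <- fit.variogram(v_emp, vgm("Sph"))    # fit a spherical model
# ordinary kriging onto the SpatialPixels grid created above
kr <- krige(rain ~ 1, rain_obs, newdata = st_grid, model = v_fit)
spplot(kr["var1.pred"])                      # map of the predicted surface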
@yzw and @Edzer bring up good points for creating a regular rectangular grid, but sometimes there is the need to create an irregular grid over a defined polygon, usually for kriging.
This is a sparsely documented topic. One good answer can be found here. I expand on it with code below:
Consider the built-in meuse dataset. meuse.grid is an irregularly shaped grid. How do we make a grid like meuse.grid for our unique study area?
library(sp)
library(ggplot2)
data(meuse.grid)
ggplot(data = meuse.grid) + geom_point(aes(x, y))
Imagine an irregularly shaped SpatialPolygons or SpatialPolygonsDataFrame object, called spdf. You first build a regular rectangular grid over it, then subset the points in that regular grid by the irregularly shaped polygon.
# First, make a rectangular grid over your `SpatialPolygonsDataFrame`
grd <- makegrid(spdf, n = 100)
colnames(grd) <- c("x", "y")
# Next, convert the grid to `SpatialPoints` and subset these points by the polygon.
grd_pts <- SpatialPoints(
  coords = grd,
  proj4string = CRS(proj4string(spdf))
)
# subset all points in `grd_pts` that fall within `spdf`
grd_pts_in <- grd_pts[spdf, ]
# Then, visualize your clipped grid which can be used for kriging
ggplot(as.data.frame(coordinates(grd_pts_in))) +
geom_point(aes(x, y))
If you have your study area as a polygon, imported as a SpatialPolygons, you could either use package raster to rasterize it, or use sp::spsample to sample it using sampling type regular.
If you don't have such a polygon, you can create points regularly spread over a rectangular long/lat area using expand.grid, using seq to generate a sequence of long and lat values.
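A hedged sketch of both suggestions; study_poly is a hypothetical SpatialPolygons object for the study area, and the sample size, cell spacing, and long/lat bounds are purely illustrative.
library(sp)
# Option 1: regular points sampled inside an existing study-area polygon
grid_pts <- spsample(study_poly, n = 10000, type = "regular")
gridded(grid_pts) <- TRUE              # coerce to SpatialPixels for kriging
# Option 2: no polygon available - regular long/lat rectangle via expand.grid
grd <- expand.grid(long = seq(123, 125.5, by = 0.01),
                   lat  = seq(7, 9, by = 0.01))
coordinates(grd) <- ~ long + lat
proj4string(grd) <- CRS("+init=epsg:4326")
gridded(grd) <- TRUE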

How to get count of non-NA raster cells within polygon

I've been running into all sorts of issues using ArcGIS ZonalStats and thought R could be a great alternative. That said, I'm fairly new to R, but I have a coding background.
The situation is that I have several rasters and a polygon shape file with many features of different sizes (though all features are bigger than a raster cell and the polygon features are aligned to the raster).
I've figured out how to get the mean value for each polygon feature using the raster library with extract:
#load packages required
require(rgdal)
require(sp)
require(raster)
require(maptools)
# ---Set the working directory-------
datdir <- "/test_data/"
#Read in a ESRI grid of water depth
ras <- readGDAL("test_data/raster/pl_sm_rp1000/w001001.adf")
#convert it to a format recognizable by the raster package
ras <- raster(ras)
#read in polygon shape file
proxNA <- readShapePoly("test_data/proxy/PL_proxy_WD_NA_test")
#plot raster and shp
plot(ras)
plot(proxNA)
#calc mean depth per polygon feature
#unweighted - only assigns grid to district if centroid is in that district
proxNA@data$RP1000 <- extract(ras, proxNA, fun = mean, na.rm = TRUE, weights = FALSE)
#check results
head(proxNA)
#plot depth values
spplot(proxNA[,'RP1000'])
The issue I have is that I also need an area-based ratio between the area of the polygon and all non-NA cells in the same polygon. I know what the cell size of the raster is and I can get the area for each polygon, but the missing link is the count of all non-NA cells in each feature. I managed to get the cell numbers of all the cells in each polygon with proxNA@data$Cnumb1000 <- cellFromPolygon(ras, proxNA), and I'm sure there is a way to get the actual value of each raster cell, which would then require a loop to count all the non-NA cells, etc.
BUT, I'm sure there is a much better and quicker way to do that! If any of you has an idea or can point me in the right direction, I would be very grateful!
I do not have access to your files, but based on what you described, this should work:
library(raster)
# read the area-of-interest polygon and the raster to be summarized
mask_layer <- shapefile(paste0(shapedir, "AOI.shp"))
original_raster <- raster(paste0(template_raster_dir, "temp_raster_DecDeg250.tif"))
# convert to a 0/1 raster: 1 where the original raster is not NA
nonNA_raster <- !is.na(original_raster)
# mask by the polygon (based on the centroid location of cells)
masked_img <- mask(nonNA_raster, mask_layer)
# sum the 1s to get the count of non-NA cells within the polygon
nonNA_count <- cellStats(masked_img, sum)
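Note that cellStats returns a single total for the whole masked raster. For a count per polygon feature, as asked for with proxNA, here is a hedged sketch using raster::extract on the objects from the question; the new column names are illustrative, and the area ratio assumes a projected CRS in metres.
library(raster)
# extract() without a summary function returns a list with one vector of
# cell values per polygon feature
vals <- extract(ras, proxNA)
# count the non-NA cells in each feature and attach the result to the polygons
proxNA@data$nonNA_cells <- sapply(vals, function(x) sum(!is.na(x)))
# ratio of non-NA cell area to polygon area (assumes a projected CRS in metres)
cell_area <- prod(res(ras))
proxNA@data$cover_ratio <- (proxNA@data$nonNA_cells * cell_area) / area(proxNA)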

sp::over() for point in polygon analysis

I have a shapefile named "ind_adm" and a SpatialPointsDataFrame called "pnts". The "pnts" object contains points generated at random, and some of the points overlap with the polygon.
Now I want to do a point-in-polygon analysis, i.e. I want to find out which points lie inside the gray polygon representing the boundary of India. For this I am using the over() function in the sp library.
pt.in.poly <- sp::over(ind_adm, pnts, fn = mean) #do the join
However, the output I am getting is
>pt.in.poly
values
0 6.019467
I should actually get the index of the points that are "in" the polygon.
Where am I going wrong?
Found this concise and intuitive syntax for over:
pnts[ind_adm,]
from this Intro document
You should not supply a function. You are aggregating the attribute values of your points over the geometry of the polygon (i.e. the number returned is the mean of the attribute of the points that fall within the polygon). In addition, you have your x and y the wrong way round for what you want to do. It should be:
over(pnts, ind_adm, fn = NULL)
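Since the original poster actually wanted the indices of the points that lie inside the polygon, here is a hedged follow-up sketch assuming the same pnts and ind_adm objects:
library(sp)
# over() against the bare geometry returns, for each point, the index of the
# polygon it falls in, or NA if it is outside
hit <- over(pnts, as(ind_adm, "SpatialPolygons"))
# indices of the points inside the polygon, and the corresponding subset
inside_idx <- which(!is.na(hit))
pnts_inside <- pnts[inside_idx, ]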
You can use point.in.poly from the spatialEco package. It "intersects point and polygon feature classes and adds polygon attributes to points".
library(spatialEco)
new_shape <- point.in.poly(pnts, ind_adm)
You could also use the st_intersection function from the sf package:
Load the library
library(sf)
Create a simple feature geometry (polygon) from your polygon
ind_adm <- st_as_sf(ind_adm)
Create a simple feature geometry (point) from your points of interest
(here 24047 is used as the EPSG code; substitute the code that matches your data's CRS)
pnts <- st_as_sf(pnts) %>% st_set_crs(., 24047)
Keep only the points inside the polygon
kept_points <- st_intersection(ind_adm, pnts)

Intersecting Points and Polygons in R

I am working with shapefiles in R: one is point.shp, the other is polygon.shp.
Now, I would like to intersect the points with the polygon, meaning that all the values from the polygon should be attached to the table of the point.shp.
I tried overlay() and spRbind in package sp, but nothing did what I expected them to do.
Could anyone give me a hint?
With the new sf package this is now fast and easy:
library(sf)
out <- st_intersection(points, poly)
Additional options
If you do not want all fields from the polygon added to the point feature, just call dplyr::select() on the polygon feature before:
library(magrittr)
library(dplyr)
library(sf)
# keep only the polygon columns you need (column_name1, column_name2 are placeholders)
poly <- poly %>% select(column_name1, column_name2)
out <- st_intersection(points, poly)
If you encounter issues, make sure that your polygon is valid:
st_is_valid(poly)
If you see some FALSE outputs here, try to make it valid:
poly <- st_make_valid(poly)
Note that these 'valid' functions depend on a sf installation compiled with liblwgeom.
If you do overlay(pts, polys), where pts is a SpatialPointsDataFrame object and polys is a SpatialPolygonsDataFrame object, then you get back a vector the same length as the points giving the row of the polygons data frame that each point falls in. So all you then need to do to combine the polygon data onto the points data frame is:
o = overlay(pts, polys)
pts@data = cbind(pts@data, polys[o,])
HOWEVER! If any of your points fall outside all your polygons, then overlay returns an NA, which will cause polys[o,] to fail, so either make sure all your points are inside polygons or you'll have to think of another way to assign values for points outside the polygon...
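In current versions of sp, overlay() has been superseded by over(), which sidesteps the NA problem because it returns a data frame with NA rows for points outside every polygon. A hedged sketch assuming the same pts and polys objects:
library(sp)
# one row of polygon attributes per point; rows are all NA for points that
# fall outside every polygon
o <- over(pts, polys)
# cbind works even with the NA rows, so no points are dropped
pts@data <- cbind(pts@data, o)
# or keep only the points that fall inside some polygon
pts_inside <- pts[!is.na(over(pts, as(polys, "SpatialPolygons"))), ]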
You can do this in one line with point.in.poly from the spatialEco package.
library(spatialEco)
new_shape <- point.in.poly(pts, polys)
From the documentation: point.in.poly "intersects point and polygon feature classes and adds polygon attributes to points".

Resources