I am having some issues with st_intersection. I am trying to intersect polygons from bus stop buffers in preparation for areal interpolation. Here is the data: https://realtime.commuterpage.com/rtt/public/utility/gtfs.aspx
Here is my code:
ART2019Path <- file.path(GTFS_path, "2019-10_Arlington.zip")
ART2019GTFS <- read_gtfs(ART2019Path)
ART2019StopLoc <- stops_as_sf(ART2019GTFS$stops) ### Make a spatial file for stops
ART2019Buffer <- st_buffer(ART2019StopLoc, dist = 121.92) ### Make buffer with 400ft (121.92m) radius
It creates something that looks like the image below (created using mapview); as you can see, there are multiple overlapping buffers.
I tried intersecting the polygons using the following:
BufferIntersect <- st_intersection(ART2019Buffer, ART2019Buffer)
BufferIntersect <- st_make_valid(BufferIntersect) ### Fix some of the polygons that didn't quite work
But it only intersects two layers of polygons, meaning there is still overlap. How do I make all buffers intersect?
I have looked at similar questions on here like this: Loop to check multiple polygons overlap in r SF package
But there is no answer.
One of the comments suggested the following links:
https://r-spatial.org/r/2017/12/21/geoms.html
https://r-spatial.github.io/sf/reference/geos_binary_ops.html#details-1
But I can't get either to work. Any help would be greatly appreciated.
Edit
Couple of clarifying points in response to some comments.
I am interested in the area of each unique polygon within the bus stop buffers, as I will be using these polygons in an areal interpolation with census data to estimate the population with access to bus stops.
A 400 ft walking distance is standard practice for bus stop accessibility.
It sounds like you just want the buffer(s), but without having to deal with all of the overlapping sections. It doesn't matter if a person is within 400ft of one bus-stop or three, right?
If so, you can use the st_union function to "blend" the buffers together.
library(tidytransit)
library(sf)
library(mapview)
library(ggplot2)
# s2 true allows buffering in meters, s2 off later speeds things up
sf::sf_use_s2(TRUE)
ART2019Path <- file.path("/your/file/path/")
ART2019GTFS <- read_gtfs(ART2019Path)
ART2019StopLoc <- stops_as_sf(ART2019GTFS$stops) ### Make a spatial file for stops
ART2019Buffer <- st_buffer(ART2019StopLoc, dist = 121.92) ### Make buffer with 400ft (121.92m) radius
# might be needed due to some strange geometries in buffer, and increase speed
sf::sf_use_s2(FALSE)
#> Spherical geometry (s2) switched off
# MULTIPOLYGON sfc object covering only the buffered areas,
# there are no 'overlaps'.
buff_union <- st_union(st_geometry(ART2019Buffer))
#> although coordinates are longitude/latitude, st_union assumes that they are planar
buff_union
#> Geometry set for 1 feature
#> Geometry type: MULTIPOLYGON
#> Dimension: XY
#> Bounding box: xmin: -77.16368 ymin: 38.83828 xmax: -77.04768 ymax: 38.9263
#> Geodetic CRS: WGS 84
#> MULTIPOLYGON (((-77.08604 38.83897, -77.08604 3...
# Non-overlapping buffer & stops
ggplot() +
  geom_sf(data = buff_union, fill = 'blue', alpha = .4) +
  geom_sf(data = ART2019StopLoc, color = 'black') +
  coord_sf(xlim = c(-77.09, -77.07),
           ylim = c(38.885, 38.9))
# Overlapping buffer & stops
ggplot() +
  geom_sf(data = ART2019Buffer, fill = 'blue', alpha = .4) +
  geom_sf(data = ART2019StopLoc, color = 'black') +
  coord_sf(xlim = c(-77.09, -77.07),
           ylim = c(38.885, 38.9))
# Back to original settings
sf::sf_use_s2(TRUE)
#> Spherical geometry (s2) switched on
Created on 2022-04-18 by the reprex package (v2.0.1)
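If what ultimately feeds the areal interpolation is just the total covered area (rather than the individual pieces), it can be read straight off the union. A minimal follow-up sketch, assuming the buff_union object created above; with geographic coordinates sf returns a geodetic area in square meters:
# total area within 400ft (121.92m) of any stop, in square meters
st_area(buff_union)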
Something like this works for me, albeit a bit slowly. Here I loop through each stop buffer and run the intersection process on an object containing all other stop buffers excluding that stop buffer.
library(sf)
library(tidyverse)
df<-read.csv("YOUR_PATH/google_transit/stops.txt")
# Read data
ART2019StopLoc <- st_as_sf(df, coords=c('stop_lon', 'stop_lat'))
ART2019StopLoc <- st_set_crs(ART2019StopLoc, value=4326)
# Make buffer
ART2019Buffer <- st_buffer(ART2019StopLoc, dist=121.92)
# Create empty data frame to store results
results <- data.frame()
# Loop through each stop and intersect with other stops
for (i in 1:nrow(ART2019Buffer)) {
  # Subset to stop of interest
  stop <- ART2019Buffer[i, ]
  # Subset to all other stops excl. stop of interest
  stop_check <- ART2019Buffer[-i, ]
  # Intersect and make valid
  stop_intersect <- st_intersection(stop, stop_check) %>%
    st_make_valid()
  # Create one intersected polygon
  stop_intersect <- st_combine(stop_intersect) %>%
    st_as_sf() %>%
    mutate(stop_name = stop$stop_name)
  # Combine into one results object
  results <- rbind(results, stop_intersect)
  print(i)
}
ggplot() +
  geom_sf(data = ART2019Buffer %>% filter(stop_name %in% results$stop_name),
          fill = 'gray70') +
  geom_sf(data = results, aes(fill = stop_name), alpha = 0.5)
The plot below shows the results for the first 8 stops. The gray circles are the original stop buffers and the colored buffers show the intersection with adjacent buffers.
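As a possible alternative to the loop (not part of the answer above, so treat it as a sketch to verify): the geos_binary_ops help page linked in the question documents a single-argument form of st_intersection() that splits an entire layer into its unique, non-overlapping pieces in one call, recording how many buffers cover each piece in an n.overlaps column. Assuming the ART2019Buffer object from the question and the packages already loaded:
# sketch: self-intersection of the whole buffer layer into unique pieces
BufferPieces <- st_intersection(ART2019Buffer) %>%
  st_make_valid()

# area of each unique polygon, e.g. as input for the areal interpolation
BufferPieces$area_m2 <- st_area(BufferPieces)
head(BufferPieces[, c("n.overlaps", "area_m2")])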
Related
I want to make an equal area grid (400 square miles per grid cell) over Wisconsin. I am doing this using the code from this link: Creating an equal distance spatial grid in R.
But this code isn't very flexible, and I also need the grid to be more than just polygons. I need it to be a shapefile. I like the terra package, but am unable to figure out how to do this with it. The WI shapefile can be downloaded from https://data-wi-dnr.opendata.arcgis.com/datasets/wi-dnr::wisconsin-state-boundary-24k/explore.
My code looks like this:
library(sf)
library(terra)
library(tidyverse)
wi_shape <- vect('C:\\Users\\ruben\\Downloads\\Wisconsin_State_Boundary_24K\\Wisconsin_State_Boundary_24K.shp')
plot(wi_shape)
wi_grid <- st_make_grid(wi_shape, square = T, cellsize = c(20 * 1609.344, 20 * 1609.344))
plot(wi_grid, add = T)
How do I define a grid that is centered on a lat/lon point, where the output is a shapefile that contains attributes for each grid cell? I'm not sure why this is so confusing to me. Thank you.
If your goal is to make a raster based on the extent of another spatial dataset (polygons in this case) you can do
library(terra)
wi <- vect('Wisconsin_State_Boundary_24K.shp')
r <- rast(wi, res=(20 * 1609.344))
You can turn these into polygons and write them to a file with
v <- as.polygons(r)
writeVector(v, "test.shp")
To define a lon/lat center for the grid, you could do the following.
Coordinates of an example lon/lat point projected to the crs of your polygons (Wisconsin Transverse Mercator).
center <- cbind(-90, 45) |> vect(crs="+proj=longlat")
cprj <- crds(project(center, wi))
res <- 20 * 1609.344
Create a single cell around that point and expand the raster:
e <- rep(cprj, each=2) + c(-res, res) / 2
x <- rast(ext(e), crs=crs(wi), ncol=1, nrow=1)
x <- extend(x, wi, snap="out")
The result
plot(as.polygons(x), border="blue")
lines(wi, col="red")
points(cprj, pch="x", cex=2)
I should also mention that you are not using an equal-area coordinate reference system. You can see the variation in cell sizes with
a <- cellSize(x)
But it is very small (less than 1%) relative to the average cell size
diff(minmax(a))
# area
#max 1690441
global(a, mean)
# mean
#area 1036257046
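If an exactly equal-area grid matters, one option (an assumption on my part, not something the original workflow requires) is to project the polygons to an equal-area CRS such as CONUS Albers (EPSG:5070) before building the raster:
# sketch: build the grid in an equal-area projection instead
wi_ea <- project(wi, "EPSG:5070")
r_ea <- rast(wi_ea, res = 20 * 1609.344)
v_ea <- as.polygons(r_ea)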
Let's try to tidy this a little bit.
[...] and I also need the grid to be more than just polygons. I need it to be a shapefile.
It's exactly the other way around from my point of view. Once you have obtained a proper representation of a polygon, you can export it to whatever (supported) format you like, e.g. an ESRI Shapefile.
I like the Terra package, but am unable to figure out how to do this in the terra package.
Maybe you did not notice, but actually you are not really using {terra} to create your grid, but {sf} (with SpatVector input from terra, which is accepted here).
library(sf)
#> Linking to GEOS 3.9.3, GDAL 3.5.2, PROJ 8.2.1; sf_use_s2() is TRUE
library(terra)
#> terra 1.6.33
wi_shape <- vect('Wisconsin_State_Boundary_24K.shp')
class(wi_shape)
#> [1] "SpatVector"
#> attr(,"package")
#> [1] "terra"
wi_grid <- st_make_grid(wi_shape, square = T, cellsize = c(20 * 1609.344, 20 * 1609.344))
class(wi_grid)
#> [1] "sfc_POLYGON" "sfc"
It's a minor adjustment, but basically you can cut this dependency here for now. Also - although I'm not sure if this is the type of flexibility you are looking for - I found it very pleasant to work with {units} recently for conversions like square miles to meters. In the end, once your code is running properly, you can replace your hardcoded values with variables step by step and wrap a function around this. It should not be a big deal.
In order to shift your grid to be centered on a specific lat/lon point, you can leverage the offset attribute of st_make_grid(). However, since this only shifts the grid based on the original extent, you might lose coverage with this approach:
library(sf)
#> Linking to GEOS 3.9.3, GDAL 3.5.2, PROJ 8.2.1; sf_use_s2() is TRUE
wi_shape <- read_sf("Wisconsin_State_Boundary_24K.shp")
# area of 400 square miles
A <- units::as_units(400, "mi^2")
# cell side length, converted to meters to fit the metric projection
b <- sqrt(A)
units(b) <- "m"
# let's assume you wanted your grid to be centered on 45.5° N / 89.5° W
p <- c(-89.5, 45.5) |>
  st_point() |>
  st_sfc(crs = "epsg:4326") |>
  st_transform("epsg:3071") |>
  st_coordinates()
p
#> X Y
#> 1 559063.9 558617.2
# create an initial grid for centroid determination
wi_grid <- st_make_grid(wi_shape, cellsize = c(b, b), square = TRUE)
# determine the centroid of your grid created
wi_grid_centroid <- wi_grid |>
  st_union() |>
  st_centroid() |>
  st_coordinates()
wi_grid_centroid
#> X Y
#> 1 536240.6 482603.9
# this should be your vector of displacement, expressed as the difference
delta <- wi_grid_centroid - p
delta
#> X Y
#> 1 -22823.31 -76013.3
# `st_make_grid(offset = ...)` requires lower left corner coordinates (x, y) of the grid,
# so you need some extent information which you can acquire via `st_bbox()`
bbox <- st_bbox(wi_grid)
# compute the adjusted lower left corner
llc_new <- c(st_bbox(wi_grid)["xmin"] + delta[1], st_bbox(wi_grid)["ymin"] + delta[2])
# create your grid with an offset
wi_grid_offset <- st_make_grid(wi_shape, cellsize = c(b, b), square = TRUE, offset = llc_new) |>
  st_as_sf()
# append attributes
n <- dim(wi_grid_offset)[1]
wi_grid_offset[["id"]] <- paste0("A", 1:n)
wi_grid_offset[["area"]] <- st_area(wi_grid_offset) |> as.numeric()
# inspect
plot(st_geometry(wi_shape))
plot(st_geometry(wi_grid_offset), border = "red", add = TRUE)
If you wanted to export your polygon features ("grid") in shapefile format, simply make use of st_write(wi_grid_offset, "wi_grid_offset.shp").
PS: For this example you need none of the tidyverse stuff, so there is no need to load it.
I'm trying to check whether two polygons intersect in R. When plotting, they clearly do not. When checking the intersection, rgeos::gIntersects() currently returns FALSE, while sf::st_intersects() returns TRUE. I imagine this is due to the polygons being (1) large and (2) close together: on a flat surface they don't appear to intersect, but on a sphere they would.
Ideally I could keep my workflow all in sf -- but I'm wondering if there's a way to use sf::st_intersects() (or another sf function) that will return FALSE here?
Here's an example:
library(sf)
library(rgeos)
library(leaflet)
library(leaflet.extras)
#### Make Polygons
poly_1 <- c(xmin = -124.75961, ymin = 49.53330, xmax = -113.77328, ymax = 56.15249) %>%
  st_bbox() %>%
  st_as_sfc()
st_crs(poly_1) <- 4326
poly_2 <- c(xmin = -124.73214, ymin = 25.11625, xmax = -66.94889, ymax = 49.38330) %>%
  st_bbox() %>%
  st_as_sfc()
st_crs(poly_2) <- 4326
#### Plotting
# Visually, the polygons do not intersect
leaflet() %>%
  addTiles() %>%
  addPolygons(data = poly_1) %>%
  addPolygons(data = poly_2)
#### Check Intersection
# returns FALSE
gIntersects(poly_1 %>% as("Spatial"),
            poly_2 %>% as("Spatial"))
# returns TRUE
st_intersects(poly_1,
              poly_2,
              sparse = F)
Here are the polygons, which visually do not intersect.
This is an interesting problem, with the root cause being difference between planar (on a flat surface) and spherical (on a globe) geometry.
On a plane - which is the simplified approach that GEOS takes - the four corners of a polygon are connected by four straight lines, the sum of angles is 360°, etc. Geometry works as Euclid taught ages ago.
But, and this is crucial, this is not how the world works. On a globe the four connections of a polygon are not straight lines but great circles. Or rather, they are straight when drawn on a globe, and curved when rolled flat onto a planar surface (such as a map or your computer screen).
Because an example is worth more than a thousand words, consider this piece of code:
library(sf)
library(dplyr)
# make polygons
poly_1 <- c(xmin = -124.75961, ymin = 49.53330, xmax = -113.77328, ymax = 56.15249) %>%
  st_bbox() %>%
  st_as_sfc()
st_crs(poly_1) <- 4326
poly_2 <- c(xmin = -124.73214, ymin = 25.11625, xmax = -66.94889, ymax = 49.38330) %>%
  st_bbox() %>%
  st_as_sfc()
st_crs(poly_2) <- 4326
# this is what you *think* you see (and what GEOS sees, being planar)
# = four corners connected by straight lines
# & no intersecton
mapview::mapview(list(poly_1, poly_2))
# this is what you *should* see (and what {s2} sees, being spherical)
# = four corners connected by great circles
# & an obvious intersection around the lower right corner of the small polygon
poly_1alt <- st_segmentize(poly_1, units::set_units(1, degree))
poly_2alt <- st_segmentize(poly_2, units::set_units(1, degree))
mapview::mapview(list(poly_1alt, poly_2alt))
You have two options:
accept that your thinking about polygons was wrong, and embrace the spherical, i.e. {s2}, logic.
This should in theory be the correct approach, but it is somewhat counterintuitive.
make {sf} abandon the spherical approach to polygons, and force it to apply the planar approach (such as GEOS uses).
This would in theory be the wrong approach, but it is consistent both with your planar intuition and with the previous behaviour of most GIS tools, including {sf} prior to version 1.0.0.
# remedy (of a kind...) = force planar geometry operations
sf::sf_use_s2(FALSE)
st_intersects(poly_1, poly_2, sparse = F)
# [,1]
# [1,] FALSE
Using the R package sf, I'm trying to determine whether some points occur within the bounds of a shapefile (in this case, Hawai‘i's EEZ). The shapefile in question can be found here. Unfortunately, the boundaries of the area in question span +/-180 longitude, which I think is what's messing me up. (I read on the sf website some business about spherical geometry in the new version, but I haven't been able to get that version to install. I think the polygons I'm dealing with are sufficiently "flat" to avoid any of those issues anyway.) Part of the issue seems to be that my shapefile contains multiple geometries broken up by the dateline, but I'm not sure how to combine them.
How do you tell, using sf, whether some points are inside of the bounds of some object in a shapefile (that happens to span the dateline)?
I have tried various combinations of st_shift_longitude to no avail. I have also tried transforming to what I think is a planar projection (2163), and that didn't work.
Here's how I'm currently trying to do this:
library(sf)
library(maps)
library(ggplot2)
library(tidyverse)
# this is the shapefile from the link above
eez_unshifted <- read_sf("USMaritimeLimitsAndBoundariesSHP/USMaritimeLimitsNBoundaries.shp") %>%
  filter(OBJECTID == 1206) %>%
  st_transform(4326)
eez_shifted <- read_sf("USMaritimeLimitsAndBoundariesSHP/USMaritimeLimitsNBoundaries.shp") %>%
  filter(OBJECTID == 1206) %>%
  st_transform(4326) %>%
  st_shift_longitude()
# four points, in and out of the geometry, on either side of the dateline
pnts <- tibble(x = c(-171.952474, 176.251978, 179.006220, -167.922929),
               y = c(25.561970, 17.442716, 28.463375, 15.991429)) %>%
  st_as_sf(coords = c('x', 'y'), crs = st_crs(eez_unshifted))
# these all return false for every point
st_within(pnts,eez_unshifted)
st_within(st_shift_longitude(pnts),eez_unshifted)
st_within(pnts,eez_shifted)
st_within(st_shift_longitude(pnts),eez_shifted)
# these also all return false for every point
st_intersects(pnts,eez_unshifted)
st_intersects(st_shift_longitude(pnts),eez_unshifted)
st_intersects(pnts,eez_shifted)
st_intersects(st_shift_longitude(pnts),eez_shifted)
# plot the data just to show that it looks right
wrld2 <- st_as_sf(maps::map('world2', plot=F, fill=T))
ggplot() +
  geom_sf(data = wrld2, fill = 'gray20', color = "lightgrey", size = 0.07) +
  geom_sf(data = eez_shifted) +
  geom_sf(data = st_shift_longitude(pnts)) +
  coord_sf(xlim = c(100, 290), ylim = c(-60, 60)) +
  xlab("Longitude") +
  ylab("Latitude")
The answer is to make sure the geometry you're checking against is a polygon:
> eez_poly <- st_polygonize(eez_shifted)
> st_within(pnts,eez_poly)
although coordinates are longitude/latitude, st_within assumes that they are planar
Sparse geometry binary predicate list of length 4, where the predicate was `within'
1: 1
2: (empty)
3: 1
4: (empty)
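If the polygonized result still arrives as several features split at the dateline (as the question suggests), merging them into a single polygon before testing may help. This is only a sketch built on the objects above; st_polygonize() returns geometry collections, so the polygon parts are extracted first:
# sketch: extract the polygon parts and merge them into one geometry
eez_one <- st_union(st_collection_extract(st_polygonize(eez_shifted), "POLYGON"))
st_within(st_shift_longitude(pnts), eez_one, sparse = FALSE)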
Rookie R user here, and I would greatly appreciate any help someone could give me.
My project requires me to create a vector bounding box around a city of my choice and then filter a lot of data so I only have the data relevant to that area. However, it is several years since I have used RStudio, and it's fair to say I remember little to nothing about the language.
I have initially used
geocode("Hereford, UK")
bbox <-c(Longitude=-2.72,Latitude=52.1)
myMap <- get_map(location = "Hereford, UK",source="google",maptype="roadmap")
I then must create a new tibble that filters the data down to only what is relevant to the area.
I am unsure how to proceed with this, and I then must overlay the data onto the map which I have created.
As I only have a centre point of coordinates, is it possible to create a circle with a radius of say 3 miles around the centre of my location so I can then filter this area?
Thank you all for taking the time to read my post. Cheers!
Most spatial work can now be done pretty easily using the sf package.
Example code for a similar problem is below. The comments explain most of what it does.
The difficult part may be in understanding map projections (the CRS). Some use units (meters, feet, etc.) and others use latitude/longitude. Which one you choose depends on what area of the globe you're working with and what you're trying to accomplish. Most web mapping uses CRS 4326, but that does not include an easily usable distance measurement.
The map below shows points outside ~3 miles from Hereford as red, and those inside in dark maroon. The blue point is used as the center for Hereford & the buffer zone.
library(tidyverse)
library(sf)
#> Linking to GEOS 3.6.2, GDAL 2.2.3, PROJ 4.9.3
library(mapview)
set.seed(4)
#hereford approx location, ggmap requires api key
# note: the 'lat' column actually holds longitude and 'lon' holds latitude;
# they are passed to st_as_sf() in (x, y) order, so the geometry is still correct
hereford <- data.frame(place = 'hereford', lat = -2.7160, lon = 52.0564) %>%
  st_as_sf(coords = c('lat', 'lon')) %>%
  st_set_crs(4326)
#simulation of data points near-ish hereford
random_points <- data.frame(point_num = 1:20,
                            lat = runif(20, min = -2.8, max = -2.6),
                            lon = runif(20, min = 52, max = 52.1)) %>%
  st_as_sf(coords = c('lat', 'lon')) %>%
  st_set_crs(4326) %>%
  st_transform(27700)
#make a buffer of ~3 miles (4800m) around hereford
h_buffer <- hereford %>%
  st_transform(27700) %>% #change crs to one measured in meters
  st_buffer(4800)
#only points inside ~3mi buffer
points_within <- random_points[st_within( random_points, h_buffer, sparse = F), ]
head(points_within)
#> Simple feature collection with 6 features and 1 field
#> geometry type: POINT
#> dimension: XY
#> bbox: xmin: 346243.2 ymin: 239070.3 xmax: 355169.8 ymax: 243011.4
#> CRS: EPSG:27700
#> point_num geometry
#> 1 1 POINT (353293.1 241673.9)
#> 3 3 POINT (349265.8 239397)
#> 4 4 POINT (349039.5 239217.7)
#> 6 6 POINT (348846.1 243011.4)
#> 7 7 POINT (355169.8 239070.3)
#> 10 10 POINT (346243.2 239690.3)
#shown in mapview
mapview(hereford, color = 'blue') +
  mapview(random_points, color = 'red', legend = F, col.regions = 'red') +
  mapview(h_buffer, legend = F) +
  mapview(points_within, color = 'black', legend = F, col.regions = 'black')
Created on 2020-04-12 by the reprex package (v0.3.0)
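The random_points above only simulate the situation; for your real dataset the same pattern should apply. A sketch, assuming your data sits in a tibble called my_data with Longitude and Latitude columns (all three names are placeholders):
# sketch with placeholder names: convert the tibble, match the buffer's crs,
# then keep only the rows inside the ~3 mile buffer
my_data_sf <- my_data %>%
  st_as_sf(coords = c('Longitude', 'Latitude'), crs = 4326) %>%
  st_transform(27700)

my_data_within <- my_data_sf[st_within(my_data_sf, h_buffer, sparse = F), ]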
I have a polygon (zones) and a set of coordinates (points). I'd like to create a spatial kernal density raster for the entire polygon and extract the sum of the density by zone. Points outside of the polygon should be discarded.
library(raster)
library(tidyverse)
library(sf)
library(spatstat)
library(maptools)
load(url("https://www.dropbox.com/s/iv1s5butsx2v01r/example.RData?dl=1"))
# alternatively, links to gists for each object
# https://gist.github.com/ericpgreen/d80665d22dfa1c05607e75b8d2163b84
# https://gist.github.com/ericpgreen/7f4d3cee3eb5efed5486f7f713306e96
ggplot() +
  geom_sf(data = zones) +
  geom_sf(data = points) +
  theme_minimal()
I tried converting to ppp with {spatstat} and then using density(), but I'm confused by the units in the result. I believe the problem is related to the units of the map, but I'm not sure how to proceed.
Update
Here's the code to reproduce the density map I created:
zones_owin <- as.owin(as_Spatial(zones))
pts <- st_coordinates(points)
p <- ppp(pts[,1], pts[,2], window=zones_owin, unitname=c("metre","metres"))
ds <- density(p)
r <- raster(ds)
plot(r)
Units are difficult when you work directly with geographic coordinates (lon, lat). If possible you should convert to planar coordinates (which is a requirement for spatstat) and proceed from there. The planar coordinates would typically be in units of meters, but I guess it depends on the specific projection and underlying ellipsoid etc. You can see this answer for how to project to planar coordinates with sf and export to spatstat format using maptools. Note: You have to manually choose a sensible projection (you can use http://epsg.io to find one) and you have to project both the polygon and the points.
Once everything is in spatstat format you can use density.ppp to do kernel smoothing. The resulting grid values (object of class im) are intensities of points, i.e., number of points per square unit (e.g. square meter). If you want to aggregate over some region you can use integral.im(..., domain = ...) to get the expected number of points in this region for a point process model with the given intensity.
I'm not sure if this answers all of your question, but should be a good start. Clarify in a comment or in your question should you need a different type of output.
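To make that concrete, here is a rough sketch of the projected workflow, using the packages the question already loads. The CRS (EPSG:32736, UTM zone 36S, which covers the area of the example data) and the bandwidth sigma are assumptions you would need to adjust:
# sketch: project, convert to spatstat, smooth, then integrate per zone
zones_m <- st_transform(zones, 32736)
points_m <- st_transform(points, 32736)

zones_owin <- as.owin(as_Spatial(zones_m))
pts <- st_coordinates(points_m)
p <- ppp(pts[, 1], pts[, 2], window = zones_owin, unitname = c("metre", "metres"))

# intensity surface: points per square metre
ds <- density(p, sigma = 500)

# expected number of points per zone = integral of the intensity over the zone
sapply(seq_len(nrow(zones_m)), function(i) {
  integral.im(ds, domain = as.owin(as_Spatial(zones_m[i, ])))
})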
This approach removes all points that are not inside one of the 'zone' polygons, counts them by zone, and plots the zones colored by the number of points that fall within each.
library(raster)
library(tidyverse)
library(sf)
#> Linking to GEOS 3.6.2, GDAL 2.2.3, PROJ 4.9.3
library(spatstat)
library(maptools)
#> Checking rgeos availability: TRUE
load(url("https://www.dropbox.com/s/iv1s5butsx2v01r/example.RData?dl=1"))
# alternatively, links to gists for each object
# https://gist.github.com/ericpgreen/d80665d22dfa1c05607e75b8d2163b84
# https://gist.github.com/ericpgreen/7f4d3cee3eb5efed5486f7f713306e96
p1 <- ggplot() +
  geom_sf(data = zones) +
  geom_sf(data = points) +
  theme_minimal()
#Remove points outside of zones
points_inside <- st_intersection(points, zones)
#> although coordinates are longitude/latitude, st_intersection assumes that they are planar
#> Warning: attribute variables are assumed to be spatially constant throughout all
#> geometries
nrow(points)
#> [1] 308
nrow(points_inside)
#> [1] 201
p2 <- ggplot() +
  geom_sf(data = zones) +
  geom_sf(data = points_inside)
points_per_zone <- st_join(zones, points_inside) %>%
  count(LocationID.x)
#> although coordinates are longitude/latitude, st_intersects assumes that they are planar
p3 <- ggplot() +
  geom_sf(data = points_per_zone,
          aes(fill = n)) +
  scale_fill_viridis_c(option = 'C')
points_per_zone
#> Simple feature collection with 4 features and 2 fields
#> geometry type: POLYGON
#> dimension: XY
#> bbox: xmin: 34.0401 ymin: -1.076718 xmax: 34.17818 ymax: -0.9755066
#> epsg (SRID): 4326
#> proj4string: +proj=longlat +ellps=WGS84 +no_defs
#> # A tibble: 4 x 3
#> LocationID.x n geometry
#> * <dbl> <int> <POLYGON [°]>
#> 1 10 129 ((34.08018 -0.9755066, 34.0803 -0.9757393, 34.08046 -0.975…
#> 2 20 19 ((34.05622 -0.9959458, 34.05642 -0.9960835, 34.05665 -0.99…
#> 3 30 29 ((34.12994 -1.026372, 34.12994 -1.026512, 34.12988 -1.0266…
#> 4 40 24 ((34.11962 -1.001829, 34.11956 -1.002018, 34.11966 -1.0020…
cowplot::plot_grid(p1, p2, p3, nrow = 2, ncol = 2)
It seems I underestimated the difficulty of your problem. Is something like the plot below (& underlying data) what you're looking for?
It uses raster with a ~50x50 grid and raster::focal with a 9x9 window, using the mean to interpolate the data.
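A rough sketch of that idea follows. The 50x50 grid and the 9x9 mean window come from the description above; everything else (including treating the zones/points CRS as suitable for a simple count-and-smooth) is my own illustration, not the original implementation:
# sketch: count points on a ~50x50 raster over the zones, then smooth
# with a 9x9 moving-average (mean) window
zones_sp <- as_Spatial(zones)
r <- raster(extent(zones_sp), nrows = 50, ncols = 50, crs = proj4string(zones_sp))

counts <- rasterize(st_coordinates(points), r, field = 1, fun = "count", background = 0)
smoothed <- focal(counts, w = matrix(1, 9, 9), fun = mean, pad = TRUE, padValue = 0)

plot(mask(smoothed, zones_sp))
plot(st_geometry(zones), add = TRUE)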