I am struggling with gCentroid, because it doesn't seem -- to me -- to give the 'right' answer near a pole of the Earth.
For instance:
library(rgeos)
gCentroid(SpatialPoints(coords = data.frame(longitude = c(-135, -45, 45, 135), latitude = c(80, 80, 80, 80)), proj4string = CRS('EPSG:4326')))
does not give me the North Pole, it gives:
> SpatialPoints:
> x y
> 1 0 80
> Coordinate Reference System (CRS) arguments: +proj=longlat +datum=WGS84 +no_defs
How do I get gCentroid to work on the surface of the Earth?
The GEOS library is limited to planar geometry operations; this can cause problems in edge cases, the poles being a notorious example.
For the centroid computed via GEOS to work as intended, you need to transform your coordinates from WGS84 to a coordinate reference system appropriate for polar regions; for the Arctic I suggest EPSG:3995 (WGS 84 / Arctic Polar Stereographic).
library(sp)
library(dplyr)
library(rgeos)
points_sp <- SpatialPoints(coords = data.frame(longitude = c(-135, -45, 45, 135),
                                               latitude = c(80, 80, 80, 80)),
                           proj4string = CRS('EPSG:4326'))
points_updated <- points_sp %>%
  spTransform(CRS("EPSG:3995")) # a projected CRS appropriate for Arctic regions
centroid <- gCentroid(points_updated) %>%
  spTransform(CRS("EPSG:4326")) # back to the safety of WGS84!
centroid # looks better now...
# SpatialPoints:
# x y
# 1 0 90
# Coordinate Reference System (CRS) arguments: +proj=longlat +datum=WGS84 +no_defs
Also note that your workflow, while not wrong in principle, is a bit dated: the {rgeos} package is approaching its end of life.
It may be a good time to give strong consideration to the {sf} package, which is newer, actively developed, and can handle spherical geometry operations via its interface to Google's s2 library.
For an example of an {sf}-based workflow, consider the following code; the result (centroid = North Pole) is equivalent to the sp/rgeos one.
library(sf)
points_sf <- points_sp %>% # declared earlier
  st_as_sf()
centroid_sf <- points_sf %>%
  st_union() %>% # unite features: from 4 points to 1 multipoint
  st_centroid()
centroid_sf # the North Pole in a slightly different (sf vs sp) format
# Geometry set for 1 feature
# Geometry type: POINT
# Dimension: XY
# Bounding box: xmin: 0 ymin: 90 xmax: 0 ymax: 90
# Geodetic CRS: WGS 84 (with axis order normalized for visualization)
# POINT (0 90)
I have a spatial data frame of the longitude and latitude of wildfire origins that I am trying to spatially join to a US Census TIGER/Line shapefile (places) to see if/where fire origins intersect places.
I am converting the longitude and latitude to point geometry using st_as_sf and then attempting to st_join this to the places file, but I am encountering an error because the CRSs are different. The shapefile is in a NAD83-based projection, so I am attempting to match that.
library(tidyverse)
library(sf)
> head(fires)
# Longitude Latitude FireName
#1 -106.46667 34.66000 TRIGO
#2 -81.92972 35.87111 SUNRISE
#3 -103.76944 37.52694 BRIDGER
#4 -122.97556 39.37500 BACK
#5 -121.15611 39.62778 FREY
#6 -106.38306 34.77056 BIG SPRING
#convert df to sf
fires_sf <- st_as_sf(fires, coords = c("Longitude", "Latitude"), crs = 4269, agr = "constant")
head(fires_sf$geometry)
#Geometry set for 6 features
#Geometry type: POINT
#Dimension: XY
#Bounding box: xmin: -122.9756 ymin: 34.66 xmax: -81.92972 ymax: 39.62778
#Geodetic CRS: NAD83
#POINT (-106.4667 34.66)
#POINT (-81.92972 35.87111)
#POINT (-103.7694 37.52694)
#POINT (-122.9756 39.375)
#POINT (-121.1561 39.62778)
head(places$geometry)
#Geometry set for 6 features
#Geometry type: MULTIPOLYGON
#Dimension: XY
#Bounding box: xmin: -1746916 ymin: -395761.6 xmax: -1655669 ymax: -212934.8
#Projected CRS: USA_Contiguous_Albers_Equal_Area_Conic
#First 5 geometries:
#MULTIPOLYGON (((-1657066 -233757.7, -1657192 -2...
#MULTIPOLYGON (((-1668181 -273428.5, -1669420 -2...
#MULTIPOLYGON (((-1735046 -389578.2, -1735146 -3...
#MULTIPOLYGON (((-1732841 -376703.9, -1732642 -3...
#MULTIPOLYGON (((-1693749 -377716, -1693286 -377..
joined <- st_join(places, fires_sf)
Error in st_geos_binop("intersects", x, y, sparse = sparse, prepared = prepared, :
st_crs(x) == st_crs(y) is not TRUE
To work around this, I have tried using st_transform to change the projection to longitude/latitude coordinates (since the places shapefile may be using projected coordinates such as UTM) and to set the datum to NAD83 in both spatial frames. I am getting an error for this as well.
#transform CRS projections
places_transform <- st_transform(places, "+proj=longlat +datum=NAD83")
fires_sf_transform <- st_transform(fires_sf, "+proj=longlat +datum=NAD83")
joined_new <- st_join(places_transform, fires_sf_transform)
Error in s2_geography_from_wkb(x, oriented = oriented, check = check) :
Evaluation error: Found 1045 features with invalid spherical geometry.
[3] Loop 0 is not valid: Edge 280 has duplicate vertex with edge 306
I have attempted to convert the geometry from longitude and latitude coordinates in the fires dataset to UTM coordinates to match the places shapefile, but this was also unsuccessful.
Any advice on how I can properly perform the spatial join of these points and multipolygons would be greatly appreciated.
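For what it's worth, here is a hedged sketch of one way past both errors, assuming the fires_sf and places objects above: repair the invalid rings with st_make_valid() and do the join in the projected CRS of places, so that the spherical (s2) checks are never triggered.
library(sf)
places_valid <- st_make_valid(places)                          # fix duplicate-vertex / invalid rings
fires_albers <- st_transform(fires_sf, st_crs(places_valid))   # match the CRS of the polygons
joined <- st_join(places_valid, fires_albers)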
I'm working with shapefiles of subdivisions of a number of countries, and for one country (Iceland), the X and Y coordinates appear to be swapped around in the shapefile.
The data can be downloaded here: shapefile data; IS_50V:mork_kjordaemi is the relevant dataset, select the "shape-zip" option in the download dropdown menu.
I've been using the "sf" package in R for all the shapefile work, and it has worked flawlessly with all the other shapefile data I have.
library(sf)
ic_2003 <- downloaded_data
st_crs(ic_2003) gives me
Coordinate Reference System:
User input: ISN2016
wkt:
GEOGCRS["ISN2016",
DATUM["Islands Net 2016",
ELLIPSOID["GRS 1980",6378137,298.257222101,
LENGTHUNIT["metre",1]]],
PRIMEM["Greenwich",0,
ANGLEUNIT["degree",0.0174532925199433]],
CS[ellipsoidal,2],
AXIS["geodetic latitude (Lat)",north,
ORDER[1],
ANGLEUNIT["degree",0.0174532925199433]],
AXIS["geodetic longitude (Lon)",east,
ORDER[2],
ANGLEUNIT["degree",0.0174532925199433]],
USAGE[
SCOPE["unknown"],
AREA["Iceland"],
BBOX[59.96,-30.87,69.59,-5.55]],
ID["EPSG",8086]]
head(ic_2003) gives me
Simple feature collection with 6 features and 15 fields
Geometry type: MULTIPOLYGON
Dimension: XY
Bounding box: xmin: 63.29577 ymin: -24.53268 xmax: 66.56644 ymax: -13.49462
Geodetic CRS: ISN2016
I've tried ic_2003 <- st_transform(ic_2003, 4326) but this doesn't fix the problem.
I've also tried ic_2003 <- st_transform(ic_2003, pipeline = "+proj=pipeline +step +proj=axisswap +order=2,1"), as done here, but this also does not solve the issue.
If I plot the data
ggplot(ic_2003) +
  geom_sf() +
  coord_sf()
I get the right shape, but rotated 90 degrees and in the wrong place on a world map.
Any help you could give me would be greatly appreciated.
There must be an sf way of doing this easily, but you can also use purrr::modify (which works like map) to swap all of the geometry lat/lon columns (a matrix within a list within a list within a list) without changing the sf attributes...
library(sf)
library(tidyverse)
ic_2003 <- st_read("mork_kjordaemiPolygon.shp") #from link above
ic_2003 <- ic_2003 %>%
  mutate(geometry = modify(geometry, modify, ~ list(.[[1]][, c(2, 1)])))
ggplot(ic_2003) +
  geom_sf() +
  coord_sf()
I'm trying to reduce the size of an sf object by applying st_simplify. The CRS is 4267, and I'm trying to find the right level of dTolerance. I understand that the unit of dTolerance has to be that of the CRS, so I started with 0.1, but I keep getting this error message.
test <- st_read("comm_sf.shp") %>%
  st_simplify(preserveTopology = T,
              dTolerance = 0.1)
Simple feature collection with 11321 features and 21 fields
geometry type: MULTIPOLYGON
dimension: XY
bbox: xmin: -124.4375 ymin: 24.5441 xmax: -66.94983 ymax: 49.00249
epsg (SRID): 4326
proj4string: +proj=longlat +datum=WGS84 +no_defs
Warning message:
In st_simplify.sfc(st_geometry(x), preserveTopology, dTolerance) :
st_simplify does not correctly simplify longitude/latitude data, dTolerance needs to be in decimal degrees
I played around with both dTolerance = 1000 (in case it's in meters) and dTolerance = 0.1 (in case it's in long/lat degrees), but I get the same message. This happens with CRS = 4267 as well. How can I fix this?
Well, it's a warning rather than an error. But in general you should do Douglas-Peucker simplification on a projected coordinate system, because it uses a distance as a buffer, whereas the actual size of a unit of longitude varies with latitude. Note that the unit used by the st_simplify tolerance will always be the same as the map units.
Here's a reproducible example:
library(sf)
library(maptools)
states = st_as_sf(maps::map("state", plot = FALSE, fill = TRUE))
states_simple = st_simplify(states)
##Warning message:
## In st_simplify.sfc(st_geometry(x), preserveTopology, dTolerance) :
## st_simplify does not correctly simplify longitude/latitude data, dTolerance needs to be in decimal degrees
But if we transform to a projected coordinate system first, then no warning:
states = st_transform(states, "ESRI:54032") # World Azimuthal Equidistant
states_simple = st_simplify(states)
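Since the tolerance is now interpreted in map units (metres for this projection), you can also set it explicitly; a minimal hedged example, with 1000 m chosen purely for illustration:
states_simple = st_simplify(states, preserveTopology = TRUE, dTolerance = 1000) # 1000 metres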
You can always go back to WGS84 lat-long after the simplification:
states_simple = st_transform(states_simple, 4326)
I have a polygon (zones) and a set of coordinates (points). I'd like to create a spatial kernal density raster for the entire polygon and extract the sum of the density by zone. Points outside of the polygon should be discarded.
library(raster)
library(tidyverse)
library(sf)
library(spatstat)
library(maptools)
load(url("https://www.dropbox.com/s/iv1s5butsx2v01r/example.RData?dl=1"))
# alternatively, links to gists for each object
# https://gist.github.com/ericpgreen/d80665d22dfa1c05607e75b8d2163b84
# https://gist.github.com/ericpgreen/7f4d3cee3eb5efed5486f7f713306e96
ggplot() +
  geom_sf(data = zones) +
  geom_sf(data = points) +
  theme_minimal()
I tried converting to ppp with {spatstat} and then using density(), but I'm confused by the units in the result. I believe the problem is related to the units of the map, but I'm not sure how to proceed.
Update
Here's the code to reproduce the density map I created:
zones_owin <- as.owin(as_Spatial(zones))
pts <- st_coordinates(points)
p <- ppp(pts[,1], pts[,2], window=zones_owin, unitname=c("metre","metres"))
ds <- density(p)
r <- raster(ds)
plot(r)
Units are difficult when you work directly with geographic coordinates (lon, lat). If possible you should convert to planar coordinates (which is a requirement for spatstat) and proceed from there. The planar coordinates would typically be in units of meters, but I guess it depends on the specific projection and underlying ellipsoid etc. You can see this answer for how to project to planar coordinates with sf and export to spatstat format using maptools. Note: You have to manually choose a sensible projection (you can use http://epsg.io to find one) and you have to project both the polygon and the points.
Once everything is in spatstat format you can use density.ppp to do kernel smoothing. The resulting grid values (object of class im) are intensities of points, i.e., number of points per square unit (e.g. square meter). If you want to aggregate over some region you can use integral.im(..., domain = ...) to get the expected number of points in this region for a point process model with the given intensity.
I'm not sure if this answers all of your questions, but it should be a good start; see the sketch below for one way to put the pieces together. Clarify in a comment or in your question should you need a different type of output.
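To make this concrete, here is a hedged sketch of that projected-coordinates workflow (the example data sit around longitude 34, latitude -1, so EPSG:32736 / UTM zone 36S is used as an assumed planar CRS; pick whatever projection actually fits your data):
library(sf)
library(maptools)  # for as.owin() on Spatial objects, as in the question code
library(spatstat)
zones_utm  <- st_transform(zones, 32736)    # planar coordinates, in metres
points_utm <- st_transform(points, 32736)
zones_owin <- as.owin(as_Spatial(zones_utm))             # observation window
pts        <- st_coordinates(points_utm)
p  <- ppp(pts[, 1], pts[, 2], window = zones_owin,
          unitname = c("metre", "metres"))               # points outside the window are dropped
ds <- density(p)                                         # intensity: points per square metre
# expected number of points in the first zone
integral.im(ds, domain = as.owin(as_Spatial(zones_utm[1, ])))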
The code below removes all points that are not inside one of the 'zone' polygons, counts them by zone, and plots the zones colored by the number of points that fall within each.
library(raster)
library(tidyverse)
library(sf)
#> Linking to GEOS 3.6.2, GDAL 2.2.3, PROJ 4.9.3
library(spatstat)
library(maptools)
#> Checking rgeos availability: TRUE
load(url("https://www.dropbox.com/s/iv1s5butsx2v01r/example.RData?dl=1"))
# alternatively, links to gists for each object
# https://gist.github.com/ericpgreen/d80665d22dfa1c05607e75b8d2163b84
# https://gist.github.com/ericpgreen/7f4d3cee3eb5efed5486f7f713306e96
p1 <- ggplot() +
  geom_sf(data = zones) +
  geom_sf(data = points) +
  theme_minimal()
#Remove points outside of zones
points_inside <- st_intersection(points, zones)
#> although coordinates are longitude/latitude, st_intersection assumes that they are planar
#> Warning: attribute variables are assumed to be spatially constant throughout all
#> geometries
nrow(points)
#> [1] 308
nrow(points_inside)
#> [1] 201
p2 <- ggplot() +
  geom_sf(data = zones) +
  geom_sf(data = points_inside)
points_per_zone <- st_join(zones, points_inside) %>%
  count(LocationID.x)
#> although coordinates are longitude/latitude, st_intersects assumes that they are planar
p3 <- ggplot() +
  geom_sf(data = points_per_zone,
          aes(fill = n)) +
  scale_fill_viridis_c(option = 'C')
points_per_zone
#> Simple feature collection with 4 features and 2 fields
#> geometry type: POLYGON
#> dimension: XY
#> bbox: xmin: 34.0401 ymin: -1.076718 xmax: 34.17818 ymax: -0.9755066
#> epsg (SRID): 4326
#> proj4string: +proj=longlat +ellps=WGS84 +no_defs
#> # A tibble: 4 x 3
#> LocationID.x n geometry
#> * <dbl> <int> <POLYGON [°]>
#> 1 10 129 ((34.08018 -0.9755066, 34.0803 -0.9757393, 34.08046 -0.975…
#> 2 20 19 ((34.05622 -0.9959458, 34.05642 -0.9960835, 34.05665 -0.99…
#> 3 30 29 ((34.12994 -1.026372, 34.12994 -1.026512, 34.12988 -1.0266…
#> 4 40 24 ((34.11962 -1.001829, 34.11956 -1.002018, 34.11966 -1.0020…
cowplot::plot_grid(p1, p2, p3, nrow = 2, ncol = 2)
It seems I underestimated the difficulty of your problem. Is something like the result below (and the underlying data) what you're looking for?
It uses raster with a roughly 50x50 grid and raster::focal with a 9x9 window, using the mean to interpolate the data.
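The plot itself isn't reproduced here, so here is a rough, hedged sketch of that raster/focal idea, assuming the zones and points_inside objects created above; the ~50x50 grid and the 9x9 mean window are simply the values mentioned, chosen for illustration:
library(sf)
library(raster)
# empty ~50x50 raster covering the zones
r <- raster(extent(as_Spatial(zones)), nrows = 50, ncols = 50,
            crs = st_crs(zones)$proj4string)
# number of points per cell
r_counts <- rasterize(st_coordinates(points_inside), r,
                      field = 1, fun = "count", background = 0)
# smooth with a 9x9 moving-average window
r_smooth <- focal(r_counts, w = matrix(1, 9, 9),
                  fun = mean, pad = TRUE, padValue = 0)
plot(r_smooth)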
I want to obtain the latitude and longitude from a shapefile. Until now, I only know how to read the shapefile.
library(rgdal)
centroids.mp <- readOGR(".","35DSE250GC_SIR")
But how can I extract the latitude and longitude from centroids.mp?
There are a few levels to this question.
You ask for longitude and latitude, but that may not be the coordinate system used by this object. You can get the coordinates like this
coordinates(centroids.mp)
Note that the "centroids" will be all of the coordinates if this is a SpatialPointsDataFrame, a list of all the line coordinates if this is a SpatialLinesDataFrame, and just the centroids if this is a SpatialPolygonsDataFrame.
The coordinates may be longitude and latitude, but the object may not know that. Use
proj4string(centroids.mp)
If that is "NA", then the object does not know (A). If it includes "+proj=longlat", the object does know and they are longitude/latitude (B). If it includes "+proj=" and some other name (not "longlat") then the object does know and it's not longitude/latitude (C).
If (A) you'll have to find out, or it might be obvious from the values.
If (B) you are done (though you should check assumptions first, these metadata can be incorrect).
If (C) you can (pretty reliably though you should check assumptions first) transform to longitude latitude (on datum WGS84) like this:
coordinates(spTransform(centroids.mp, CRS("+proj=longlat +datum=WGS84")))
Use coordinates(), like this:
library(maptools)
xx <- readShapePoints(system.file("shapes/baltim.shp", package="maptools")[1])
coordinates(xx)
# coords.x1 coords.x2
# 0 907.0 534.0
# 1 922.0 574.0
# 2 920.0 581.0
# 3 923.0 578.0
# 4 918.0 574.0
# [.......]
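maptools::readShapePoints() has long been deprecated; for reference, a hedged sketch of the same steps with sf, assuming maptools is still installed so that its bundled baltim.shp is available:
library(sf)
xx_sf <- st_read(system.file("shapes/baltim.shp", package = "maptools"))
st_coordinates(xx_sf) # matrix of X/Y coordinates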
st_coordinates solves the problem; however, it removes the covariates linked to the coordinates from the sf object. Here I share an alternative in case you need them:
# useful enough
sites_sf %>%
  st_coordinates()
#>           X        Y
#> 1 -80.14401 26.47901
#> 2 -80.10900 26.83000
# alternative to keep covariates within a tibble/sf
sites_sf %>%
  st_coordinates_tidy()
#> Joining, by = "rowname"
#> Simple feature collection with 2 features and 3 fields
#> geometry type: POINT
#> dimension: XY
#> bbox: xmin: -80.14401 ymin: 26.479 xmax: -80.109 ymax: 26.83
#> epsg (SRID): 4326
#> proj4string: +proj=longlat +datum=WGS84 +no_defs
#> # A tibble: 2 x 4
#> gpx_point X Y geometry
#> <chr> <dbl> <dbl> <POINT [°]>
#> 1 a -80.1 26.5 (-80.14401 26.47901)
#> 2 b -80.1 26.8 (-80.109 26.83)
complete reprex: https://avallecam.github.io/avallecam/reference/st_coordinates_tidy.html
source code: https://github.com/avallecam/avallecam/blob/master/R/spatially_useful.R
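If you would rather not depend on that helper package, here is a minimal sketch of the same idea with plain sf and dplyr (assuming sites_sf is a POINT sf object as in the output above):
library(sf)
library(dplyr)
sites_sf %>%
  mutate(X = st_coordinates(.)[, "X"],
         Y = st_coordinates(.)[, "Y"]) # keeps the covariates and the geometry column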