Combining geographic layers with different projections in R

EDIT: I have reworded the title question slightly, and adjusted the text to respond to the comment by @DWin.
Combining geographic layers that are projected with layers that are not can be challenging. Since layers often come from different products and publishers, some transformation is usually necessary.
I am aware that R has several tools to perform geographic transformations. For example:
For objects of class Spatial* in the sp package, the spTransform() function in the rgdal package can be used; and,
For objects of class Raster* in the raster package, the projectRaster() function can be used.
Here is a specific task that I would like to accomplish in R: Transform to UTM grid Zone 15N (Datum: NAD83) a polygons layer describing lakes in a UTM grid Zone 15N (Datum: NAD27) projection (this is in an ESRI shapefile format).

The useful thing here is the epsg database included in rgdal.
epsgs = make_EPSG()
subset(epsgs, grepl("15N", epsgs$note))
[etc]
      code
2703 26715 # NAD27 / UTM zone 15N [etc]
2851 26915 # NAD83 / UTM zone 15N [etc]
[etc]
Those codes are what you need in spTransform. If your lakes are in a shapefile with that NAD27 projection, then:
require(maptools)
lakes = readShapeSpatial("lakes.shp")
proj4string(lakes)=CRS("+init=epsg:26715")
should give you the lakes as supplied (note that I don't think readShapeSpatial reads the .prj file supplied with a shapefile set, so I've set the CRS explicitly here)
Now to convert to NAD83 datum version of UTM zone 15N:
lakes83 = spTransform(lakes,CRS("+init=epsg:26915"))
Rasters are a bit trickier since they generally involve a warp so that you end up with a regular grid in your projected coordinate system - you can't just transform the coordinates of the corners...
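To see why the datum change is more than relabelling the layer, here is a rough pure-Python illustration (not part of the original answer): interpreting the same latitude/longitude on NAD27's Clarke 1866 ellipsoid versus NAD83's GRS80 ellipsoid already places the point a couple of hundred metres apart in Earth-centred coordinates. The real NAD27 to NAD83 shift also involves a datum origin offset and, strictly, grid-based corrections, all of which spTransform handles for you.

```python
import math

def geodetic_to_ecef(lat_deg, lon_deg, a, f, h=0.0):
    """Convert geodetic lat/lon (degrees) on an ellipsoid (a, f) to ECEF metres."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    e2 = f * (2 - f)                                   # first eccentricity squared
    n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)     # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - e2) + h) * math.sin(lat)
    return x, y, z

# Clarke 1866 (NAD27) vs GRS80 (NAD83) ellipsoid parameters
CLARKE_1866 = (6378206.4, 1 / 294.9786982)
GRS80 = (6378137.0, 1 / 298.257222101)

# The same nominal lat/lon (a point inside UTM zone 15N) on each ellipsoid
p27 = geodetic_to_ecef(45.0, -93.0, *CLARKE_1866)
p83 = geodetic_to_ecef(45.0, -93.0, *GRS80)
shift = math.dist(p27, p83)
print(round(shift, 1), "m")  # a few hundred metres from the ellipsoids alone
```

This is only a back-of-the-envelope demonstration of scale, not a substitute for a proper datum transformation.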

Related

How can I edit the values of coordinates within a geojson file?

I am trying to map a geojson file (A map of Alaska precincts, downloaded from the division of elections and then converted online to geojson) onto a choropleth map using folium, the problem is the coordinates are in 7-digit numbers like this:
[ -16624764.227, 8465801.1497 ]
I read on a similar post that this was most likely a US coordinate system like UTM or State Plane, and it was recommended to use an API to reproject it. Is it also possible to access the coordinates directly, such as with geopandas, and divide them by 100000?
The data is most likely in a specific cartographic projection. You don't just want to divide by 100k - the data will likely have nonlinear transformations which have a different effect on the position depending on the location. See the GeoPandas docs on working with projections.
If the CRS of the data is correctly encoded, you can re-project the dataframe into lat/lons (e.g. WGS84, which has the EPSG Code 4326) using geopandas.GeoDataFrame.to_crs, e.g.:
df_latlon = df.to_crs("epsg:4326")
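Coordinates of this magnitude are often Web Mercator (EPSG:3857); if that guess is right for this file, a quick pure-Python sanity check with the inverse spherical Mercator formula should land in Alaska. This is only a diagnostic for eyeballing the data; the to_crs call above is the proper fix.

```python
import math

R = 6378137.0  # Web Mercator sphere radius in metres

def webmercator_to_lonlat(x, y):
    """Inverse of the spherical (Web) Mercator projection, EPSG:3857."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat

# The coordinate pair from the question
lon, lat = webmercator_to_lonlat(-16624764.227, 8465801.1497)
print(round(lon, 4), round(lat, 4))  # roughly (-149.3, 60.3), i.e. plausibly Alaska
```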

Preserving geometry types when using st_intersection from sf package in R

I am trying to get accustomed to using the sf package in R.
Currently, I am trying to clip a shapefile of all zip codes in the US to the border of the state of Utah. I have downloaded a shapefile for the Utah border and a shapefile of all zip code tabulation areas in the US and matched their projection information.
I found that creating a clip from two data files using st_intersects() followed by st_intersection() works well. st_intersects() saves only the zip codes that intersect with the border of Utah, then st_intersection() should clip the zip codes that touch or intersect with the Utah border to the exact Utah border.
When I run the st_intersects() command on the zip code file, the result preserves the multipolygon geometries of all of the zip codes. When I run st_intersection(), it creates a new sf object that has a mix of geometries - points, polygons, and multipolygons. I want the resulting sf file to only have multipolygons. I was able to remove the point geometries using this code:
zips_utah <- zips_utah %>%
filter(st_geometry_type(.) != "POINT")
However, I still have an sf object with a mix of multipolygons and polygons. When I try to cast the polygons to multipolygons using st_cast(), I get this error:
only first part of geometrycollection is retained
Error in st_cast.POINT(x[[1]], to, ...) :
  cannot create MULTIPOLYGON from POINT
It seems that points still exist in the geometries of my zip code shapefile, perhaps under the guise of "GEOMETRYCOLLECTION"? When I check the geometry type using unique(st_geometry_type(zips_utah)), this is the result I get:
[1] MULTIPOLYGON POLYGON
[3] GEOMETRYCOLLECTION
18 Levels: GEOMETRY POINT LINESTRING ... TRIANGLE
My understanding is that I cannot save the resulting clipped zip code sf object to a shapefile unless I first convert all the geometries to the same type. When I try to save the clipped zip code file using st_write(), I get the following error:
Error 1: Attempt to write non-polygon (POINT) geometry to POLYGON type shapefile.
I am wondering:
How do I preserve geometry types when running st_intersection?
Secondarily, does anyone have good resources on how the sf package handles geometries and mixed geometry types? What is the rationale for this behaviour?
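In sf itself, the usual fix is st_collection_extract(zips_utah, "POLYGON") followed by st_cast(zips_utah, "MULTIPOLYGON"). The logic those two calls implement can be sketched language-agnostically; here is a toy Python version, where the geometry records are invented (type, parts) pairs rather than real sf objects:

```python
# Toy stand-ins for simple-feature geometries: (type, parts) pairs. This mirrors
# the effect of st_collection_extract(x, "POLYGON") + st_cast(x, "MULTIPOLYGON").

def extract_polygons(geom):
    """Recursively pull POLYGON parts out of any geometry, dropping the rest."""
    gtype, parts = geom
    if gtype == "POLYGON":
        return [geom]
    if gtype in ("MULTIPOLYGON", "GEOMETRYCOLLECTION"):
        out = []
        for part in parts:
            out.extend(extract_polygons(part))
        return out
    return []  # POINT, LINESTRING, ... contribute nothing polygonal

def to_multipolygon(geom):
    """Collect every polygonal part into one MULTIPOLYGON."""
    return ("MULTIPOLYGON", extract_polygons(geom))

# A mixed result, like st_intersection() can produce along shared borders
mixed = ("GEOMETRYCOLLECTION", [
    ("POINT", ["p1"]),
    ("POLYGON", ["ring_a"]),
    ("MULTIPOLYGON", [("POLYGON", ["ring_b"]), ("POLYGON", ["ring_c"])]),
])
flat = to_multipolygon(mixed)
print(flat[0], len(flat[1]))  # MULTIPOLYGON with 3 polygon parts
```

The point is that collections are flattened recursively and non-polygonal slivers are simply dropped, which is exactly what a shapefile writer needs before st_write() will accept the layer.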

proj4string doesn't seem to completely define ESRI projection information

I have a shapefile that I imported into R using readOGR from the rgdal package. I do a little bit of work with it, like adding attribute information, etc, then export it as an ESRI shapefile again, with a new name. However, when I bring both the original and new shapefile into ArcGIS, it tells me that the CRS does not match.
So, noting that all the projection parameters remain the same, but the projection and coordinate system names are different, and the datum name is dropped, my questions are:
Is the second CRS the same as the first?
If so, why did the names change, and why does ArcGIS no longer recognize it as the same?
If not, how did it get changed?
Can the proj4string be modified to be more specific, and if so, why did readOGR not already do this to preserve all the information?
I can use the new shapefiles just fine, but it would be nice to know that
the CRS is identical to the original. And, I could of course define it again in ArcGIS, but part of the motivation to work in R
is to obviate the need to point and click for many files.
I appreciate any insights or enlightenment.
Here is the original projection information from ArcGIS:
Projected Coordinate System: NAD_1983_HARN_Transverse_Mercator
Projection: Transverse_Mercator
False_Easting: 520000.00000000
False_Northing: -4480000.00000000
Central_Meridian: -90.00000000
Scale_Factor: 0.99960000
Latitude_Of_Origin: 0.00000000
Linear Unit: Meter
Geographic Coordinate System: GCS_North_American_1983_HARN
Datum: D_North_American_1983_HARN
Prime Meridian: Greenwich
Angular Unit: Degree
Here is the proj4string from R, which also agrees with the proj4string given for this projection at www.spatialreference.org for epsg:3071 and also for SR-ORG:7396.
+proj=tmerc +lat_0=0 +lon_0=-90 +k=0.9996 +x_0=520000 +y_0=-4480000 +ellps=GRS80 +units=m +no_defs
When I use writeOGR to export the SpatialPolygonsDataFrame with the above proj4string, then bring it back into ArcGIS, the
projection information is given as the following, and is no longer recognized as the original.
Projected Coordinate System: Transverse_Mercator
Projection: Transverse_Mercator
false_easting: 520000.00000000
false_northing: -4480000.00000000
central_meridian: -90.00000000
scale_factor: 0.99960000
latitude_of_origin: 0.00000000
Linear Unit: Meter
Geographic Coordinate System: GCS_GRS 1980(IUGG, 1980)
Datum: D_unknown
Prime Meridian: Greenwich
Angular Unit: Degree
Perhaps not a definitive answer, but I posted this question on the R-sig-Geo list serve, and got a few possible work-around solutions. For now, I simply used an R script to overwrite the .prj file with a copy of the original, and that seems to work fine. Also suggested was the use of a package called arcgisbinding to bridge ArcGIS and R (and maybe a similar solution would be available for QGIS?). I have not verified the arcgisbinding solution, but more information can be found at the blog post here and in the package documentation here.
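The overwrite workaround amounts to a single file copy; here is a minimal sketch of the idea, where the filenames and the truncated WKT strings are invented placeholders, not the actual projection definitions:

```python
import shutil

# Sketch of the .prj-overwrite workaround (filenames hypothetical): after
# writeOGR() writes new_layer.shp/.prj with the degraded CRS string, copy the
# original ESRI-style .prj back over it so ArcGIS sees the original definition.

original_wkt = 'PROJCS["NAD_1983_HARN_Transverse_Mercator", ...]'  # from the source shapefile
with open("original_layer.prj", "w") as f:
    f.write(original_wkt)
with open("new_layer.prj", "w") as f:
    f.write('PROJCS["Transverse_Mercator", ...]')  # what writeOGR produced

shutil.copyfile("original_layer.prj", "new_layer.prj")  # restore the original CRS text
print(open("new_layer.prj").read() == original_wkt)  # True
```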

Create buffer around spatial data in R

I have a spatial dataset of shopping centers that I would like to create buffers around in R.
I think these packages will be useful:
require(maptools)
require(geosphere)
I was able to do so for a set of coordinates, but not for spatial data. The code looks like this:
coordinates(locs) <- c("Longitude", "Latitude") # set spatial coordinates
fivekm <- cbind(coordinates(locs), X = rowSums(distm(coordinates(locs)[, 1:2], fun = distHaversine) / 1000 <= 5)) # number of points within 5 km
But I don't know what function/package to use for a set of polygons. Can someone please advise on the function (or code) and I will go from there?
Thanks!
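As an aside on the snippet above: distm(..., fun = distHaversine) computes great-circle distances with the haversine formula on a sphere of radius 6378137 m. A pure-Python sketch of the same "points within 5 km" count, with invented coordinates:

```python
import math

def haversine_km(lon1, lat1, lon2, lat2):
    """Great-circle distance in km between two lon/lat points (degrees),
    using the same sphere radius (6378137 m) as geosphere::distHaversine."""
    r = 6378.137
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# For each point, count how many other points fall within 5 km; this is the
# same quantity the rowSums(distm(...) / 1000 <= 5) line computes.
pts = [(-93.27, 44.98), (-93.25, 44.97), (-93.10, 45.10)]
within_5km = [
    sum(1 for q in pts if q is not p and haversine_km(*p, *q) <= 5) for p in pts
]
print(within_5km)  # -> [1, 1, 0]: the first two points are ~1.9 km apart
```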
In library rgeos, there is the gBuffer function that works with SpatialPoints or SpatialPolygons.
The width parameter lets you set the buffer distance. Be careful, though: this distance is in the units of the coordinate system used, so in degrees rather than metres for unprojected data. As suggested by @Ege Rubak, you will have to project your data with spTransform first (be sure to use the appropriate CRS for your location).
As for now, rgeos library works with library sp, but not (yet?) with the recent sf.
I think the only option at the moment is to project your longitude and latitude points to a flat map and then do everything there. As far as I know there are no packages for doing polygonal geometry on the sphere yet (I'm working on one, but there is no ETA).
Projection used to be done with spTransform from the sp package, but now it may be more convenient to use the more modern simple features package sf which has the function st_transform. The vignette https://cran.r-project.org/web/packages/sf/vignettes/sf1.html has a section called "Coordinate reference systems and transformations" to help you with this part. The buffering is described in the section "Geometrical operations".
The two previous posts have covered the details, but I thought it might be helpful to provide a workflow. This assumes you are using points of lat and long. What is your original spatial data format?
1. Convert your coordinates into a SpatialPointsDataFrame and assign it a geographic CRS (proj4) that matches your coordinate data (probably WGS84).
2. Change the projection to a local projected CRS with your preferred units.
3. Apply the buffer to the SpatialPointsDataFrame; the width will now be in more usable units.
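The "degrees, not metres" caveat behind steps 1 and 2 can be quantified: the ground length of one degree of longitude shrinks with latitude, so no single degree-valued width gives a constant-distance buffer. A small pure-Python illustration:

```python
import math

# Ground length of one degree of longitude at a given latitude, on a sphere
# of radius 6378137 m. At 60 degrees latitude it is exactly half its
# equatorial value, so a fixed width in degrees cannot represent "5 km".
def metres_per_degree_lon(lat_deg, r=6378137.0):
    return math.radians(1) * r * math.cos(math.radians(lat_deg))

for lat in (0, 30, 45, 60):
    print(lat, round(metres_per_degree_lon(lat)))  # ~111 km shrinking to ~56 km
```

This is why projecting to a local CRS first, then buffering in metres, is the robust order of operations.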

Converting Ordnance Survey coordinates into valid ESRI coords

I've been searching extensively for a way of converting from Ordnance Survey coords to valid ESRI coordinates. I've found quite a few pages that convert to lat long (if a little off) but nothing to convert to ESRI (which I believe is UTM).
This is for use in python or JavaScript / actionscript etc - I'm not too worried about syntax more an understanding of the maths involved.
Thanks
Ian
This type of conversion is called a "geodetic transformation". OS and UTM are both transverse Mercator projections, wherein the ellipsoid of the earth is unwrapped onto a cylinder, which is then unrolled into a flat sheet and sub-divided into grid sections. OS coordinates are specific to regions (e.g. OSGB for Great Britain), whereas UTM is a "universal" system that specifies a set of grids for the whole earth. Regional grids are used to reduce the distortion introduced by the Mercator projection. It follows that converting between such systems is possible, but it can also be quite complex depending on the accuracy desired.
It seems there are only indirect methods, as you have already referred to, the most common being to convert from OSGB36 to WGS84 (lat/long) and then to UTM.
Here are some resources which might be helpful:
Convert WGS84 lat/long to UTM: http://www.uwgb.edu/dutchs/usefuldata/utmformulas.htm. Note the inclusion of specific parameters for each region. For example, if you were converting coordinates for Britain, the parameters for "Airy 1830" would be used. (also links to a spreadsheet and webpage with conversions).
Similar information as above on Wikipedia.
JavaScript to convert OSGB36 to WGS84 (7 metre accuracy): http://www.nearby.org.uk/tests/GeoTools.html
A more accurate JavaScript conversion using a Helmert transformation (5 metre accuracy): http://www.movable-type.co.uk/scripts/latlong-convert-coords.html and http://www.movable-type.co.uk/scripts/latlong-gridref.html
Comprehensive coverage of the OSGB36 coordinate system, including transformations to and from other coordinate systems: http://www.ordnancesurvey.co.uk/oswebsite/gps/docs/A_Guide_to_Coordinate_Systems_in_Great_Britain.pdf
Miscellaneous links and resources: http://www.ordnancesurvey.co.uk/oswebsite/gps/information/resourceinfolinks/gpslinks.html
As for accuracy, it is summed up in this excerpt from ordnancesurvey.co.uk:
... OSGB36 contains randomly variable scale errors, mainly due to it being computed in blocks and the fact that scale and azimuth were controlled entirely by the 11 stations from the Principal Triangulation. These scale variations mean that OSGB36 can be described as inhomogeneous ... The inhomogeneity of OSGB36 does not affect its adequacy as a mapping datum but it does make a simple transformation between ETRS89 and OSGB36 too inaccurate for national use. For example, the accuracy of a national 7 parameter (3 shifts, 3 rotations and a scale change) transformation is approximately 5 metres.
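The 7-parameter Helmert transformation that excerpt refers to is short enough to sketch in pure Python. The parameters below are the published OS values for WGS84 to OSGB36 with their signs reversed; note that the transformation operates on geocentric (ECEF) coordinates, so a complete OSGB36-to-UTM pipeline would also need geodetic/ECEF conversions and a transverse Mercator projection at each end.

```python
import math

# Published OS parameters for WGS84 -> OSGB36, signs reversed for OSGB36 -> WGS84:
TX, TY, TZ = 446.448, -125.157, 542.060   # translations, metres
S = -20.4894e-6                           # scale change (ppm -> ratio)
RX, RY, RZ = (math.radians(sec / 3600) for sec in (0.1502, 0.2470, 0.8421))

def helmert_osgb36_to_wgs84(x, y, z):
    """Seven-parameter (small-angle) Helmert transformation on ECEF metres,
    in the form used by the OS guide: X' = tX + (1+s)X - rZ*Y + rY*Z, etc."""
    x2 = TX + (1 + S) * x - RZ * y + RY * z
    y2 = TY + RZ * x + (1 + S) * y - RX * z
    z2 = TZ - RY * x + RX * y + (1 + S) * z
    return x2, y2, z2

# Sanity check on rough ECEF coordinates for a point near Britain (about 52N, 1W)
x, y, z = 3926000.0, -68500.0, 5002000.0
x2, y2, z2 = helmert_osgb36_to_wgs84(x, y, z)
shift = math.dist((x, y, z), (x2, y2, z2))
print([round(v, 1) for v in (x2, y2, z2)], round(shift, 1))
```

The geocentric shift is several hundred metres, most of which cancels against the Airy-versus-GRS80 ellipsoid change when converting back to latitude and longitude, leaving the ~100 m ground shift familiar from OSGB36/WGS84 comparisons.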
Here is a link to more comprehensive information regarding the ARC/INFO file format.
Quick google search: http://google-maps-utility-library-v3.googlecode.com/svn/trunk/arcgislink/docs/examples.html
