I have a shapefile, specifically a SpatialPolygonsDataFrame called cdtract, that contains one variable for each district called varcount. varcount is either NA or 1. When I do
plot(cdtract)
...I see the map printed, but the tracts that are 1 in varcount are not marked any differently from those that are NA. Ideally I want a spectrum of values in varcount reflected as different colors. I considered using ggplot, but according to this post here that is very resource intensive and it might be better to use plot() instead of ggplot() to create the choropleth. However, I'm really not sure how to go about it. I am also using a shapefile other than states/countries, so I'm not sure the choroplethr package is the way to go.
Can someone explain how to take a SpatialPolygonsDataFrame and turn it into a choropleth, efficiently? Thanks!
Shamefully taken from Ari's post on gis.stackexchange.com. See the link for pictures. There are more examples here.
library(sp)

# Two example polygons with IDs "s1" and "s2"
Srs1 <- Polygons(list(Polygon(cbind(c(2,4,4,1,2), c(2,3,5,4,2)))), "s1")
Srs2 <- Polygons(list(Polygon(cbind(c(5,4,2,5), c(2,3,2,2)))), "s2")

# Attach a data.frame with one value of z per polygon
SpDF <- SpatialPolygonsDataFrame(SpatialPolygons(list(Srs1, Srs2)),
                                 data.frame(z = 1:2, row.names = c("s1", "s2")))

# spplot colors each polygon by the chosen attribute
spplot(SpDF, zcol = "z")
I'm fairly new to the OSM world and I need to extract all water-related polygons from the OSM planet file, except for ocean polygons. I know there is a product from a university in Tokyo, but it's from 2016 and I need the data as up to date as possible.
I already extracted a good bit of it with the following code. However, comparing the resulting layers with the OSM basemap in QGIS, I noticed that some parts are missing, even though they have the same flags and relations as other parts that were extracted. I know that some parts of rivers are digitized as lines and not polygons, so it's fine that those are missing. The missing parts are definitely polygons, since I could extract one of them with the same flags through the QuickOSM plug-in in QGIS. The OSM basemap also shows clearly that those areas must be polygons (see screenshot).
Is there a mistake in my code, or did I make a mistake with the flags? My code throws no errors, and everything seems to work except for the missing parts.
Thanks in advance!
Here is the code so far:
library(gdalUtils)
library(rgdal)
library(sf)

# extract all features tagged "natural = water"
path_pbf <- "path/to/planet_file.osm.pbf"
ogr2ogr(src_datasource_name = path_pbf,
        "OSM_Waterbodies.gpkg",
        f = "GPKG",
        sql = "SELECT * FROM multipolygons WHERE natural = 'water'",
        progress = TRUE)

# extract all features whose other_tags contain "waterway"
ogr2ogr(src_datasource_name = path_pbf,
        "OSM_Waterways.gpkg",
        f = "GPKG",
        sql = "SELECT * FROM multipolygons WHERE other_tags LIKE '%waterway%'",
        progress = TRUE)

waterways <- st_read("OSM_Waterways.gpkg")
waterways$rm <- NA

# keep only certain polygon types, since "waterway" also includes dams etc.
# (grepl() matches substrings, so glob-style wildcards are not needed here)
check <- "riverbank|river|stream|tidal channel|canal|drain|ditch|lake"

# mark polygons that are not part of the desired selection with a "Remove" flag
for (i in 1:nrow(waterways)) {
  if (!grepl(check, waterways$other_tags[i])) {
    waterways$rm[i] <- "Remove"
  }
}

# drop rows with the "Remove" flag
index <- which(waterways$rm == "Remove")
waterways <- waterways[-index, ]
st_write(waterways, "OSM_Waterways_clean.gpkg", driver = "GPKG")
P.S.: The code is probably not the most efficient, but efficiency doesn't matter here, since I will probably only run it once or twice.
It looks like you're only extracting multipolygons, which are used in OSM when a shape isn't a simple polygon. This means that river sections with islands in them will be extracted, but many simple river sections will not, as they are just mapped as closed ways (an example from your screenshot). I don't have the OSM planet file on hand to check, but I would imagine it's as simple as running the ogr2ogr calls again with ways instead of multipolygons in the SQL, and then checking that the ways are closed (most likely by checking that the first and last nodes are identical, since a quick search suggests ogr2ogr doesn't provide an explicit way to test for closed ways).
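A minimal sketch of that idea, assuming the missing water features show up in the lines layer exposed by the GDAL OSM driver (the path and tag filter mirror the question and will likely need adjusting; on a planet-sized file it may be faster to ogr2ogr the lines layer into a GeoPackage first, as above):

library(sf)

# Pull water-tagged ways from the "lines" layer of the OSM driver
path_pbf <- "path/to/planet_file.osm.pbf"
ways <- st_read(path_pbf,
                query = "SELECT * FROM lines WHERE other_tags LIKE '%waterway%'")

# A way is closed if its first and last vertices coincide
is_closed <- vapply(st_geometry(ways), function(g) {
  xy <- st_coordinates(g)
  all(xy[1, c("X", "Y")] == xy[nrow(xy), c("X", "Y")])
}, logical(1))

# Turn the closed ways into polygons and save them alongside the earlier extract
water_from_ways <- st_cast(ways[is_closed, ], "POLYGON")
st_write(water_from_ways, "OSM_Waterways_from_ways.gpkg", driver = "GPKG")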
Novice R programmer here... Looking for guidance on building a tess out of the polygons in a SpatialPolygonsDataFrame.
I am invoking quadratcount on points within a state boundary. Rather than using the default grid, I would like to use custom polygons to define the quadrats. Specifically, the county polygons which I have in shapefile format.
I take it from the documentation that the desired tessellation can be created out of a list of 'owin' objects. Where I'm getting jammed up is using my SpatialPolygonsDataFrame to generate that list.
I have confirmed that the polygons are read in correctly:
library(rgdal)

counties <- readOGR('/path/to/counties.shp', layer = "CountyBoundaries",
                    GDAL1_integer64_policy = FALSE)

# plot each county polygon individually
for (i in 1:nrow(counties)) {
  plot(counties[i, ])
}
Which generates a series of plots, one per county. That is, of course, only useful for confirming that my data isn't broken and that I can iterate over the polygons. What I think I need to do is make an owin out of each polygon in the SpatialPolygonsDataFrame and append it to myList for tess(tiles=myList). I'm not having much success with that approach.
My strong suspicion is that there's an easier way...
Many Thanks,
--gt
Turns out my problem was in not fully understanding how lists are indexed in R. The following bit of code gives me the result I want.
I have no doubt that there is a better, vectorized way to do it. But in the meantime:
library(rgdal)
library(spatstat)
library(maptools)  # provides the as(x, "owin") coercion for sp objects

# The point events are in a PPP: StateCrimes_ppp
counties <- readOGR('/path/to/counties.shp', layer = "CountyBoundaries",
                    GDAL1_integer64_policy = FALSE)

# build one owin tile per county polygon
tlist <- list()
for (i in 1:nrow(counties)) {
  tlist[[i]] <- as(counties[i, ], 'owin')
}

QuadCount <- quadratcount(
  StateCrimes_ppp,
  tess = tess(tiles = tlist)
)

plot(QuadCount, main = NULL)
plot(intensity(QuadCount, image = TRUE), main = NULL, las = 1)
If anybody sees how I've taken the long and hard way to solve a simple problem, I'd love to know a better, simpler, more elegant, or more R-like way to do this.
Thanks again,
--gt
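One way to make it a bit more R-like: the per-county loop can be collapsed into a single lapply() call, a sketch under the same assumptions as above (rgdal, spatstat and maptools loaded, StateCrimes_ppp already built):

# Build the list of owin tiles in one step instead of a for loop
tlist <- lapply(seq_len(nrow(counties)), function(i) as(counties[i, ], "owin"))
QuadCount <- quadratcount(StateCrimes_ppp, tess = tess(tiles = tlist))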
Is there an easy way to convert a Spatial Lines into a Spatial Polygon object within R?
Reproducible Example
I have put together a reproducible dataset here, downloaded from OpenStreetMap through the overpass package. It extracts the locations of a few airports in southern England:
devtools::install_github("hrbrmstr/overpass")
library(overpass)
library(raster)
library(sp)
# Write Query
query_airport <- '
(node["aeroway"="aerodrome"](50.8, -1.6,51.1, -1.1);
way["aeroway"="aerodrome"](50.8, -1.6,51.1, -1.1);
relation["aeroway"="aerodrome"](50.8, -1.6,51.1, -1.1);
);
out body;
>;
out skel qt;
'
# Run query
shp_airports <- overpass::overpass_query(query_airport, quiet = TRUE)
crs(shp_airports) <- CRS("+init=epsg:4326") # assign the coordinate reference system (WGS84)
shp_airports <- shp_airports[,1]
# Plot Results
plot(shp_airports, axes = T)
However, the data is of the class "SpatialLinesDataFrame". This really messes things up if you want to do any form of spatial joins or intersections, as it only acknowledges the edge of the region.
Potential Leads
I was exploring the use of SpatialLines2PolySet within the maptools package, but in my time exploring I produced nothing but errors, so I didn't think it was worth including those attempts in the question. There is some guidance on these functions here: https://rdrr.io/rforge/maptools/man/SpatialLines2PolySet.html
Notes
I have searched the web and SO for similar questions and struggled to find any that directly address this. A lot reference converting SpatialPoints -> SpatialLinesDataFrames, but not SpatialLinesDataFrames -> SpatialPolygonsDataFrames. This question is similar but lacks any answers (or a reproducible dataset): Close a spatial line into a polygon using a shapefile
In addition, it seems strange that this would be difficult as it is something which can be done so easily in ArcGIS using the "Feature to Polygon" tool. This function requires no additional arguments specified and it works perfectly.
One way to solve the problem is to use the sf library. After your query:
library(sp)
library(raster)
library(sf)

sf_airports <- st_as_sf(shp_airports)                 # sp -> sf
sf_airports_polygons <- st_polygonize(sf_airports)    # close the lines into polygons
shp_airports <- as(sf_airports_polygons, "Spatial")   # back to sp, if you want sp
class(shp_airports)
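One caveat: depending on the sf version, st_polygonize() returns GEOMETRYCOLLECTION geometries, which the as(..., "Spatial") coercion may refuse. If that happens, extracting the polygon parts first should help (a sketch):

# Pull the POLYGON parts out of the GEOMETRYCOLLECTIONs before converting to sp
sf_airports_polygons <- st_collection_extract(sf_airports_polygons, "POLYGON")
shp_airports <- as(sf_airports_polygons, "Spatial")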
Below is a JavaScript page I created that allows me to add and freely move markers on the map. From this map I can figure out the regions I am interested in.
Basically, what I want to do is show the same map using ggplot2/marmap with coastline indicators plus bathymetry data. I am really just interested in getting bathymetry data per GPS location, i.e. negative/positive elevation per lat/long, so I figured that if I can plot it I should be able to export the data to a database. I am also interested in coastline data: I want to know how close a given lat/long is to the coastline, so I was going to store that in the database as well.
Here is the R script that I am using:
library(marmap);
library(ggplot2);
a_lon1 = -79.89836596313478;
a_lon2 = -79.97179329675288;
a_lat1 = 32.76506070891712;
a_lat2 = 32.803624214389615;
dat <- getNOAA.bathy(a_lon1,a_lon2,a_lat1,a_lat2, keep=FALSE);
autoplot(dat, geom=c("r", "c"), colour="white", size=0.1) + scale_fill_etopo();
Here is the output of the above R script:
Questions:
Why do both images not match?
In google-maps I am using zoom value 13. How does that translate in ggplot2/MarMap?
Is it possible to zoom in ggplot2/MarMap into a (Lat/Long)-(Lat/Long) region?
Is it possible to plot what I am asking for?
I don't know how you got this result. When I use your script, I get an error since the area you are trying to fetch from the ETOPO1 database with getNOAA.bathy() is too small. However, after adding resolution=1 (the highest possible resolution for the ETOPO1 database), here is what I get:
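For reference, the adjusted call looks like this (same bounding box as in the question, with resolution = 1 added):

library(marmap)
library(ggplot2)

# Same coordinates as the question, but requesting the 1-minute ETOPO1 resolution
dat <- getNOAA.bathy(-79.89836596313478, -79.97179329675288,
                     32.76506070891712, 32.803624214389615,
                     resolution = 1, keep = FALSE)
autoplot(dat, geom = c("r", "c"), colour = "white", size = 0.1) + scale_fill_etopo()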
To answer your questions:
Why do both images not match?
Probably because getNOAA.bathy() returned an error, so the dat object you're using was created earlier with another set of coordinates.
In google-maps I am using zoom value 13. How does that translate in ggplot2/MarMap?
I have no clue!
Is it possible to zoom in ggplot2/MarMap into a (Lat/Long)-(Lat/Long) region?
I urge you to take a look at section 4 of the marmap-DataAnalysis vignette, which is dedicated to working with big files. There you will find that you can zoom in on any area of a bathy object using (for instance) the subsetBathy() function, which lets you click on a map to define the desired area.
Is it possible to plot what I am asking for? Yes, but it would be much easier to use base graphics and not ggplot2. Once again, you should read the package vignettes.
Finally, regarding the coastline data, you can use the dist2isobath() function to compute the distance between any GPS point and any isobath, including the coastline. Guess where you can learn more about this function and how to use it...
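For example, something along these lines (a sketch; the two GPS positions are hypothetical, and dat is the bathy object fetched above with resolution = 1):

library(marmap)

# Hypothetical GPS positions inside the bounding box used above
pts <- data.frame(lon = c(-79.95, -79.91), lat = c(32.78, 32.80))

# Depth/elevation of each position, read off the bathy grid
get.depth(dat, x = pts$lon, y = pts$lat, locator = FALSE)

# Distance (in metres) from each position to the coastline, i.e. the 0 m isobath
dist2isobath(dat, x = pts$lon, y = pts$lat, isobath = 0, locator = FALSE)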
There are clearly a number of packages in R for all sorts of spatial analysis, as can be seen in the CRAN Task View: Analysis of Spatial Data. These packages are numerous and diverse, but all I want to do is make some simple thematic maps. I have data with county and state FIPS codes, and I have ESRI shapefiles of county and state boundaries with the accompanying FIPS codes, which allows joining with the data. The shapefiles could easily be converted to other formats if needed.
So what's the most straight forward way to create thematic maps with R?
This map looks like it was created with an ESRI Arc product, but this is the type of thing I would like to do with R:
(Example choropleth map: http://www.infousagov.com/images/choro.jpg)
The following code has served me well. Customize it a little and you are done.
(Example output image: eduardoleoni.com)
library(maptools)

## substitute your shapefiles here
state.map <- readShapeSpatial("BRASIL.shp")
counties.map <- readShapeSpatial("55mu2500gsd.shp")

## this is the variable we will be plotting
counties.map@data$noise <- rnorm(nrow(counties.map@data))

## heatmap function
plot.heat <- function(counties.map, state.map, z, title=NULL, breaks=NULL,
                      reverse=FALSE, cex.legend=1, bw=.2, col.vec=NULL,
                      plot.legend=TRUE) {
  ## break down the value variable
  if (is.null(breaks)) {
    breaks <- seq(
      floor(min(counties.map@data[, z], na.rm=TRUE) * 10) / 10,
      ceiling(max(counties.map@data[, z], na.rm=TRUE) * 10) / 10,
      .1)
  }
  counties.map@data$zCat <- cut(counties.map@data[, z], breaks, include.lowest=TRUE)
  cutpoints <- levels(counties.map@data$zCat)
  if (is.null(col.vec)) col.vec <- heat.colors(length(levels(counties.map@data$zCat)))
  if (reverse) {
    cutpointsColors <- rev(col.vec)
  } else {
    cutpointsColors <- col.vec
  }
  levels(counties.map@data$zCat) <- cutpointsColors
  plot(counties.map, border=gray(.8), lwd=bw, axes=FALSE, las=1,
       col=as.character(counties.map@data$zCat))
  if (!is.null(state.map)) {
    plot(state.map, add=TRUE, lwd=1)
  }
  ## with(counties.map.c, text(x, y, name, cex=0.75))
  if (plot.legend) legend("bottomleft", cutpoints, fill=cutpointsColors,
                          bty="n", title=title, cex=cex.legend)
  ## title("Cartogram")
}

## plot it
plot.heat(counties.map, state.map, z="noise", breaks=c(-Inf, -2, -1, 0, 1, 2, Inf))
Thought I would add some new information here, since there has been some activity around this topic after it was posted. Here are two great links to the "Choropleth Map R Challenge" on the Revolutions blog:
Choropleth Map R Challenge
Choropleth Challenge Results
Hopefully these are useful for people viewing this question.
All the best,
Jay
Check out the packages
library(sp)
library(rgdal)
which are nice for geodata, and
library(RColorBrewer)
is useful for colouring. This map is made with the above packages and this code:
VegMap <- readOGR(".", "VegMapFile")
Veg9 <- brewer.pal(9, 'Set2')
spplot(VegMap, "Veg", col.regions=Veg9,
       at=c(0.5,1.5,2.5,3.5,4.5,5.5,6.5,7.5,8.5,9.5),
       main='Vegetation map')
"VegMapFile" is a shapefile and "Veg" is the variable displayed. Can probably be done better with a little work. I don`t seem to be allowed to upload image, here is an link to the image:
Take a look at the PBSmapping package (see both the vignette/manual and the demo) and this O'Reilly Data Mashups in R article (unfortunately it is not free of charge, but it is worth the $4.99 download, according to the Revolutions blog).
It is just three lines!
library(maps);
colors = floor(runif(63)*657);
map("state", col = colors, fill = T, resolution = 0)
Done!!
Just change the second line to any vector of 63 elements (each element between 0 and 657, which are members of colors())
Now if you want to get fancy you can write:
library(maps);
library(mapproj);
colors = floor(runif(63)*657);
map("state", col = colors, fill = T, projection = "polyconic", resolution = 0);
The 63 elements represent the 63 regions whose names you can get by running:
map("state")$names;
The R Graphics Gallery has a very similar map which should make for a good starting point. The code is here: www.ai.rug.nl/~hedderik/R/US2004 . You'd need to add a legend with the legend() function.
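A minimal sketch of that last step, with hypothetical class labels and a heat.colors() palette:

# Add a legend for the fill classes used in the map (labels are hypothetical)
labels <- c("< -2", "-2 to -1", "-1 to 0", "0 to 1", "1 to 2", "> 2")
legend("bottomleft", legend = labels, fill = heat.colors(length(labels)),
       bty = "n", title = "My variable", cex = 0.8)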
If you stumble upon this question in the 2020s, use the magnificent tmap package. It's very simple and straightforward, and it has revolutionized making maps in R. Don't bother working through the complicated code above.
Check out the vignette here.
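For completeness, a county choropleth in tmap is only a couple of lines; a sketch using tmap 3.x syntax, where the file name and the value column are hypothetical:

library(tmap)
library(sf)

# Hypothetical inputs: a county shapefile with a numeric column "value"
counties <- st_read("counties.shp")

tm_shape(counties) +
  tm_polygons("value", palette = "Blues", title = "My variable")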