How to load *part* of a multifeature geojson file in R?

I have a GeoJSON file that is a FeatureCollection containing two geographic data types: a LineString and a waypoint; see the raw file at the URL below (GitHub renders it as a map).
I want to load only the LineString, so this is what I do:
library(RCurl)
# download the raw GeoJSON as a single character string
obj <- getURL("https://raw.githubusercontent.com/Robinlovelace/stplanr/master/inst/extdata/route_data.geojson")
writeLines(obj, "/tmp/obj.geojson")
obj <- readLines("/tmp/obj.geojson")
# keep only the lines belonging to the LineString feature
just_lines <- obj[14:(length(obj) - 28)]
# re-wrap the fragment so it is a valid standalone GeoJSON object
just_lines[1] <- paste0("{", just_lines[1])
just_lines[length(just_lines)] <- "}"
writeLines(just_lines, "/tmp/just_lines.geojson")
Now that we have removed the pesky lines at the beginning and end of the file, it is a well-formed GeoJSON file that we can load and plot, yay:
library(rgdal)
route <- readOGR("/tmp/just_lines.geojson", layer = "OGRGeoJSON")
plot(route)
Except it should be obvious to any R user that this is a very clunky and inefficient way of doing it, involving too many lines of code and unnecessary reading and writing to the hard disk. There must be another way!
Options I've looked at
geojsonio
jsonlite
leaflet, which can display the FeatureCollection but seemingly not extract its parts.
Context
I'm creating a package for sustainable transport planning, stplanr. A function to find cycling routes needs to load the FeatureCollection GeoJSON data returned by the CycleStreets.net API.

Read the data using jsonlite directly from the URL:
obj <- jsonlite::fromJSON("https://raw.githubusercontent.com/Robinlovelace/stplanr/master/inst/extdata/route_data.geojson")
Convert the first object in the collection to SpatialLines:
library(sp)
sl <- SpatialLines(list(Lines(list(Line(obj$features[1, ]$geometry$coordinates[[1]])), ID = 1)))
plot(sl)
That assumes the first feature is a single LineString.
To make a SpatialLinesDataFrame with the attributes:
sldf <- SpatialLinesDataFrame(sl = sl, data = obj$features[1, ]$properties)
Should probably also give it a CRS:
proj4string(sldf) <- CRS("+init=epsg:4326")
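If the collection contained more than one LineString, a hedged generalization of the same idea (assuming the data-frame layout that jsonlite::fromJSON returns for this file) would filter the features by geometry type before building the SpatialLines:
# sketch: keep every LineString feature in the collection
is_line <- obj$features$geometry$type == "LineString"
line_coords <- obj$features$geometry$coordinates[is_line]
sl_all <- SpatialLines(lapply(seq_along(line_coords), function(i) {
  Lines(list(Line(line_coords[[i]])), ID = as.character(i))
}))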

I don't know if this is possible in LeafletR, but Leaflet's L.GeoJSON layer has a filter option which can render (or not render) a collection's features based on the properties each feature has. Some code:
L.geoJson(geojson, {
  filter: function (feature) {
    return feature.geometry.type === 'LineString';
  }
});
An example: http://plnkr.co/edit/RXIO0X?p=preview
Reference: http://leafletjs.com/reference.html#geojson-filter

Related

how to add my own shape file trajLevel() in openair package R

I am using the following code for plotting the trajectories:
library(openair)
load("GDASNDL1000m.Rdata")
trajLevel(traj, method = "hexbin", col = "jet", xbin = 40, parameters = NULL,
          orientation = c(90, 0, 0), projection = "mercator")
The result: https://github.com/adeckmyn/maps/files/2667752/GDASNDL1000m.zip
Here, I would like to change the base world map with my own shape file.
My shapefile is read as follows:
z1=maptools::readShapePoly("/home/sateeshm/shapefiles/ncmrwf/india_map")
library(maps)
map(z1)
https://github.com/adeckmyn/maps/files/2667336/World-India.zip
Now, the actual question is how to link z1 to trajLevel?
To avoid the hard-coded call to "world" in openair, you will have to create a new world database in the same file-based format as that of the "maps" package.
Probably the simplest way to do this is to use the mapMaker package. This package is not on CRAN but can be found on GitHub. It is the package I used to create the standard world map. The documentation is minimal, but if you don't care about polygon names etc., you can create a "quick and dirty" world map as follows:
# get your new map as a simple list of polygons (or lines)
z1=maps::map(maptools::readShapePoly("india_map"), plot=FALSE)
# create internal representation
z2=mapMaker::map.make(z1)
# write binary files:
mapMaker::map.export.bin(z2, "/my/path/to/world")
# To make map() call this new database:
library(maps)
worldMapEnv="MYMAP"
Sys.setenv("MYMAP"="/my/path/to/") # don't add the "world" !
Now map("world") will draw your version of the world map.
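With the environment variable in place, openair's internal map("world") call should pick up the custom database, so re-running the trajLevel() call from the question (a usage sketch, assuming the same trajectory data as above) plots over the new base map:
library(openair)
library(maps)
load("GDASNDL1000m.Rdata")  # trajectory data from the question
trajLevel(traj, method = "hexbin", col = "jet", xbin = 40,
          orientation = c(90, 0, 0), projection = "mercator")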

How can I download GADM data in R?

library(raster)
france <- getData('GADM', country = 'FRA', level = 1)
However, the command leads to this error:
trying URL 'http://biogeo.ucdavis.edu/data/gadm2.8/rds/FRA_adm1.rds'
Error in utils::download.file(url = aurl, destfile = fn, method = "auto", :
cannot open URL 'http://biogeo.ucdavis.edu/data/gadm2.8/rds/FRA_adm1.rds'
First, download the country data you want from the GADM database and save it to your local directory. Be sure that you have chosen the R (SpatialPolygonsDataFrame) format. There are several levels available for France (from level 0 to level 5); choose the one you need.
Second, read the .rds file downloaded from GADM with the readRDS() function and transform it into a data.frame with ggplot2::fortify().
library(ggplot2)
library(sp)
# assuming you downloaded it to a path such as '~/Downloads/FRA_adm1.rds':
path <- file.path(Sys.getenv("HOME"), "Downloads", "FRA_adm1.rds")
# FR map (Level 1) from GADM version 2.8
frRDS <- readRDS(path)
# Region names 1 in data frame
frRDS_df <- ggplot2::fortify(frRDS, region = "NAME_1")
head(frRDS_df)
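From there, a quick plot of the fortified data frame is one more step (a minimal sketch; the column names are those produced by fortify(), and ggplot2 is already loaded above):
ggplot(frRDS_df, aes(x = long, y = lat, group = group)) +
  geom_polygon(fill = "grey90", colour = "grey40") +
  coord_quickmap()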
I am going to improve upon the previous answer to the OP's question.
To answer the OP's question directly: there is nothing wrong with the OP's code. The problem was most likely a temporary internet connection issue, because the OP's code works and retrieves the gadm.org data as expected. Note that the getData() function retrieves the gadm.org website's geodata, which is stored at and served from the http://biogeo.ucdavis.edu/ website.
The raster package provides the getData() function, which is very useful for automatically retrieving geodata from the internet. The function can also be used to load geodata that is kept locally on a PC.
In years past, the way to use geodata was to first download a file from the gadm.org website and then move that file from the download folder into a folder on the PC. These files then needed to be unpacked/unzipped before the geodata was available for use in R.
Using getData() makes life simpler because it retrieves the desired geodata directly and makes it immediately available for use in R.
The gadm.org website clearly states:
"Downloading by country is the recommended approach"
Even though the large worldwide geodata file can be downloaded directly from the website, doing so is unnecessary and resource intensive. Unless there is some specific reason for doing so, there is no need to download and keep the entire worldwide geodata database on the PC.
And one last thing about the getData() function: it currently generates a warning when used in recent versions of R. The warning reads:
Warning message in getData("GADM", country = "USA", level = 1):
"getData will be removed in a future version of raster.
Please use the geodata package instead"
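A minimal sketch of the suggested replacement, assuming the geodata package is installed (its gadm() function returns a terra SpatVector rather than a SpatialPolygonsDataFrame):
library(geodata)
# download the French level-1 boundaries to a temporary folder
france <- gadm(country = "FRA", level = 1, path = tempdir())
plot(france)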

adding custom map to maps package

I want to be able to use the map.where function on a map that is not currently available in the MapEnv databases of the maps package, such as these maps of Brazil: http://www.usp.br/nereus/?dados=brasil. They go down to a more granular level than what is available in the maps package.
Is there any way to add them to the package data so that they can be used by map.where?
Yes, it is usually possible to load shapefiles into map(). You will need extra packages to read the shapefiles first, though. Also, you will have to know the name of the field that names the polygons.
For instance, using one of the maps from your link:
> ufebrasil <- rgdal::readOGR("UFEBRASIL.shp")
> names(ufebrasil)
[1] "ID" "CD_GEOCODU" "NM_ESTADO" "NM_REGIAO"
> mymap=maps::SpatialPolygons2map(ufebrasil, namefield="NM_ESTADO")
> map.where(mymap, -48.6, -26.46)
[1] "SANTA CATARINA:1"
You can also simply call
mymap = maps::map(ufebrasil, namefield = "NM_ESTADO")
to plot the map and get the same map data as above (map() will call SpatialPolygons2map automatically if necessary).
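Once the map object exists, the usual maps helpers accept it as a database; for example (a hedged sketch, using the region name returned by map.where above):
maps::map(mymap)
maps::map(mymap, regions = "SANTA CATARINA", fill = TRUE, col = "grey80", add = TRUE)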

Using shapefile as an input to the user defined function using R

I have a script to create randomly distributed square polygons in KML format, which takes a shapefile with a single polygon as an input and works absolutely well. The problem arises when I try to turn it into a user-defined function. I used the readShapePoly() function to read the shapefile, and it works well when used outside of a function. But when the function is created, with the shapefile given as an input, it simply won't take it. It shows this error message:
Error in getinfo.shape(filen) : Error opening SHP file
I avoid writing file extensions, and I do have all the component files needed to make up the shapefile.
Part of the script to read the shapefile using it as the input file:
library(maptools)
library(sp)
library(ggplot2)
Polytokml <- function(shapefile_name){
  ### Input files
  file1 <- "shapefile_name"
  Shape_file <- readShapePoly(file1) # requires maptools
  return(Shape_file)
}
The function is created, but it doesn't work when the function is called:
> Polytokml(HKH.shp)
Error in getinfo.shape(filen) : Error opening SHP file
This works well outside of the function:
###Input Files
file1 <- "shapefile.shp"
Shape_file <- readShapePoly(file1) #requires maptools
This is just an example out of the whole script, in which different arguments are taken as input. So, just to keep things simple, I have only included the part that reads the shapefile, which is where the problem is. Do let me know if it is not clear.
Thank you so much in advance :)
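A likely cause, judging from the code shown: inside the function the argument name is quoted, so the literal string "shapefile_name" is passed to readShapePoly() rather than the value of the argument, and the call Polytokml(HKH.shp) also passes an unquoted name instead of a file path. A hedged sketch of a working version (assuming the path is supplied as a character string):
library(maptools)
Polytokml <- function(shapefile_name){
  # use the value of the argument, not the string "shapefile_name"
  Shape_file <- readShapePoly(shapefile_name) # requires maptools
  return(Shape_file)
}
Shape_file <- Polytokml("HKH.shp") # pass the file name as a quoted string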

Creating Shapefiles in R

I'm trying to create a shapefile in R that I will later import to either Fusion Table or some other GIS application.
To start, I imported a blank shapefile containing all the census tracts in Canada. I attached other data (in tabular format) to the shapefile based on the unique ID of the CTs, and I mapped my results. At the moment I only need the ones in Vancouver, and I would like to export a shapefile that contains only the Vancouver CTs as well as my newly attached attribute data.
Here is my code (some parts omitted due to privacy reasons):
shape <- readShapePoly('C:/TEST/blank_ct.shp') # load blank shapefile
shape@data = data.frame(shape@data, data2[match(shape@data$CTUID, data2$CTUID),]) # data2 is my created attributes that I'm attaching to the blank file
shape1 <- shape[shape$CMAUID == 933,] # selecting the Vancouver CTs
I've seen other examples using writePolyShape to create the shapefile. I tried it, and it worked to an extent: it created the .shp, .dbf, and .shx files. I'm missing the .prj file and I'm not sure how to go about creating it. Are there better methods out there for creating shapefiles?
Any help on this matter would be greatly appreciated.
Use rgdal and writeOGR. rgdal will preserve the projection information.
Something like:
library(rgdal)
shape <- readOGR(dsn = 'C:/TEST', layer = 'blank_ct')
# do your processing
shape@data = data.frame(shape@data, data2[match(shape@data$CTUID, data2$CTUID),]) # data2 is my created attributes that I'm attaching to the blank file
shape1 <- shape[shape$CMAUID == 933,]
writeOGR(shape1, dsn = 'C:/TEST', layer ='newstuff', driver = 'ESRI Shapefile')
Note that the dsn is the folder containing the .shp file, and the layer is the name of the shapefile without the .shp extension. It will read (readOGR) and write (writeOGR) all the component files (.dbf, .shp, .prj, etc.).
Problem solved! Thank you again to those who helped!
Here is what I ended up doing:
As Mnel wrote, this line will create the shapefile.
writeOGR(shape1, dsn = 'C:/TEST', layer ='newstuff', driver = 'ESRI Shapefile')
However, when I ran this line, it came back with this error:
Can't convert columns of class: AsIs; column names: ct2,mprop,mlot,mliv
This is because my attribute data was not numeric but was stored as characters. Luckily, my attribute data is all numbers, so I ran transform() to fix the problem.
shape2 <- shape1
shape2@data <- transform(shape1@data, ct2 = as.numeric(ct2),
                         mprop = as.numeric(mprop),
                         mlot = as.numeric(mlot),
                         mliv = as.numeric(mliv))
I tried the writeOGR() command again, but I still didn't get the .prj file I was looking for. The problem was that I hadn't specified the coordinate system for the shapefile when importing it. Since I already know what the coordinate system is, all I had to do was define it when importing:
shape <- readShapePoly('C:/TEST/blank_ct.shp', proj4string = CRS("+proj=longlat +datum=WGS84"))
After that, I re-ran all the things I wanted to do with the shapefile, and the writeOGR line for exporting. And that's it!
