How to exactly reproduce historical ORS-Isochrones? [duplicate]

I am working with the gmapsdistance package in R. I have my API key, and I am familiar with the functions within the package.
However, I would like to work the problem in the reverse direction. Instead of finding the Time, Distance, and Status between vectors of lat/longs, I would like to input a single lat/long and draw the region of all points that could be driven to in 3 hours or less. Then I'd like to draw this region on a Google map.
To start, it would be great to use Marimar, FL: 25.9840, -80.2821.
Does anyone have experience with that type of problem?

As suggested in the comments, you can sign up to a service like Travel Time Platform (which I'm using in this example) and use their API to get the possible destinations given a starting point.
Then you can plot the result on a Google Map using my googleway package.
appId <- "TravelTime_APP_ID"
apiKey <- "TravelTime_API_KEY"
mapKey <- "GOOGLE_MAPS_API_KEY"
library(httr)
library(googleway)
library(jsonlite)
location <- c(25.9840, -80.2821)
driveTime <- 2 * 60 * 60   ## seconds (this example uses 2 hours; the question asks for 3, so adjust as needed)
## London example
## location <- c(51.507609, -0.128315)
## sign up to http://www.traveltimeplatform.com/ and get an API key
## and use their 'Time Map' API
url <- "http://api.traveltimeapp.com/v4/time-map"
requestBody <- paste0('{
  "departure_searches" : [
    {
      "id" : "test",
      "coords" : {"lat":', location[1], ', "lng":', location[2], '},
      "transportation" : {"type" : "driving"},
      "travel_time" : ', driveTime, ',
      "departure_time" : "2017-05-03T08:00:00z"
    }
  ]
}')
res <- httr::POST(url = url,
                  httr::add_headers('Content-Type' = 'application/json'),
                  httr::add_headers('Accept' = 'application/json'),
                  httr::add_headers('X-Application-Id' = appId),
                  httr::add_headers('X-Api-Key' = apiKey),
                  body = requestBody,
                  encode = "json")
res <- jsonlite::fromJSON(httr::content(res, as = "text"))
pl <- lapply(res$results$shapes[[1]]$shell, function(x){
  googleway::encode_pl(lat = x[['lat']], lon = x[['lng']])
})
df <- data.frame(polyline = unlist(pl))
df_marker <- data.frame(lat = location[1], lon = location[2])
google_map(key = mapKey) %>%
  add_markers(data = df_marker) %>%
  add_polylines(data = df, polyline = "polyline")
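If you want to sanity-check the returned shape as plain coordinates before plotting, googleway also ships decode_pl(), the inverse of encode_pl():

head(googleway::decode_pl(df$polyline[1]))  ## a data.frame of lat/lon points along the isochrone boundary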

If you want to render in leaflet and use a free isochrone service, this is a pretty neat option. Note that it caps the drive time at 2 hours, though.
devtools::install_github("tarakc02/rmapzen")
library(rmapzen)
Sys.setenv(MAPZEN_KEY = "") # get for free at https://mapzen.com/
marimar <- mz_geocode("Marimar, FL")
isos <- mz_isochrone(
  marimar,
  costing_model = mz_costing$auto(),
  contours = mz_contours(c(60 * 2)) # 2 hours, in minutes
)
library(leaflet)
leaflet(as_sp(isos)) %>%
  addProviderTiles("CartoDB.DarkMatter") %>%
  addPolygons(color = ~paste0("#", color), weight = 1)

Related

Error in parse_url(url) : length(url) == 1 is not TRUE

I am trying to use the Google Maps geocoding API to retrieve the coordinates for a list of addresses (2289 in total). I want to pull out the latitude and longitude of each address.
# Address : a vector of the N addresses to be geocoded
# LON / LAT : two matrices, size [N x 1], initialised to contain only 0
Address <- as.matrix(Coordinates$Origin)
LON = matrix(0, length(Address), 1)
LAT = matrix(0, length(Address), 1)
View(LAT)
for (i in seq(1, length(Address))){
  APIstring = c("https://maps.googleapis.com/maps/api/geocode/json?address=",
                Address[i], ",&key=[YOUR_KEY]")
  res = GET(APIstring)
  tmp = fromJSON(content(res, as = "text"))
  LAT[i] = tmp$results$geometry$location$lat
  LON[i] = tmp$results$geometry$location$lng
}
Error in parse_url(url) : length(url) == 1 is not TRUE
Likely, your problem arises because c() concatenates its arguments into a three-element character vector rather than a single URL string, so GET() receives a URL of length 3 and parse_url()'s length(url) == 1 check fails. You want to build and handle one URL per address, and you could do so using functional programming (faster than your loop, too). How about the following, based on the suggestion of @Limey?
library(tidyverse)
library(httr)
library(jsonlite)
get_lon_and_lat <- function(.address){
  .api_string <- paste0(
    "https://maps.googleapis.com/maps/api/geocode/json?address=",
    .address,
    ",&key=[*insert key here*]")
  .res <- GET(.api_string)
  .res <- fromJSON(content(.res, as = "text"))
  .out <- data.frame(
    lat = .res$results$geometry$location$lat,
    lon = .res$results$geometry$location$lng
  )
  return(.out)
}
result_lat_lon <- lapply(Address, get_lon_and_lat) %>%
  bind_rows()
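With 2289 addresses, a single failed lookup would abort the whole lapply(). As a sketch (not part of the original answer), purrr's possibly() makes the call fault-tolerant, returning NA rows instead of raising errors:

safe_get <- purrr::possibly(get_lon_and_lat,
                            otherwise = data.frame(lat = NA_real_, lon = NA_real_))
result_lat_lon <- lapply(Address, safe_get) %>%
  bind_rows()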
PS: You might want to remove your API key from your question for security reasons.

How do I convert an API response to a Polygon object in R? (GraphHopper Isochrone API)

I am new to APIs and R, and I was wondering how to use the GraphHopper Isochrone API (https://graphhopper.com/api/1/docs/isochrone/).
At the page above there is this:
curl "https://graphhopper.com/api/1/isochrone?point=51.131108,12.414551&key=[YOUR_KEY]"
Is there a way to convert the response to a polygon object?
So far I have got this, but I don't know how to convert the response to a Polygon:
library(httr)
library(jsonlite)
a = GET("https://graphhopper.com/api/1/isochrone?point=51.131108,12.414551&key=KEY")
class(a)
a$status_code
This worked:
library(RJSONIO)
library(sp)
library(leaflet)
## GraphHopper API
# https://graphhopper.com/api/1/docs/isochrone/
#Request
a <- fromJSON("https://graphhopper.com/api/1/isochrone?point=51.131108,12.414551&key=[GET YOUR OWN KEY]")
#Response: the isochrone polygon ring as parsed coordinates
x = a$polygons$geometry$coordinates
#Response manipulation: flatten, then split into the two coordinate columns
#(after unlisting, the first half of the vector is one axis, the second half the other)
x = data.frame(unlist(x))
m = nrow(x)/2
x1 = x[1:m, 1]
x2 = x[(1+m):nrow(x), 1]
x0 = data.frame(cbind(x1, x2))
#Polygon plotting in leaflet
p = Polygon(coords = x0)
leaflet() %>%
  addTiles() %>%
  addPolygons(data = p)
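If a later step needs a full sp object rather than a bare Polygon, it can be wrapped up; a minimal sketch (the ID "iso" is an arbitrary label):

sp_poly <- SpatialPolygons(list(Polygons(list(p), ID = "iso")))
leaflet() %>%
  addTiles() %>%
  addPolygons(data = sp_poly)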

Errors attempting to apply a function to data frame rows in R

I wrote a very simple function (which works well) that returns the timezone given a set of coordinates:
library(XML)
findTZ <- function(lon, lat, date = Sys.Date())
{
  apiurl <- sprintf("https://maps.googleapis.com/maps/api/timezone/%s?location=%s,%s&timestamp=%d&sensor=%s",
                    "xml", lat, lon, as.numeric(as.POSIXct(date)), "false")
  TZ <- xmlParse(readLines(apiurl))[["string(//time_zone_id)"]]
  return(TZ)
}
findTZ(-112.86, 53.61) # example
However, when I try to run the function on a list of coordinates in a data frame, I get an error: Error in file(con, "r") : invalid 'description' argument
Any hints at what I'm getting wrong here? It seems like it should be very simple.
Here's the very basic data I'm testing on:
DF <- data.frame(
  longitude = c(-122, -112, -102),
  latitude = c(54, 53, 52)
)
DF$timezone = findTZ(lon=DF$longitude, lat=DF$latitude)
Thank you for any pointers!
EDIT / ADDITION
After implementing the answer from @Floo0, I tried the same approach with another function that calculates sunrise/sunset times from the same location data (and whose result I want in local time, hence the timezone function).
Here's the sunrise function:
library(maptools)
SSun <- function(lon, lat, date, deg = 0, dir, tzone)
{
  # deg = solar depth: rise/set = 0, civil = 6, nautical = 12, astronomical = 18
  # dir = direction: sunrise = "dawn", sunset = "dusk"
  # tzone = time zone of the output, NOT of the location
  siteX <- SpatialPoints(matrix(c(lon, lat), nrow = 1), proj4string = CRS("+proj=longlat +datum=WGS84"))
  dateX <- as.POSIXct(date, tz = tzone)
  duskX <- crepuscule(siteX, dateX, solarDep = deg, direction = dir, POSIXct.out = TRUE)
  duskX <- duskX$time # keep only date and time, discard day_frac
  return(duskX)
}
SSun(-112.86, 53.61, "2016-09-25", deg=0, dir="dawn", tzone="America/Edmonton") # example
And the updated timezone function:
library(tidyverse); library(xml2)
findTZ <- function(lon, lat, date = Sys.Date()){
  apiurl <- sprintf("https://maps.googleapis.com/maps/api/timezone/%s?location=%s,%s&timestamp=%d&sensor=%s",
                    "xml", lat, lon, as.numeric(as.POSIXct(date)), "false")
  read_xml(apiurl) %>% xml_find_first(".//time_zone_id") %>% xml_text
}
findTZ(-112.86, 53.61) # example
And the code I used to call both functions:
DF %>% mutate(date = as.POSIXct(date),
              TZ = map2_chr(longitude, latitude, findTZ),
              sunrise = SSun(longitude, latitude, date, deg = 0, dir = "dawn", tzone = TZ))
I feel like I must be misunderstanding how this works. Any insights?
You can do the following (using xml2 instead of XML, as I find it easier to use):
require(xml2)
findTZ <- function(lon, lat, date = Sys.Date()){
  apiurl <- sprintf("https://maps.googleapis.com/maps/api/timezone/%s?location=%s,%s&timestamp=%d&sensor=%s",
                    "xml", lat, lon, as.numeric(as.POSIXct(date)), "false")
  read_xml(apiurl) %>% xml_find_first(".//time_zone_id") %>% xml_text
}
To loop through your test data you can use:
require(tidyverse)
DF %>% mutate(TZ = map2_chr(longitude, latitude, findTZ))
Which gives you:
  longitude latitude                TZ
1      -122       54 America/Vancouver
2      -112       53  America/Edmonton
3      -102       52    America/Regina
As @Rich Scriven correctly points out, you need to loop through the data somewhere: sprintf() silently builds a vector of three URLs from the vector inputs, and file()/read_xml() can only open one of them at a time, which is what produces the invalid 'description' error. In the code above that loop is "hidden" in the map2_chr call.
Alternatively, consider mapply to pass each pair of element-wise values into the function and return a vector:
DF$timezones <- mapply(findTZ, DF$longitude, DF$latitude)
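The SSun() follow-up fails for the same reason: tzone (and hence as.POSIXct) accepts only a single time zone per call, so the rows have to be processed one at a time as well. A hedged sketch (not from the original answers) using purrr::pmap(), with a fixed example date since DF has no date column:

DF2 <- DF %>% mutate(TZ = map2_chr(longitude, latitude, findTZ))
DF2$sunrise <- pmap(DF2[c("longitude", "latitude", "TZ")],
                    function(longitude, latitude, TZ)
                      SSun(longitude, latitude, "2016-09-25",
                           deg = 0, dir = "dawn", tzone = TZ))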

Using R to identify nearest point to a location and calculate the distance between them along a network/road

I have a series of locations (Points_B) and would like to find the closest point to each of them from a different set of points (Points_A), along with the distance between them in km. I can do this as the crow flies, but cannot work out how to do the same along a road network (the 'Roads' object in the code). The code I have so far is as follows:
library(sp)
library(rgdal)
library(rgeos)
download.file("https://dl.dropboxusercontent.com/u/27869346/Road_Shp.zip", "Road_Shp.zip")
#2.9mb
unzip("Road_Shp.zip")
Roads <- readOGR(".", "Subset_Roads_WGS")
Points_A <- data.frame(ID = c("A","B","C","D","E","F","G","H","I","J","K","L"), ID_Lat = c(50.91487, 50.92848, 50.94560, 50.94069, 50.92275, 50.94109, 50.92288, 50.92994, 50.92076, 50.90496, 50.89203, 50.88757), ID_Lon = c(-1.405821, -1.423619, -1.383509, -1.396910, -1.441801, -1.459088, -1.466626, -1.369458, -1.340104, -1.360153, -1.344662, -1.355842))
rownames(Points_A) <- Points_A$ID
Points_B <- data.frame(Code = 1:30, Code_Lat = c(50.92658, 50.92373, 50.93785, 50.92274, 50.91056, 50.88747, 50.90940, 50.91328, 50.91887, 50.92129, 50.91326, 50.91961, 50.91653, 50.90910, 50.91432, 50.93742, 50.91848, 50.93196, 50.94209, 50.92080, 50.92127, 50.92538, 50.88418, 50.91648, 50.91224, 50.92216, 50.90526, 50.91580, 50.91203, 50.91774), Code_Lon = c(-1.417311, -1.457155, -1.400106, -1.374250, -1.335896, -1.362710, -1.360263, -1.430976, -1.461693, -1.417107, -1.426709, -1.439435, -1.429997, -1.413220, -1.415046, -1.440672, -1.392502, -1.459934, -1.432446, -1.357745, -1.374369, -1.458929, -1.365000, -1.426285, -1.403963, -1.344068, -1.340864, -1.399607, -1.407266, -1.386722))
rownames(Points_B) <- Points_B$Code
Points_A_SP <- SpatialPoints(Points_A[,2:3])
Points_B_SP <- SpatialPoints(Points_B[,2:3])
Distances <- (gDistance(Points_A_SP, Points_B_SP, byid=TRUE))*100
Points_B$Nearest_Points_A_CF <- colnames(Distances)[apply(Distances,1,which.min)]
Points_B$Distance_Points_A_CF <- apply(Distances,1,min)
The output I am after would be two additional columns in 'Points_B': 1) the ID of the nearest Point A along the road network, and 2) the distance to it along the network in km. Any help would be appreciated. Thanks.
I've been working on this kind of problem all day. Try mapdist() in the ggmap package and see if this works:
library(dplyr)
library(ggmap)
#Your data
Points_A <- data.frame(ID = c("A","B","C","D","E","F","G","H","I","J","K","L"), ID_Lat = c(50.91487, 50.92848, 50.94560, 50.94069, 50.92275, 50.94109, 50.92288, 50.92994, 50.92076, 50.90496, 50.89203, 50.88757), ID_Lon = c(-1.405821, -1.423619, -1.383509, -1.396910, -1.441801, -1.459088, -1.466626, -1.369458, -1.340104, -1.360153, -1.344662, -1.355842))
Points_B <- data.frame(Code = 1:30, Code_Lat = c(50.92658, 50.92373, 50.93785, 50.92274, 50.91056, 50.88747, 50.90940, 50.91328, 50.91887, 50.92129, 50.91326, 50.91961, 50.91653, 50.90910, 50.91432, 50.93742, 50.91848, 50.93196, 50.94209, 50.92080, 50.92127, 50.92538, 50.88418, 50.91648, 50.91224, 50.92216, 50.90526, 50.91580, 50.91203, 50.91774), Code_Lon = c(-1.417311, -1.457155, -1.400106, -1.374250, -1.335896, -1.362710, -1.360263, -1.430976, -1.461693, -1.417107, -1.426709, -1.439435, -1.429997, -1.413220, -1.415046, -1.440672, -1.392502, -1.459934, -1.432446, -1.357745, -1.374369, -1.458929, -1.365000, -1.426285, -1.403963, -1.344068, -1.340864, -1.399607, -1.407266, -1.386722))
#Combine coords into one field (mapdist was doing something funny with the commas so I had to specify "%2C" here)
Points_A$COORD <- paste(Points_A$ID_Lat, Points_A$ID_Lon, sep = "%2C")
Points_B$COORD <- paste(Points_B$Code_Lat, Points_B$Code_Lon, sep = "%2C")
#use expand.grid to generate all combos
get_directions <- expand.grid(Start = Points_A$COORD,
                              End = Points_B$COORD,
                              stringsAsFactors = F,
                              KEEP.OUT.ATTRS = F) %>%
  left_join(select(Points_A, COORD, ID), by = c("Start" = "COORD")) %>%
  left_join(select(Points_B, COORD, Code), by = c("End" = "COORD"))
#make a base dataframe
route_df <- mapdist(from = get_directions$Start[1],
                    to = get_directions$End[1],
                    mode = "driving") %>%
  mutate(Point_A = get_directions$ID[1],
         Point_B = get_directions$Code[1])
#get the rest in a for-loop
start <- Sys.time()
for(i in 2:nrow(get_directions)){
  get_route <- mapdist(from = get_directions$Start[i],
                       to = get_directions$End[i],
                       mode = "driving") %>%
    mutate(Point_A = get_directions$ID[i],
           Point_B = get_directions$Code[i])
  route_df <- rbind(route_df, get_route) #append to the running result
  Sys.sleep(time = 1) #so google doesn't get mad at you for speed
  end <- Sys.time()
  print(paste(i, "of", nrow(get_directions),
              round(i/nrow(get_directions), 4)*100, "%", sep = " "))
  print(end - start)
}
#save if you want
write.csv(route_df, "route_df.csv", row.names = F)
#Route evaluation: for each Point_B, keep the Point_A with the minimum driving distance
closest_point <- route_df %>%
  group_by(Point_B) %>%
  filter(km == min(km)) %>%
  ungroup()
I'm still kind of new at this so there may be a better way to do the data wrangling. Hope this helps & good luck
The packages igraph, osrm and walkalytics all seem to provide this functionality these days, and mode-specific routing networks exist in varying degrees of functionality.
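For illustration, a minimal sketch with osrm's osrmTable() against the public demo server (rate-limited; the column mapping assumes the Points_A/Points_B data frames above):

library(osrm)
src <- data.frame(id = Points_A$ID,   lon = Points_A$ID_Lon,   lat = Points_A$ID_Lat)
dst <- data.frame(id = Points_B$Code, lon = Points_B$Code_Lon, lat = Points_B$Code_Lat)
#driving-time matrix (minutes): rows = Points_A, columns = Points_B
tt <- osrmTable(src = src, dst = dst)
#for each Point_B, pick the Point_A with the shortest drive
idx <- apply(tt$durations, 2, which.min)
Points_B$Nearest_Points_A_Net <- src$id[idx]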

Mapdist: Error is.character(from)

My dataset includes a column "pickup" with the starting coordinates of a trip and a column "dropoff" with its ending coordinates, like:
pickup dropoff
40.77419,-73.872608 40.78055,-73.955042
40.7737,-73.870721 40.757007,-73.971953
I want to calculate the shortest route suggested by Google Maps and save the calculations in a new column. This is what I'm doing:
X$GoogleDist <- mapdist(from = list(X$pickup),
                        to = list(X$dropoff),
                        mode = "driving",
                        output = "simple", messaging = FALSE, sensor = FALSE,
                        language = "en-EN", override_limit = FALSE)
Which gives me the following error:
Error: is.character(from) is not TRUE
You could do
library(ggmap)
X <- read.table(header=TRUE, text="pickup dropoff
40.77419,-73.872608 40.78055,-73.955042
40.7737,-73.870721 40.757007,-73.971953")
X <- as.data.frame(lapply(X, function(x) sapply(as.character(x),
                                                function(y) URLencode(y, reserved = TRUE))),
                   stringsAsFactors = F)
rownames(X) <- NULL
res <- mapdist(from = X$pickup,
               to = X$dropoff,
               mode = "driving",
               output = "simple", messaging = FALSE, sensor = FALSE,
               language = "en-EN", override_limit = FALSE)
cbind(X, res)
# pickup dropoff from to m km miles seconds minutes hours
# 1 40.77419%2C-73.872608 40.78055%2C-73.955042 40.77419%2C-73.872608 40.78055%2C-73.955042 12805 12.805 7.957027 1212 20.20 0.3366667
# 2 40.7737%2C-73.870721 40.757007%2C-73.971953 40.7737%2C-73.870721 40.757007%2C-73.971953 14038 14.038 8.723213 1437 23.95 0.3991667
Your columns are probably of type factor (check with str(X)). mapdist needs character vectors (see ?mapdist), so you have to convert the columns using as.character beforehand. Also, when using geo coordinates, I think you have to URL-encode them, i.e. the comma , becomes %2C. Otherwise it didn't work for me.
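A minimal sketch of just that conversion, assuming the X data frame above (URLencode() is not vectorised, hence the vapply()):

X$pickup  <- vapply(as.character(X$pickup),  URLencode, character(1), reserved = TRUE)
X$dropoff <- vapply(as.character(X$dropoff), URLencode, character(1), reserved = TRUE)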
