I'm trying to retrieve link attributes but the response contains no data.
I'm using R, and the area of interest is just west of London at:
latitude = 51.561
longitude = -0.4958
Using this latitude and longitude with the tilex and tiley formulas in the HERE API documentation gives:
lat <- -51.561
long <- -0.4958
level <- 10
# tilex
tilex <- floor((long + 180) / (180 / 2^level))
# tiley
tiley <- floor((lat + 90) / (180 / 2^level))
The following is the call to the API.
resource <- "tile"
base_url <- paste0("https://pde.api.here.com/1/", resource, ".json?")
layer <- 'LINK_ATTRIBUTE_FC2'
level <- '10'
query <- list(app_id = app_id,
              app_code = app_code,
              region = "WEU",
              layer = layer,
              level = level,
              tilex = tilex,
              tiley = tiley)
request <- GET(url = base_url, query = query, verbose(),
               add_headers(.headers = c("Accept-Encoding" = "gzip, deflate")))
response <- content(request, as = "text", encoding = "UTF-8")
fromJSON(response, simplifyDataFrame = TRUE)
This returns an empty list. Any ideas what I'm doing wrong?
You have an empty list because latitude -51.561, longitude -0.4958 is east of the Falkland Islands, in the South Atlantic Ocean.
Try 51.561, -0.4958 and you'll get 619 rows in the answer.
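For reference, a quick sketch of the documented tile formula applied to the corrected coordinates (the inline values are just the arithmetic worked through):
level <- 10
tile_size <- 180 / 2^level                   # 0.17578125 degrees per tile at level 10
tilex <- floor((-0.4958 + 180) / tile_size)  # 1021
tiley <- floor((51.561 + 90) / tile_size)    # 805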
I am facing an error while running the following code; the required packages are installed.
Geocoding: OpenStreetMap API Search Functions:
# INPUT LOCATION VARIABLES
# [RECORD_ID], [ADDRESS], [CITY], [STATE], [ZIPCODE]
geocode <- function(record_id, address, city, state, zipcode){
  # NOMINATIM SEARCH API URL
  src_url <- "https://nominatim.openstreetmap.org/ui/search.html?q="

  ### INPUTS PREPARATION ###
  city <- str_replace_all(string = city,
                          pattern = "\\s|,",
                          replacement = "+")

  # CREATE A FULL ADDRESS
  addr <- paste(address, city, state, zipcode, sep = "%2C")

  # CREATE A SEARCH URL BASED ON NOMINATIM API TO RETURN GEOJSON
  requests <- paste0(src_url, addr, "&format=geojson")

  # ITERATE OVER THE URLS AND MAKE REQUEST TO THE SEARCH API
  for (i in 1:length(requests)) {
    # MAKE HTML REQUEST TO API AND TRANSFORM HTML RESPONSE TO JSON
    response <- read_html(requests[i]) %>%
      html_node("p") %>%
      html_text() %>%
      fromJSON()

    # FROM THE RESPONSE EXTRACT LATITUDE AND LONGITUDE COORDINATES
    lon <- response$features$geometry$coordinates[[1]][1]
    lat <- response$features$geometry$coordinates[[1]][2]

    # CREATE A COORDINATES DATAFRAME
    if (i == 1) {
      loc <- tibble(record_id = record_id[i],
                    address = str_replace_all(addr[i], "%2C", ","),
                    latitude = lat, longitude = lon)
    } else {
      df <- tibble(record_id = record_id[i],
                   address = str_replace_all(addr[i], "%2C", ","),
                   latitude = lat, longitude = lon)
      loc <- bind_rows(loc, df)
    }
  }
  return(loc)
}
Data Collection: OpenStreetMap API Request
df <- geocode(record_id = bldg_df$record_id,
              address = query,
              city = bldg_df$city,
              state = bldg_df$state,
              zipcode = bldg_df$zipcode)
head(df)
While running the above cell I am getting the following error:
Error in if (is.character(txt) && length(txt) == 1 && nchar(txt, type = "bytes") < : missing value where TRUE/FALSE needed
3. fromJSON(.)
2. read_html(requests[i]) %>% html_node("p") %>% html_text() %>% fromJSON()
1. geocode(record_id = bldg_df$record_id, address = query, city = bldg_df$city, state = bldg_df$state, zipcode = bldg_df$zipcode)
Making the code as minimal as possible:
geocode <- function(record_id, address, city, state, zipcode){
  src_url <- "https://nominatim.openstreetmap.org/ui/search.html?q="
  city <- str_replace_all(string = city,
                          pattern = "\\s|,",
                          replacement = "+")
  addr <- paste(address, city, state, zipcode, sep = "%2C")
  requests <- paste0(src_url, addr, "&format=geojson")
  for (i in 1:length(requests)) {
    q <- read_html(requests[i]) %>%
      html_node("p") %>%
      html_text()
  }
  return(q)
}
q
Removed the fromJSON() and stored the html_text() output in a variable q. Got the following response:
function (save = "default", status = 0, runLast = TRUE)
.Internal(quit(save, status, runLast))
<bytecode: 0x0000022acf675480>
<environment: namespace:base>
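Two things are going on above. The printout is the base function q: the bare q on the last line resolves to base::q, because the q assigned inside geocode is local to the function. More fundamentally, https://nominatim.openstreetmap.org/ui/search.html is Nominatim's interactive viewer page, not a JSON endpoint, so there is no JSON for fromJSON() to parse. A minimal sketch (not from the original post) that queries the actual search endpoint, which returns GeoJSON directly:
library(httr)
library(jsonlite)

# Sketch: hit Nominatim's /search API endpoint instead of the UI page.
# Nominatim's usage policy asks for a User-Agent identifying your app;
# "my-r-geocoder/0.1" is a hypothetical placeholder.
nominatim_geocode <- function(query) {
  resp <- GET("https://nominatim.openstreetmap.org/search",
              query = list(q = query, format = "geojson", limit = 1),
              user_agent("my-r-geocoder/0.1"))
  fromJSON(content(resp, as = "text", encoding = "UTF-8"))
}

res <- nominatim_geocode("10 Downing Street, London")
res$features$geometry$coordinates[[1]]  # c(lon, lat) of the first match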
My code takes two destination airports (JFK, then Las Vegas), passes each through a URL inside a for loop to return flight information, and tries to add the results to a data frame. However, it only includes the results from the last element, Las Vegas. Should I use something other than a list for this?
library(httr)
library(jsonlite)

des <- c("JFK", "LAS")
flights <- list()
for (x in 1:length(des)) {
  url <- paste0("https://travelpayouts-travelpayouts-flight-data-v1.p.rapidapi.com/v1/prices/direct/?destination=", des[x], "&origin=BOS")
  r <- GET(url, add_headers("X-RapidAPI-Host" = "travelpayouts-travelpayouts-flight-data-v1.p.rapidapi.com",
                            "X-RapidAPI-Key" = " MY KEY HERE ",
                            "X-Access-Token" = " MY TOKEN HERE"))
  jsonResponseParsed <- content(r, as = "text")
  f <- fromJSON(jsonResponseParsed, flatten = TRUE)
  flights[[x]] <- data.frame(f$data)
}
data <- do.call(rbind, flights)
# price will be in rubles, will need to convert to USD
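The list-plus-do.call(rbind, ...) pattern shown above is sound, so the last-element symptom usually means an earlier version of the loop assigned each result to a plain data frame instead of indexing into the list. As a sketch under the same API assumptions, lapply makes the accumulation explicit and leaves nothing to overwrite:
library(httr)
library(jsonlite)

des <- c("JFK", "LAS")
flights <- lapply(des, function(d) {
  url <- paste0("https://travelpayouts-travelpayouts-flight-data-v1.p.rapidapi.com/v1/prices/direct/?destination=", d, "&origin=BOS")
  r <- GET(url, add_headers("X-RapidAPI-Host" = "travelpayouts-travelpayouts-flight-data-v1.p.rapidapi.com",
                            "X-RapidAPI-Key" = " MY KEY HERE ",
                            "X-Access-Token" = " MY TOKEN HERE"))
  f <- fromJSON(content(r, as = "text"), flatten = TRUE)
  data.frame(f$data)
})
data <- do.call(rbind, flights)  # one block of rows per destination, bound together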
I am using the help of https://ipstack.com to geocode IP addresses and am having a difficult time trying to geocode all 1200 addresses in a short amount of time.
With R, I've collected the URLs into a list (e.g. http://api.ipstack.com/[IP address]?access_key=[access key]) and can use read_json to read the JSON data of each URL, but I've not been able to develop a loop to extract the data from each one.
library(RCurl)
library(jsonlite)
x <- c("http://api.ipstack.com/178.140.119.217?access_key=[access_key]", "http://api.ipstack.com/68.37.21.125?access_key=[access_key]", "http://api.ipstack.com/68.10.255.89?access_key=[access_key]")
read_json(x)
Error in file(path) : invalid 'description' argument
I'm looking for a solution that will be able to read multiple IP addresses and then attach the information to a dataframe.
Edit 1: Still stuck, but I'm making some progress with the loop:
library(RCurl)
library(jsonlite)

url_lst <- as.character(df$URL)
output <- NULL
for (i in url_lst) {
  x <- as.data.frame(read_json(i))
  output <- rbind(output, x)
}
However, this results in an error:
Error in (function (..., row.names = NULL, check.rows = FALSE, check.names = TRUE, : arguments imply differing number of rows: 1, 0
Also, the code only produces 8 observations rather than 1200.
Edit 2: Bill Ash's answer got me further than I was, but it looks like some values in the JSON data are keeping the code from succeeding.
Bill Ash's code:
library(httr)
library(tibble)
library(purrr)
library(jsonlite)

ip_addresses <- core_members$ip_address

# a simple function
ip_locate <- function(your_vector_of_ip_addresses, access_key) {
  ip <- your_vector_of_ip_addresses
  map_df(ip, ~{
    out <- httr::GET(url = paste0("http://api.ipstack.com/", .,
                                  "?access_key=", access_key))
    resp <- fromJSON(httr::content(out, "text"), flatten = TRUE)
    tibble::tibble(ip = resp$ip,
                   country = resp$country_name,
                   region = resp$region_name,
                   city = resp$city,
                   zip = resp$zip,
                   lat = resp$latitude,
                   lng = resp$longitude)
  })
}

ip_info <- ip_locate(your_vector_of_ip_addresses = ip_addresses,
                     access_key = "[access_key]")

# output
ip_info %>%
  head()
Where the error begins:
ip_info <- ip_locate(your_vector_of_ip_addresses = ip_addresses,
                     access_key = "[access_key]")
Error: All columns in a tibble must be 1d or 2d objects:
* Column `zip` is NULL
9. stop(cnd)
8. abort(error_column_must_be_vector(names_x[is_xd], classes))
7. check_valid_cols(x)
6. lst_to_tibble(xlq$output, .rows, .name_repair, lengths = xlq$lengths)
5. tibble::tibble(ip = resp$ip, country = resp$country_name, region = resp$region_name,
     city = resp$city, zip = resp$zip, lat = resp$latitude, lng = resp$longitude)
4. .f(.x[[i]], ...)
3. map(.x, .f, ...)
2. map_df(ip, ~{ out <- httr::GET(url = paste0("http://api.ipstack.com/", ., "?access_key=", access_key))
     resp <- fromJSON(httr::content(out, "text"), flatten = TRUE) ...
1. ip_locate(your_vector_of_ip_addresses = ip_addresses, access_key = "[access_key]")
Because I only need the coordinates from these IP addresses, I believe this has been resolved. Hopefully, someone is willing to continue advising on this issue, but I won't be updating this any further.
It looks like you can pay for bulk lookup as well.
From their documentation page:
Bulk IP Lookup
The ipstack API also offers the ability to request data for multiple IPv4 or IPv6 addresses at the same time. In order to process IP addresses in bulk, simply append multiple comma-separated IP addresses to the API's base URL.
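Based on that description, a bulk request would look something like this sketch (assumptions: bulk lookup is enabled on your plan, access_key holds your key, and the response is an array of the same objects the single-IP endpoint returns):
library(httr)
library(jsonlite)

ips <- c("178.140.119.217", "68.37.21.125", "68.10.255.89")
bulk_url <- paste0("http://api.ipstack.com/",
                   paste(ips, collapse = ","),  # comma-separated, per the docs
                   "?access_key=", access_key)
resp <- fromJSON(content(GET(bulk_url), as = "text"), flatten = TRUE)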
library(httr)
library(tibble)
library(purrr)
library(jsonlite)

# some ip addresses
ip_addresses <- c("178.140.119.217", "68.37.21.125", "68.10.255.89")

# a simple function
ip_locate <- function(your_vector_of_ip_addresses, access_key) {
  ip <- your_vector_of_ip_addresses
  map_df(ip, ~{
    out <- httr::GET(url = paste0("http://api.ipstack.com/", .,
                                  "?access_key=", access_key))
    resp <- fromJSON(httr::content(out, "text"), flatten = TRUE)
    tibble::tibble(ip = resp$ip,
                   country = resp$country_name,
                   region = resp$region_name,
                   city = resp$city,
                   zip = resp$zip,
                   lat = resp$latitude,
                   lng = resp$longitude)
  })
}

# an example
ip_info <- ip_locate(your_vector_of_ip_addresses = ip_addresses,
                     access_key = "had to edit out my key")

# output
ip_info %>%
  head()
# A tibble: 3 x 7
  ip              country       region   city       zip      lat   lng
  <chr>           <chr>         <chr>    <chr>      <chr>  <dbl> <dbl>
1 178.140.119.217 Russia        Moscow   Moscow     101001  55.8  37.6
2 68.37.21.125    United States Michigan Southgate  48195   42.2 -83.2
3 68.10.255.89    United States Virginia Chesapeake 23323   36.8 -76.3
Hope this helps.
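As for the Error: All columns in a tibble must be 1d or 2d objects from the question's second edit: ipstack sometimes omits a field (here zip), so resp$zip is NULL and tibble() refuses it. A hedged fix is to substitute NA for any missing field inside ip_locate, using rlang's %||% operator (which returns its right-hand side when the left is NULL):
library(rlang)  # provides %||%

# Drop-in replacement for the tibble() call inside ip_locate
tibble::tibble(ip      = resp$ip %||% NA_character_,
               country = resp$country_name %||% NA_character_,
               region  = resp$region_name %||% NA_character_,
               city    = resp$city %||% NA_character_,
               zip     = resp$zip %||% NA_character_,
               lat     = resp$latitude %||% NA_real_,
               lng     = resp$longitude %||% NA_real_)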
I have a list of American states stored in a .csv file:
cities <- read.csv("cities.csv")
states <- as.character(cities$States)
df_total_US <- data.frame(state = character(), lon = character(), lat = character())
Now I would like to loop over the states and fetch their longitude and latitude values. Therefore I do:
for (state in states) {
  df_temp <- geocode(state)
  df_add <- data.frame(state = state, lon = df_temp$lon[1], lat = df_temp$lat[1])
  df_total_US <- rbind(df_add, df_total_US)
  Sys.sleep(5)
}
This, however, gives several warnings saying:
geocode failed with status OVER_QUERY_LIMIT, location = ...
I thought I could get rid of this error by including:
Sys.sleep(5)
But clearly this is not working. Any suggestions for how I should fetch the coordinates for the whole list?
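One workaround is to retry each lookup with a growing pause whenever the rate limit hits. A sketch, assuming geocode() returns NA coordinates when the OVER_QUERY_LIMIT warning fires:
geocode_with_retry <- function(loc, max_tries = 5, pause = 5) {
  for (attempt in seq_len(max_tries)) {
    res <- suppressWarnings(geocode(loc))
    if (!is.na(res$lon[1])) return(res)  # got coordinates, stop retrying
    Sys.sleep(pause * attempt)           # back off a little more each attempt
  }
  res  # still NA after max_tries; let the caller decide what to do
}
Inside the loop, df_temp <- geocode_with_retry(state) then replaces the bare geocode(state) call.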
I am currently trying to configure the rnoaa library to connect city/state data with a weather station and, from that, output ANNUAL weather data, namely temperature. I have included a hardcoded input for reference, but I intend to feed in hundreds of geocoded cities eventually; that isn't the issue so much as retrieving the data.
require(rnoaa)
require(ggmap)

city <- geocode("birmingham, alabama", output = "all")
bounds <- city$results[[1]]$geometry$bounds
sw_lat <- bounds$southwest$lat
sw_lng <- bounds$southwest$lng
ne_lat <- bounds$northeast$lat
ne_lng <- bounds$northeast$lng
# extent expects the bounding box as southwest lat/long, then northeast lat/long
stations <- ncdc_stations(extent = c(sw_lat, sw_lng, ne_lat, ne_lng), token = noaakey)
I am calculating an MBR (minimum bounding rectangle) around the geographic area, in this case Birmingham, and then getting a list of stations. I then pull out the station ID and attempt to retrieve results with any combination of parameters, with no success. I'm looking to associate annual temperatures with each city.
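(topStation is not defined in the question; presumably it holds station IDs pulled from the ncdc_stations result, along the lines of the hypothetical reconstruction topStation <- stations$data$id.)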
test <- ncdc(datasetid = "ANNUAL", locationid = topStation[1],
             datatypeid = "DSNW", startdate = "2000-01-01", enddate = "2010-01-01",
             limit = 1000, token = noaakey)
Warning message:
Sorry, no data found
It looks like the location ID is creating the issue. Try without it (as it is an optional field):
ncdc_locs(datasetid = "ANNUAL", datatypeid = "DSNW", startdate = "2000-01-01", enddate = "2010-01-01", limit = 1000, token = <your token key>)
and then with a valid location ID:
ncdc_locs(datasetid = "ANNUAL", datatypeid = "DSNW", startdate = "2000-01-01", enddate = "2010-01-01", limit = 1000, locationid = 'CITY:US000001', token = <your token>)
returns
$meta
NULL
$data
     mindate    maxdate                name datacoverage            id
1 1872-01-01 2016-04-16 Washington D.C., US            1 CITY:US000001
attr(,"class")
[1] "ncdc_locs"