I'm trying to geocode different IATA airport codes in Italy with the following (rudimentary) code in ggmap (version 2.4):
library(ggmap)

#list of all IATA codes
geo_apt <- c("AOI", "BGY", "BLQ", "BRI", "CTA", "FCO", "LIN", "MXP", "NAP",
             "PMF", "PSA", "PSR", "RMI", "TRN", "VCE", "VRN")
#preparing an empty data frame to store the geocodes
apt_geo <- data.frame(IATA = rep(NA, 16), lon = rep(NA, 16), lat = rep(NA, 16))
#geocoding the codes
for (i in seq_along(geo_apt)) {
  apt_geo[i, 1] <- geo_apt[i]
  apt_geo[i, 2] <- geocode(paste(geo_apt[i], "airport"))[1]
  apt_geo[i, 3] <- geocode(paste(geo_apt[i], "airport"))[2]
}
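Each code is geocoded twice in the loop above; a single geocode() call per airport would halve the number of requests. A sketch of that leaner variant:

#leaner variant (sketch): one geocode() call per airport
for (i in seq_along(geo_apt)) {
  gc <- geocode(paste(geo_apt[i], "airport"))
  apt_geo[i, 1] <- geo_apt[i]
  apt_geo[i, 2:3] <- gc
}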
The geocode function of ggmap works perfectly fine with all of these codes except "PSR":
IATA lon lat
1 AOI 13.363752 43.61654
2 BGY 9.703631 45.66957
3 BLQ 11.287859 44.53452
4 BRI 16.765202 41.13751
5 CTA 15.065775 37.46730
6 FCO 12.246238 41.79989
7 LIN 9.276308 45.45218
8 MXP 8.725531 45.63006
9 NAP 14.286579 40.88299
10 PMF 10.295935 44.82326
11 PSA 10.397884 43.68908
12 PSR -81.117259 33.94855 #<- doesn't work
13 RMI 12.618819 44.02289
14 TRN 7.647867 45.19654
15 VCE 12.339771 45.50506
16 VRN 10.890141 45.40000
I've tried to use revgeocode and those coordinates correspond to the following address:
revgeocode(as.numeric(apt_geo[12,2:3]))
#Information from URL : http://maps.googleapis.com/maps/api/geocode/json?latlng=33.948545,-81.1172588&sensor=false
[1] "Kentucky Avenue, West Columbia, SC 29170, USA"
By contrast, if I search on Google Maps itself, it works perfectly fine.
Does anybody have a clue about this apparently strange phenomenon?
EDIT
Following one suggestion in the comments below, I tried geocode("italy PSR airport") on version 2.4 again; instead of returning a more accurate result, or even the same one, it gave this warning:
geocode("italy PSR airport")
lon lat
1 NA NA
Warning message:
geocode failed with status ZERO_RESULTS, location = "italy PSR airport"
while with the attempt "airport PSR" the coordinates are different again from those of "PSR airport" (at least this time it's an actual airport, although its IATA code is LEX, not PSR).
revgeocode(as.numeric(geocode("airport PSR")))
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?latlng=38.0381454,-84.5970727&sensor=false
[1] "3895 Terminal Drive, Lexington, KY 40510, USA"
The whole question is possibly a duplicate.
Nonetheless, I don't see why the API and Google Maps would be using different datasets...
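One workaround, sketched below with no guarantees, is to bias the query toward Italy via the Geocoding API's region parameter, which geocode() in ggmap 2.4 doesn't expose. The helper name geocode_region is mine, and jsonlite is assumed to be installed:

library(jsonlite)

#query the Geocoding API directly with region=it so that ambiguous
#strings like "PSR airport" are biased toward Italy
geocode_region <- function(query, region = "it") {
  url <- paste0("http://maps.googleapis.com/maps/api/geocode/json?address=",
                URLencode(query), "&region=", region, "&sensor=false")
  res <- fromJSON(url)
  if (res$status != "OK") return(c(lon = NA, lat = NA))
  c(lon = res$results$geometry$location$lng[1],
    lat = res$results$geometry$location$lat[1])
}

geocode_region("PSR airport")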
Related
I would like to reverse-geocode addresses and PIN codes in R.
These are the columns:
A B C
15.3859085 74.0314209 7J7P92PJ+9H77QGCCCC
I have taken the first four rows, with columns A, B and C, out of thousands of rows:
df <- ga.data[1:4, ]
df <- cbind(df, do.call(rbind,
  lapply(1:nrow(df), function(i)
    revgeocode(as.numeric(df[i, 3:1]), output = "more")[
      c("administrative_area_level_1", "locality", "postal_code", "address")])))
Error in revgeocode(as.numeric(df[i, 3:1]), output = "more") :
is.numeric(location) && length(location) == 2 is not TRUE
Also, any other package or approach to find the address and PIN code would be most welcome.
I also tried the following; when I used ggmap I got this error:
In revgeocode(as.numeric(df[i, c("Latitude", "Longitude")]), output = "address") :
HTTP 400 Bad Request
I also tried this:
revgeocode(c(df$B[1], df$A[1]))
Warning message:
In revgeocode(c(df$Longitude[1], df$Latitude[1])) : HTTP 400 Bad Request
Also, I am from India and it does not work when I search with Indian latitude/longitude values; if I use a US latitude/longitude it gives me the exact address, which seems fishy. For instance, the following (with US coordinates) does work:
data <- read.csv(text="ID, Longitude, Latitude
311175, 41.298437, -72.929179
292058, 41.936943, -87.669838
12979, 37.580956, -77.471439")
library(ggmap)
result <- do.call(rbind,
                  lapply(1:nrow(data),
                         function(i) revgeocode(as.numeric(data[i, 3:2]))))
data <- cbind(data,result)
The current CRAN version of revgeo (0.15) does not have a revgeocode function. If you upgrade to this version, you'll find a revgeo function, which takes longitude and latitude arguments. Your column C should not be passed into the function.
revgeo::revgeo(latitude=df[, 'A'], longitude=df[, 'B'], output='frame')
[1] "Getting geocode data from Photon: http://photon.komoot.de/reverse?lon=74.0314209&lat=15.3859085"
housenumber street city state zip country
1 House Number Not Found Street Not Found Borim Goa Postcode Not Found India
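To keep just the PIN codes, the frame can be bound back onto the data; a small sketch reusing the call above (in revgeo's output the zip column holds the postcode, or "Postcode Not Found"):

res <- revgeo::revgeo(latitude = df[, 'A'], longitude = df[, 'B'], output = 'frame')
df$pincode <- res$zip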
I've been trying to get the distance between a list of home postcodes and a list of school postcodes for approximately 2,000 students. I'm using the gmapsdistance package within R to get this from the Google Maps Distance Matrix API. I've put in a valid API key, which I've replaced in the following code for security reasons.
library(gmapsdistance)
set.api.key("valid API key")
results <- gmapsdistance(origin = school$HomePostcode,
                         destination = school$SchoolPostcode,
                         mode = "walking",
                         shape = "long")
However, this gives the following error code.
Error in function (type, msg, asError = TRUE) :
Unknown SSL protocol error in connection to maps.googleapis.com:443
Looking at the Google APIs website, it seems the query hasn't run for all the data; it says there were only 219 requests. I know I'm limited in how many requests I can make in one day, but the limit is 2,500 and it's not letting me get anywhere close to that.
I've tried running the code on a single pair of postcodes, like below:
test <- gmapsdistance(origin = "EC4V+5EX",
                      destination = "EC4V+3AL",
                      mode = "walking",
                      shape = "long")
Which gives the following, as I would expect.
$Time
[1] 384
$Distance
[1] 497
$Status
[1] "OK"
My data looks something like this (I've anonymised it and removed all variables that aren't needed). There are 1,777 pairs of postcodes.
head(school)
HomePostcode SchoolPostcode
1 EC4V+5EX EC4V+3AL
2 EC2V+7AD EC4V+3AL
3 EC2A+1WD EC4V+3AL
4 EC1V+3QG EC4V+3AL
5 EC2N+2PT EC4V+3AL
6 EC1M+5QA EC4V+3AL
I do not have enough reputation to comment, but have you tried setting the parameter combinations to "pairwise"? If set to "all", it will compute all the combinations between each origin and all destinations.
library(gmapsdistance)
from <- c("EC4V+5EX", "EC2V+7AD", "EC2A+1WD", "EC1V+3QG", "EC2N+2PT", "EC1M+5QA")
to <- c("EC4V+3AL", "EC4V+3AL", "EC4V+3AL", "EC4V+3AL", "EC4V+3AL", "EC4V+3AL")
test <- gmapsdistance(origin = from,
                      destination = to,
                      combinations = "pairwise",
                      key = "YOURAPIKEYHERE",
                      mode = "walking")
test$Distance
or de Distance
1 EC4V+5EX EC4V+3AL 497
2 EC2V+7AD EC4V+3AL 995
3 EC2A+1WD EC4V+3AL 2079
4 EC1V+3QG EC4V+3AL 2492
5 EC2N+2PT EC4V+3AL 1431
6 EC1M+5QA EC4V+3AL 1892
With this small set of 6 destinations it works. I have an API key, so if you send me a bigger set I can try.
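If the SSL error is a transient connection problem on large batches, splitting the postcode pairs into chunks with a short pause between them may help. A sketch, untested at full scale, assuming the school data frame from the question:

library(gmapsdistance)
set.api.key("valid API key")

#split the pairs into chunks of 100 and pause between batches
chunks <- split(seq_len(nrow(school)), ceiling(seq_len(nrow(school)) / 100))
res <- lapply(chunks, function(idx) {
  out <- gmapsdistance(origin = school$HomePostcode[idx],
                       destination = school$SchoolPostcode[idx],
                       combinations = "pairwise",
                       mode = "walking",
                       shape = "long")
  Sys.sleep(1)  #brief pause to be gentle on the connection
  out
})
distances <- do.call(rbind, lapply(res, `[[`, "Distance"))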
Another option would be to use the googleway package, which also allows you to set an API key. Example:
library(googleway)
test <- google_distance(origins = from,
                        destinations = to,
                        mode = "walking",
                        key = "YOURAPIKEYHERE")
I'm trying to geocode a list of addresses, and I'm getting some INVALID_REQUEST errors, but I have no idea why. Check this out:
# First check if I have permission:
geocodeQueryCheck()
2478 geocoding queries remaining.
# Enter data
d <- c("Via del Tritone 123, 00187 Rome, Italy",
"Via dei Capocci 4/5, 00184 Rome, Italy")
# Ensure it's a character vector
class(d)
[1] "character"
# Try to geocode
library(ggmap)
geocode(d)
lon lat
1 NA NA
2 12.49324 41.89582
Warning message:
geocode failed with status INVALID_REQUEST, location = "Via del Tritone 123, 00187 Rome, Italy"
# Obtain an error, but if I try directly:
geocode("Via del Tritone 123, 00187 Rome, Italy")
lon lat
1 12.48813 41.90352
# It works. What gives?
A similar issue has been reported for RgoogleMaps::getGeoCode(), which was linked to Google's rate limiting. Since geocode() also relies on the Google Maps API (unless source = "dsk"), this limiting is likely causing problems here as well.
You can easily solve this the "stubborn" way by iterating through all locations of interest (e.g. using for or *apply) rather than passing one large vector of addresses to geocode at once. Inside the loop, you can then use while to detect whether coordinates were successfully retrieved for the current location and, if not, simply repeat the geocoding until it succeeds.
out = lapply(d, function(i) {
  gcd = geocode(i)
  while (all(is.na(gcd))) {
    gcd = geocode(i)
  }
  data.frame(address = i, gcd)
})
For example, during my last test run, the retrieval failed three times as indicated by the following warnings (this will likely look different on your machine):
Warning messages:
1: geocode failed with status OVER_QUERY_LIMIT, location = "Via del Tritone 123, 00187 Rome, Italy"
2: geocode failed with status OVER_QUERY_LIMIT, location = "Via del Tritone 123, 00187 Rome, Italy"
3: geocode failed with status OVER_QUERY_LIMIT, location = "Via dei Capocci 4/5, 00184 Rome, Italy"
Nonetheless, thanks to the while condition inside the loop, coordinates were eventually retrieved for all locations of interest:
> do.call(rbind, out)
address lon lat
1 Via del Tritone 123, 00187 Rome, Italy 12.48766 41.90328
2 Via dei Capocci 4/5, 00184 Rome, Italy 12.49321 41.89582
As an additional treat, this "stubborn" approach can easily be run in parallel (e.g. using parLapply() or foreach()), which might result in considerable speed gains when querying a larger number of addresses; see the sketch below.
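A minimal sketch of the parallel variant, assuming a 4-core machine; each worker must attach ggmap itself:

library(parallel)

cl <- makeCluster(4)
clusterEvalQ(cl, library(ggmap))  #every worker needs ggmap attached
out <- parLapply(cl, d, function(i) {
  gcd <- geocode(i)
  while (all(is.na(gcd))) gcd <- geocode(i)  #retry until coordinates arrive
  data.frame(address = i, gcd)
})
stopCluster(cl)
do.call(rbind, out)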
I'm trying to get the zip codes of a (long) list of longitude/latitude coordinates by using the revgeocode function in the ggmap library.
My question & data are the same as here: Using revgeocode function in a FOR loop. Help required but the accepted answer does not work for me.
My data (.csv):
ID, Longitude, Latitude
311175, 41.298437, -72.929179
292058, 41.936943, -87.669838
12979, 37.580956, -77.471439
I follow the same steps:
data <- read.csv(file.choose())
dset <- as.data.frame(data[,2:3])
location = dset
locaddr <- lapply(seq(nrow(location)), function(i){
  revgeocode(location[i, ],
             output = c("address"),
             messaging = FALSE,
             sensor = FALSE,
             override_limit = FALSE)
})
... and get the error message: "Error: is.numeric(location) && length(location) == 2 is not TRUE"
Specifically, is.numeric(location) is FALSE, which seems strange because I can multiply by 2 and get the expected answer.
Any help would be appreciated.
There are lots of things wrong here.
First, you have latitude and longitude reversed. All the locations in your dataset, as specified, are in Antarctica.
Second, revgeocode(...) expects a numeric vector of length 2 containing the longitude and latitude in that order. You are passing a data.frame object (this is the reason for the error), and as per (1) it's in the wrong order.
Third, revgeocode(...) uses the google maps api, which limits you to 2500 queries a day. So if you really do have a large dataset, good luck with that.
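As a side note, ggmap's geocodeQueryCheck() reports how much of that daily quota its own bookkeeping thinks remains:

library(ggmap)
geocodeQueryCheck()
#e.g. "2478 geocoding queries remaining."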
This code works with your sample:
data <- read.csv(text="ID, Longitude, Latitude
311175, 41.298437, -72.929179
292058, 41.936943, -87.669838
12979, 37.580956, -77.471439")
library(ggmap)
result <- do.call(rbind,
                  lapply(1:nrow(data),
                         function(i) revgeocode(as.numeric(data[i, 3:2]))))
data <- cbind(data,result)
data
# ID Longitude Latitude result
# 1 311175 41.29844 -72.92918 16 Church Street South, New Haven, CT 06519, USA
# 2 292058 41.93694 -87.66984 1632 West Nelson Street, Chicago, IL 60657, USA
# 3 12979 37.58096 -77.47144 2077-2199 Seddon Way, Richmond, VA 23230, USA
This extracts the zipcodes:
library(stringr)
data$zipcode <- substr(str_extract(data$result," [0-9]{5}, .+"),2,6)
data[,-4]
# ID Longitude Latitude zipcode
# 1 311175 41.29844 -72.92918 06519
# 2 292058 41.93694 -87.66984 60657
# 3 12979 37.58096 -77.47144 23230
I've written the googleway package to access the Google Maps API with a valid API key. So if your data has more than 2,500 items, you can pay for an API key and then use googleway::google_reverse_geocode().
For example
data <- read.csv(text="ID, Longitude, Latitude
311175, 41.298437, -72.929179
292058, 41.936943, -87.669838
12979, 37.580956, -77.471439")
library(googleway)
key <- "your_api_key"
res <- apply(data, 1, function(x){
  google_reverse_geocode(location = c(x["Latitude"], x["Longitude"]),
                         key = key)
})
## Everything contained in 'res' is the data returned from the Google Maps API
## for example, the geometry section of the first lat/lon coordinates
res[[1]]$results$geometry
bounds.northeast.lat bounds.northeast.lng bounds.southwest.lat bounds.southwest.lng location.lat location.lng
1 -61.04904 180 -90 -180 -75.25097 -0.071389
location_type viewport.northeast.lat viewport.northeast.lng viewport.southwest.lat viewport.southwest.lng
1 APPROXIMATE -61.04904 180 -90 -180
To extract the zip code, just write:
>data$postal_code
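If you need to dig the postcode out of the raw response instead, something like this should work; a sketch only, since the address_components structure just mirrors Google's Geocoding JSON (it may return character(0) where Google has no postcode):

get_postcode <- function(x) {
  comp <- x$results$address_components[[1]]
  #keep the component whose types include "postal_code"
  comp$long_name[vapply(comp$types, function(tt) "postal_code" %in% tt, logical(1))]
}
sapply(res, get_postcode)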
Here, http://www.bom.gov.au/climate/data/, I can enter a substation number, say 009572, choose the variable (say Temperature) and its type (say Maximum). Clicking "get data" brings me to a page with the link "All years of data"; clicking that gets you a zip file. I am aware of this question, but here I don't have a direct link to a zip file. Can something be done to automate weather-data extraction from the Australian Bureau of Meteorology website with R?
I had the same question, and this S.O. question was one of the first pages to come up. After further searching I found the R package bomrang (https://github.com/ropensci/bomrang), which:
Provides functions to interface with Australian Government Bureau of
Meteorology (BOM) data, fetching data and returning a tidy data frame
of précis forecasts, current weather data from stations, ag
information bulletins, historical weather data and downloading and
importing radar or satellite imagery.
bomrang is a part of rOpenSci and is actively developed. It has a good set of functions:
Several functions are provided by bomrang to retrieve Australian
Bureau of Meteorology (BOM) data. A family of functions retrieve
weather data and return tidy data frames;
get_precis_forecast(), which retrieves the précis (short) forecast;
get_current_weather(), which fetches the current weather for a given station;
get_ag_bulletin(), which retrieves the agriculture bulletin;
get_weather_bulletin(), which retrieves the BOM 0900 or 1500 bulletins;
get_coastal_forecast(), which returns coastal waters forecasts; and
get_historical(), which retrieves historical daily observations for a given station.
A second group of functions retrieve information pertaining to
satellite and radar imagery,
get_available_imagery();
the satellite imagery itself, get_satellite_imagery();
get_available_radar(); and
the radar imagery itself, get_radar_imagery().
The function get_historical() seems to do what the OP needs. For example, getting the historical daily rainfall from a weather station in Sydney is as easy as:
> rain_066062 <- bomrang::get_historical(stationid = 066062,
+ type = 'rain',
+ meta = T)
> head(rain_066062)
$`meta`
# A tibble: 1 x 10
site name lat lon start end years percent AWS ncc_obs_code
<int> <chr> <dbl> <dbl> <date> <date> <dbl> <int> <chr> <chr>
1 66062 SYDNEY (OBSERVATORY HILL) -33.9 151. 1858-07-01 2018-11-01 160. 100 Y 136
$historical_data
Product_code Station_number Year Month Day Rainfall Period Quality
1 IDCJAC0009 66062 1858 1 1 NA NA
2 IDCJAC0009 66062 1858 1 2 NA NA
3 IDCJAC0009 66062 1858 1 3 NA NA
4 IDCJAC0009 66062 1858 1 4 NA NA
5 IDCJAC0009 66062 1858 1 5 NA NA
<<SNIP>>
Another nice feature is that, if you have the longitude and latitude of a place of interest, get_historical() will find the nearest weather station to that location; a quick sketch follows.
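A minimal sketch of that lookup (the latlon argument is per bomrang's documentation; treat it as an assumption for your installed version):

#daily rainfall from the station nearest to Sydney Observatory Hill
rain_nearest <- bomrang::get_historical(latlon = c(-33.86, 151.21), type = "rain")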
To install from CRAN:
install.packages("bomrang")
Or install the development version from Github:
if (!require("remotes")) {
install.packages("remotes", repos = "http://cran.rstudio.com/")
library("remotes")
}
install_github("ropensci/bomrang", build_vignettes = TRUE)
Here's the code I wrote to download the data directly; it also resolves your p_c problem. You can improve the function if you want and post it.
#daily code = 136
#monthly code = 139
bomdata <- function(station, code) {
  for (i in 1:length(station)) {
    #fetch the "available years" page, which embeds the station's p_c value
    p.url <- paste("http://www.bom.gov.au/jsp/ncc/cdio/weatherData/av?p_stn_num=",
                   station[i], "&p_display_type=availableYears&p_nccObsCode=",
                   code, sep = "")
    download.file(p.url, "test.txt")
    filelist <- list.files(pattern = ".txt")
    foo <- file(filelist, "r")
    text <- suppressWarnings(readLines(foo))
    close(foo)
    #extract the p_c token from the first line of the response
    l <- regexpr(":", text[1])
    m <- unlist(gregexpr(",", text[1], perl = TRUE))
    pc <- substr(text[1], l[[1]] + 1, l[[1]] + (m[2] - (l[[1]] + 1)))
    #build the zip-file URL with the extracted p_c and download it
    url <- paste("http://www.bom.gov.au/jsp/ncc/cdio/weatherData/av?p_display_type=dailyZippedDataFile&p_stn_num=",
                 station[i], "&p_c=", pc, "&p_nccObsCode=", code,
                 "&p_startYear=2013", sep = "")
    suppressWarnings(download.file(url, paste(station[i], ".zip", sep = ""), mode = "wb"))
    unlink("test.txt")
  }
}
Example (passing the station number as a string keeps the leading zero in the URL):
bomdata("073137", 136)
You can try this; it is the code sequence used by the metvurst package.
## SET URL FOR DATA DOWNLOAD
url <- "http://www.bom.gov.au/ntc/IDO70004/IDO70004_"

## YEARS TO BE DOWNLOADED
yr <- 1993:2012

## READ DATA FOR ALL YEARS FROM URL INTO LIST
fijilst <- lapply(seq(yr), function(i) {
  read.csv(paste(url, yr[i], ".csv", sep = ""), na.strings = c(-9999, 999))
})
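The yearly data frames can then be stacked into one; a sketch, assuming the columns are identical across years:

## COMBINE ALL YEARS INTO ONE DATA FRAME
fiji <- do.call(rbind, fijilst)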
While I still can't see how to do this with download.file(), the following almost does the job provided Chrome's "Ask where to save each file before downloading" is unticked.
system(paste('"C:/Documents and Settings/UserName/Local Settings/Application Data/Google/Chrome/Application/chrome.exe"',
'-url http://www.bom.gov.au/jsp/ncc/cdio/weatherData/av?p_display_type=dailyZippedDataFile&p_stn_num=009572&p_c=-18465084&p_nccObsCode=136'), wait = FALSE)
Then I could use paste0() and loop through various station numbers if I knew what p_c=-18465084 means and how it changes from station to station.