How to get lat/long for Indian cities in R?

I am trying to get lat long for Indian cities through ggmap package. Below is the code I have tried but it results in an error.
I created a vector with sample cities
library(ggmap)
mycities1 <- c("Hyderabad", "Chennai", "Bangalore", "Cochin", "ARNHEM", "London")
str(mycities1)
geocode(mycities1[1])
This throws NA values even though my city names are of class character.
Warning message: geocode failed with status OVER_QUERY_LIMIT, location = "Hyderabad"

Change the source argument to "dsk":
geocode(as.character(mycities1[1]), source = "dsk")
This is due to recent Google API changes.
There is also a current GitHub issue (or several, actually) which addresses that problem:
geocode failed with status OVER_QUERY_LIMIT, location = "XXX" --> you have not registered a correct and billing-enabled Google Maps API key using register_google() (ggmap v2.7.903). Enabling billing is a specific step after (!) adding your credit card information.
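If you do want to keep using the Google geocoder, a minimal sketch along those lines might look like the following; it assumes you have a billing-enabled Google Maps API key and a ggmap version that provides register_google(), and the key string is a placeholder:
library(ggmap)
register_google(key = "YOUR_API_KEY")  # placeholder; billing must be enabled on this key
mycities1 <- c("Hyderabad", "Chennai", "Bangalore", "Cochin", "ARNHEM", "London")
geocode(mycities1[1])  # a single city
geocode(mycities1)     # geocode() also accepts the whole character vector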

Related

Reverse geocode with open streetmaps

I found this question about converting a longitude and latitude into an address. The question uses ggmap's revgeocode, which uses the Google API. If I understood correctly, this is very costly.
I noticed that the following link about Accessing OpenStreetMap data with R also uses ggmap. I was wondering whether it is possible to use OpenStreetMap for revgeocode instead of the Google API, or whether there is some other way to use OpenStreetMap to reverse geocode.
Data used for question:
humandate lat lon
09/10/2014 13:41 41.83174254 -75.87998774
09/10/2014 13:53 41.83189873 -75.87994957
Answer provided:
library(ggmap)
revgeocode(c(df$lon[1], df$lat[1]))
"27-37 Beech Street, Montrose, PA 18801, USA"
I found an answer for the API here, and I found that this is available in R through the tmaptools package.
I tried to do the following, but to no avail:
address <- tmaptools::rev_geocode_OSM(41.83174254 , -75.87998774)
Error in names(search_result_id) <- c("place_id", "osm_type", "osm_id", :
attempt to set an attribute on NULL
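For what it's worth, tmaptools::rev_geocode_OSM() takes x (the longitude) first and y (the latitude) second, so the call above passes the coordinates in reverse order. A minimal sketch with the arguments swapped, using the first row of the data above (this is my own assumption about the cause, based on the documented argument order):
library(tmaptools)
# x = longitude, y = latitude for the first observation
address <- rev_geocode_OSM(x = -75.87998774, y = 41.83174254)
address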

How to geocode the districts with an NA outcome

I want to create a data frame in R with the names of districts and their longitude and latitude. For this purpose, I ran:
locations_df <- mutate_geocode(cities_df, district)
This command uses maps.googleapis.com for geocoding, but for my 13 districts I am getting NA errors. One of the errors is pasted below:
geocode failed with status OVER_QUERY_LIMIT, location = "Sukkur"
How can I get the geocodes for the missing values? I checked the names of the missing cities on Google Maps for spelling errors, but found none.
Thank you for the help.
Honestly your best bet may be to try running the query again. The results seem pretty idiosyncratic as far as I can tell (possibly related to dynamic IPs over WiFi?). These are my results from just now.
df %>% mutate_geocode(address)
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?address=Sukkur&sensor=false
address lon lat
1 Sukkur NA NA
Warning message:
geocode failed with status OVER_QUERY_LIMIT, location = "Sukkur"
So it failed for me too. I checked the queries I had left, then added "Paris" to check its results.
geocodeQueryCheck()
2499 geocoding queries remaining.
df
address
1 Sukkur
df[2,1]<-"Paris"
df %>% mutate_geocode(address)
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?address=Sukkur&sensor=false
Information from URL : http://maps.googleapis.com/maps/api/geocode/json?address=Paris&sensor=false
address lon lat
1 Sukkur 68.822808 27.72436
2 Paris 2.352222 48.85661
And now it works!
The issue may be helped by obtaining a Google Maps API key as this question suggests, which you can use if you install the GitHub version of ggmap.
The other option is to iterate through requests as an answer here suggests.
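As a rough illustration of that second option, here is a minimal retry sketch; the helper name, the max_tries limit, and the Sys.sleep() pause are my own assumptions rather than anything from the linked answer:
library(ggmap)

# Re-query each address until it geocodes successfully or the retry limit is hit
geocode_with_retry <- function(addresses, max_tries = 5) {
  out <- data.frame(address = addresses, lon = NA_real_, lat = NA_real_)
  for (i in seq_along(addresses)) {
    tries <- 0
    while (is.na(out$lon[i]) && tries < max_tries) {
      res <- geocode(addresses[i])
      out$lon[i] <- res$lon
      out$lat[i] <- res$lat
      tries <- tries + 1
      if (is.na(out$lon[i])) Sys.sleep(1)  # brief pause before retrying
    }
  }
  out
}

geocode_with_retry(c("Sukkur", "Paris"))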

acs package in R: Cannot download dataset, error message is inscrutable

I am trying to use the acs package in R to download Census data for a basic map, but I am unable to download the data and I'm receiving a confusing error message.
My code is as follows:
#Including all packages here in case this is somehow the issue
install.packages(c("choroplethr", "choroplethrMaps", "tidycensus", "tigris", "leaflet", "acs", "sf"))
library(choroplethr)
library(choroplethrMaps)
library(tidycensus)
library(tigris)
library(leaflet)
library(acs)
library(sf)
library(tidyverse)
api.key.install("my_api_key")
SD_geo <- geo.make(state="CA", county = 73, tract = "*", block.group = "*")
median_income <- acs.fetch(endyear = 2015, span = 5, geography = SD_geo, table.number = "B19013", col.names="pretty")
Everything appears to work until the final command, when I receive the following error message:
trying URL 'http://web.mit.edu/eglenn/www/acs/acs-variables/acs_5yr_2015_var.xml.gz'
Content type 'application/xml' length 735879 bytes (718 KB)
downloaded 718 KB
Error in if (url.test["statusMessage"] != "OK") { :
missing value where TRUE/FALSE needed
In addition: Warning message:
In (function (endyear, span = 5, dataset = "acs", keyword, table.name, :
XML variable lookup tables for this request
seem to be missing from ' https://api.census.gov/data/2015/acs5/variables.xml ';
temporarily downloading and using archived copies instead;
since this is *much* slower, recommend running
acs.tables.install()
This is puzzling to me because 1) it appears that something is in fact being downloaded at first, and 2) 'Error in if (url.test["statusMessage"] != "OK") { : missing value where TRUE/FALSE needed' makes no sense to me. It doesn't align with any of the arguments in the function.
I have tried:
Downloading the tables using acs.tables.install() as recommended in the second half of the error message. Doesn't help.
Changing the endyear and span to be sure that I'm falling within the years of data supported by the API. I seem to be, according to the API documentation. Have also used the package default arguments with no luck.
Using 'variable =' and the code for the variable as found in the official API documentation. This returns only the two lines with the mysterious "Error in if..." message.
Removing col.names = "pretty"
I'm going to just download the data file as a CSV and read it into R for now, but I'd like to be able to perform this step from the script for future maps. Any information on what's going on here would be appreciated. I am running R version 3.3.2. Also, I'm new to using this package and the API, but I'm following the documentation and can't find evidence that I'm doing anything wrong.
Tutorial I am working off of:
http://zevross.com/blog/2015/10/14/manipulating-and-mapping-us-census-data-in-r-using-the-acs-tigris-and-leaflet-packages-3/#get-the-tabular-data-acs
And documentation of the acs package: http://eglenn.scripts.mit.edu/citystate/wp-content/uploads/2013/02/wpid-working_with_acs_R2.pdf
To follow up on Brandon's comment, version 2.1.1 of the package is now on CRAN, which should resolve this issue.
Your code runs for me. My guess would be that the Census API was temporarily down.
As you loaded tidycensus and you'd like to do some mapping, you might also consider the following code:
library(tidycensus)

census_api_key("your key here")   # use `install = TRUE` to install the key
options(tigris_use_cache = TRUE)  # optional - cache the Census shapefile

median_income <- get_acs(geography = "block group",
                         variables = "B19013_001",
                         state = "CA", county = "San Diego",
                         geometry = TRUE)
This will get you the data you need, along with feature geometry for mapping, as a tidy data frame.
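As an illustration of the mapping step, a quick plot of the result might look like the sketch below; this is my own addition and assumes a ggplot2 version with geom_sf() plus the scales package installed:
library(ggplot2)

ggplot(median_income) +
  geom_sf(aes(fill = estimate), color = NA) +      # `estimate` is the ACS value column returned by get_acs()
  scale_fill_viridis_c(labels = scales::dollar) +  # continuous viridis fill with dollar-formatted legend
  labs(fill = "Median household income")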
I emailed Ezra Haber Glenn, the author of the package, about this as I was having the same issue. I received a response within 30 minutes, and it was after midnight, which I thought was amazing. Long story short, the acs package version 2.1.0 is configured to work with the changes the Census Bureau is making to their API later this summer, and in the meantime it is causing some problems for Windows users. Ezra is going to release an update with a fix, but in the meantime I reverted back to version 2.0 and it works fine. I'm sure there are a few ways to do this, but I installed the devtools package and ran:
require(devtools)
install_version("acs", version = "2.0", repos = "http://cran.us.r-project.org")
Hope this helps anyone else having a similar issue.

Valid client and signature using ggmap package in R

I am using the ggmap package in R to geocode a list of addresses. The count is over 2,500, which is Google's daily quota for free users.
I searched around and found out that it is possible to geocode 100,000 addresses per day if you have an account. I decided to open an account and received a message that I have a 60-day trial and a 300-dollar credit. Despite this, I could not discover where to find the "client" and "signature" values that ggmap requires.
If someone can tell me where I can find them, please let me know!
My aim is to have Google search by address. If the address is not recognized, I then have it search for the location using the zip code. Here is what I am doing:
library(RJSONIO)
library(RCurl)

getGeoData <- function(location, api_key) {
  location <- gsub(' ', '+', location)
  geo_data <- getURL(paste("https://maps.googleapis.com/maps/api/geocode/json?address=",
                           location, sprintf("&key=%s", api_key), sep = ""))
  geo_data <- fromJSON(geo_data)
  if (geo_data$status == "ZERO_RESULTS") {
    return(NA)
  } else {
    return(geo_data$results[[1]]$geometry$location)
  }
}
New error:
getGeoData(basefinal$cep[6793], "MY KEY")[[2]]
Error in getGeoData(basefinal$cep[6793], "MY KEY")[[2]] :
subscript out of bounds
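For what it's worth, that error is consistent with getGeoData() hitting its ZERO_RESULTS branch and returning NA, since indexing NA with [[2]] gives exactly a subscript-out-of-bounds error. A defensive sketch around the call (my own addition, keeping the original [[2]] indexing) could be:
res <- getGeoData(basefinal$cep[6793], "MY KEY")
if (length(res) < 2 || all(is.na(res))) {
  lng <- NA  # no geocoding result for this zip code
} else {
  lng <- res[[2]]  # mirrors the original indexing; for Google's JSON this is the lng element
}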

TwitteR (R's Twitter API) - unable to acquire tweets when specifying geocode

I'm currently using the TwitteR package in R. I'm able to acquire tweets without specifying the geocode, however when I specify the geocode, I get an error message which I give below:
#setup permission using relevant API keys in setup_twitter_oauth
searchTwitter("dog") #this line works
searchTwitter("dog", geocode = '40.714997,-73.91623, 10mi') #this line doesn't work
#returns "Error in twInterfaceObj$doAPICall(cmd, params, "GET", ...) : client error: (403) Forbidden"
The geocode I included is in New York City (I just chose a place with a high population density).
Works for me if I remove the space in the geocode argument:
searchTwitter("dog", geocode = '40.714997,-73.91623,10mi')
