How can I avoid Twitter API rate limits when returning a user's followers?

I have seen a few posts about this, but I have not been able to adapt any of the suggested code to the way my program is set up. I am not very good at R, but I need this information for a Political Science class. Could someone please help me figure out a way to avoid the error message "Warning message: In twInterfaceObj$doAPICall(cmd, params, method, ...) : Rate limit encountered & retry limit reached - returning partial results"? Thank you in advance! The code I am running is below (a possible workaround is sketched after the code):
# install and load packages
install.packages("twitteR")
install.packages("ROAuth")
install.packages("base64enc")
install.packages("plyr")
library("twitteR")
library("ROAuth")
library("base64enc")
library("plyr")
# set up the Twitter OAuth connection (do this once per session);
# setup_twitter_oauth() takes the four credentials as plain strings,
# so the old ROAuth cred$handshake(cainfo="cacert.pem") step is not needed
setup_twitter_oauth(consumer_key    = 'XXXXXXXXXX',
                    consumer_secret = 'XXXXXXXXXX',
                    access_token    = 'XXXXXXXXXX',
                    access_secret   = 'XXXXXXX')
# store the user
user <- getUser('JackPosobiec')
# return a list of followers
Posobiecfollowers <- user$getFollowers(n = NULL)
Posobiecfollowers
# save the followers in a data frame and write it to a CSV file
followers.df <- ldply(Posobiecfollowers, function(t) t$toDataFrame())
write.csv(followers.df, file = "JackPosobiecfollowersfile.csv")
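Note: twitteR itself cannot raise Twitter's per-window limits; a large follower list simply spans many 15-minute windows. One workaround often suggested (my own suggestion, not something from the question) is to switch to the rtweet package, whose get_followers() can pause until the rate-limit window resets and then keep collecting instead of returning partial results:
# a minimal sketch using rtweet instead of twitteR
# (assumes an rtweet token has already been set up for this app/account)
install.packages("rtweet")
library(rtweet)

# retryonratelimit = TRUE makes rtweet sleep through rate-limit resets
posobiec_followers <- get_followers("JackPosobiec",
                                    n = 75000,            # however many you need
                                    retryonratelimit = TRUE)
write.csv(posobiec_followers, file = "JackPosobiecfollowersfile.csv")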

Related

academictwitteR - Error in make_query(url = endpoint_url, params = params, bearer_token = bearer_token, : something went wrong. Status code: 403

I have just received Academic Twitter developer privileges and am attempting to scrape some tweets. I updated RStudio and regenerated a new bearer token once I got the upgraded Academic Twitter access, and get_bearer() returns my new bearer token. However, I continue to get the following error:
Error in make_query(url = endpoint_url, params = params, bearer_token = bearer_token, :
something went wrong. Status code: 403
In addition: Warning messages:
1: Recommended to specify a data path in order to mitigate data loss when ingesting large amounts of data.
2: Tweets will not be stored as JSONs or as a .rds file and will only be available in local memory if assigned to an object.
Additionally, I have tried specifying a data path, but I think I am confused about what this means. I think that's where my issue lies, but does the data path mean a specific file path on my computer?
Below is the code I was attempting to use. This code worked previously with my professor's bearer token, which they used just to show the output:
tweets <- get_all_tweets(
  query = "#BlackLivesMatter",
  start_tweets = "2020-01-01T00:00:00Z",
  end_tweets = "2020-01-05T00:00:00Z",
  n = 100)
Thanks in advance!
Status code 403 means "forbidden". You may want to check the error codes reference page of the Twitter API here.
Perhaps your bearer token is misspelled?
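On the data-path warnings: data_path is simply a folder on your computer where academictwitteR writes the downloaded tweets as JSON files as it goes, so a long collection can survive a crashed session. A minimal sketch (the folder name and the token step are my own assumptions, not from the original post):
library(academictwitteR)

# store the Academic Research bearer token once so get_bearer() can read it
# set_bearer()   # follow the prompt to add the token to your .Renviron

tweets <- get_all_tweets(
  query        = "#BlackLivesMatter",
  start_tweets = "2020-01-01T00:00:00Z",
  end_tweets   = "2020-01-05T00:00:00Z",
  n            = 100,
  data_path    = "blm_tweets/",   # hypothetical folder under the working directory
  bind_tweets  = TRUE             # also return the tweets as a data frame in memory
)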

jsonlite::fromJSON failed to connect to Port 443 in quantmod getFX()

I needed a function that automatically gets the exchange rate from a website (oanda.com), so I used the quantmod package and wrote a function based on it.
library(quantmod)

ForeignCurrency <- "MYR"  # has to be a character
getExchangeRate <- function(ForeignCurrency){
  Conv <- paste("EUR/", ForeignCurrency, sep = "")
  getFX(Conv, from = Sys.Date() - 179, to = Sys.Date())
  Conv2 <- paste0("EUR", ForeignCurrency)
  Table <- as.data.frame(get(Conv2))
  ExchangeRate <- 1/mean(Table[, 1])
  ExchangeRate
}
ExchangeRate <- getExchangeRate(ForeignCurrency)
ExchangeRate
On my personal PC it works perfectly and does what I want. If I run it on my work PC, I get the following error:
Warning: Unable to import "EUR/MYR".
Failed to connect to www.oanda.com port 443: Timed out
I have already googled a lot; it seems to be a firewall problem, but none of the suggestions I found there work. After checking the getFX() function, the problem seems to be in the jsonlite::fromJSON call that getFX() uses.
Has anyone faced a similar problem? I am quite familiar with R, but I have no expertise with firewalls/ports. Do I have to change something in the R settings, or is this a problem independent of R that requires a change in the proxy settings?
Can you please help :-) ?
The code below shows a workaround for getFX() in an enterprise context, where you often have to go through a proxy to reach the internet.
library(httr)

# URL that you can find inside https://github.com/joshuaulrich/quantmod/blob/master/R/getSymbols.R
url <- "https://www.oanda.com/fx-for-business//historical-rates/api/data/update/?&source=OANDA&adjustment=0&base_currency=EUR&start_date=2022-02-17&end_date=2022-02-17&period=daily&price=mid&view=table&quote_currency_0=VND"

# original call inside the quantmod library:
#   tbl <- jsonlite::fromJSON(oanda.URL, simplifyVector = FALSE)
# add use_proxy() with your proxy address and proxy port to get through the proxy
response <- httr::GET(url, use_proxy("XX.XX.XX.XX", XXXX))
status <- status_code(response)

if (status == 200){
  content <- httr::content(response)
  # use jsonlite to get the single attributes: quote currency, exchange rate (= average) and base currency
  exportJson <- jsonlite::toJSON(content, auto_unbox = TRUE)
  getJsonObject <- jsonlite::fromJSON(exportJson, flatten = FALSE)
  print(getJsonObject$widget$quoteCurrency)
  print(getJsonObject$widget$average)
  print(getJsonObject$widget$baseCurrency)
}
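Alternatively (this is an assumption on my part, not part of the answer above), because getFX() ultimately fetches the JSON through jsonlite/curl, you may be able to keep the original getExchangeRate() code unchanged and simply tell curl about the corporate proxy via environment variables before calling it:
# a minimal sketch: point curl at the corporate proxy for this R session
# (replace the placeholders with your real proxy host and port)
Sys.setenv(http_proxy  = "http://XX.XX.XX.XX:XXXX",
           https_proxy = "http://XX.XX.XX.XX:XXXX")

library(quantmod)
getFX("EUR/MYR", from = Sys.Date() - 179, to = Sys.Date())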

Query and request issues with gtrendsR

I have successfully logged in to my Google account with
gconnect(usr, password)
But I get a request problem when I try to query the data:
gt.us <- gtrends("USA", geo="US", start_date="2004-01-01", end_date="2004-01-30")
Error: Not enough search volume. Please change your search terms.
In addition: Warning message:
In request_GET(x, url, ...) : Gone (HTTP 410).
Could anyone help me out?
Get gtrendsR 2.0.0:
devtools::install_github('PMassicotte/gtrendsR')
library(gtrendsR)
# do not need to log in anymore
# syntax change!
gt.us <- gtrends("USA", geo="US", time = "2004-01-01 2004-01-30")
See Philippe Massicotte's answer to this issue.
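As a small usage follow-up (not part of the original answer), the object returned by gtrends() can be inspected and plotted directly:
# sketch assuming the gt.us object from the call above
str(gt.us, max.level = 1)  # inspect the components returned by gtrends()
plot(gt.us)                # built-in plot method for gtrends objects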

Valid client and signature using ggmap package in R

I am using the ggmap package in R in order to geocode a list of addresses. The count is over 2,500 - Google's daily quota for free users.
I searched around and found out that it is possible to geocode 100 thousand addresses per day if you have an account. I decided to open an account and received a message that I have a 60-day trial and a 300-dollar credit. Despite this, I could not discover where to find the "client" and "signature" options that ggmap requires.
If someone can tell me where to find them, please let me know!
My aim is to have Google search by address. If the address is not recognized, I then have it search for the location using the zip code. Here is what I am doing:
library(RJSONIO)
library(RCurl)

getGeoData <- function(location, api_key){
  location <- gsub(' ', '+', location)
  geo_data <- getURL(paste("https://maps.googleapis.com/maps/api/geocode/json?address=",
                           location, sprintf("&key=%s", api_key), sep = ""))
  geo_data <- fromJSON(geo_data)
  if (geo_data$status == "ZERO_RESULTS"){
    return(NA)
  } else return(geo_data$results[[1]]$geometry$location)
}
New error:
getGeoData(basefinal$cep[6793], "MY KEY")[[2]]
Error in getGeoData(basefinal$cep[6793], "MY KEY")[[2]] :
subscript out of bounds
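The "subscript out of bounds" error suggests the API returned a status other than OK or ZERO_RESULTS for that address (for example OVER_QUERY_LIMIT or REQUEST_DENIED), so results is empty and results[[1]] fails. Below is a hedged sketch of a more defensive variant of the question's getGeoData(); the name safeGeoData is my own, not from the post:
library(RJSONIO)
library(RCurl)

# check the API status before indexing into results, so unexpected statuses
# return NA with a warning instead of failing with "subscript out of bounds"
safeGeoData <- function(location, api_key){
  location <- gsub(' ', '+', location)
  geo_data <- fromJSON(getURL(paste0(
    "https://maps.googleapis.com/maps/api/geocode/json?address=",
    location, sprintf("&key=%s", api_key))))
  if (geo_data$status != "OK" || length(geo_data$results) == 0){
    warning("Geocoding failed with status: ", geo_data$status)
    return(NA)
  }
  geo_data$results[[1]]$geometry$location
}

# usage, mirroring the call from the question
# safeGeoData(basefinal$cep[6793], "MY KEY")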

Connecting R to Twitter for sentiment analysis

I referred to the link given below for doing sentiment analysis:
http://heuristically.wordpress.com/2011/04/08/text-data-mining-twitter-r/
And when I ran the code given below:
for (page in c(1:15)){
  # search parameter
  twitter_q <- URLencode('#prolife OR #prochoice')
  twitter_url =
  # fetch remote URL and parse
  mydata.xml <- xmlParseDoc(twitter_url, asText = F)
  # extract the titles
  mydata.vector <- xpathSApply(mydata.xml, '//s:entry/s:title', xmlValue, namespaces = c('s' = 'http://www.w3.org/2005/Atom'))
  # aggregate new tweets with previous tweets
  mydata.vectors <- c(mydata.vector, mydata.vectors)
}
After running the code, it throws an error:
Error in UseMethod("xpathApply") :
no applicable method for 'xpathApply' applied to an object of class "NULL"
I/O warning : failed to load HTTP resource
I installed the required packages ROAuth, stringr, XML, and plyr, and I am using R version 3.0.3.
Kindly help me figure out how to go about this; I am struggling with it. It would be a great help if anyone could point me in the right direction.
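For context (my own explanation, not an answer from the original thread): the old search.twitter.com Atom feed that the linked 2011 blog post scraped has been retired, which is why xmlParseDoc() returns NULL and xpathSApply() then fails. A hedged sketch of the equivalent collection step using twitteR's authenticated search instead:
library(twitteR)

# assumes setup_twitter_oauth() has already been run with valid app credentials;
# retryOnRateLimit tells twitteR to wait and retry instead of returning partial results
tweets <- searchTwitter('#prolife OR #prochoice', n = 1500, retryOnRateLimit = 120)

# convert the list of status objects into a character vector of tweet texts,
# playing the role of mydata.vectors in the original loop
mydata.vectors <- sapply(tweets, function(t) t$getText())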
