Creating a persistent connection to the Twitter streaming API using R

I am presently using the streamR package in R to stream tweets from Twitter's filter stream. I have a handshaken ROAuth object that I use for this. My code looks like:
# load the Twitter auth object and keywords
load("twitter_oAuth3.RData")
load("keywords3.RData")
streamTweet = function(){
  require(streamR)
  require(ROAuth)
  # stream for 500 seconds, then return the collected tweets as a character vector
  stack = filterStream(file.name = "", track = keywords, timeout = 500, oauth = twitter_oAuth)
  return(stack)
}
I want to create a real-time application, which involves dumping these tweets into an ActiveMQ topic. My code for that is:
require(Rjms)
# Set logger properties
url = "tcp://localhost:61616"
type = "T"
name = "TwitterStream"
# initialize logger
topicWriter = initialize.logger(url, type, name)
topicWrite = function(input){
  # print("writing to topic")
  to.logger(topicWriter, input, asString = TRUE, propertyName = 'StreamerID', propertyValue = '1')
  return()
}
logToTopic = function(streamedStack){
  # print("inside stack-writer")
  stacklength = length(streamedStack)
  print(c("Length: ", stacklength))
  for(i in 1:stacklength){
    print(c("calling for: ", i))
    topicWrite(streamedStack[i])
  }
  return()
}
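As it stands, the pieces are wired together in batch fashion rather than in real time: filterStream() blocks for the full timeout and only afterwards is the returned stack pushed to the topic, roughly like this:
stack <- streamTweet()   # blocks for the 500-second timeout, then returns the collected tweets
logToTopic(stack)        # only afterwards are the tweets written to the ActiveMQ topic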
Now my problem is the timeout that filterStream() requires. I looked under the hood and found this call that the function makes:
url <- "https://stream.twitter.com/1.1/statuses/filter.json"
output <- tryCatch(oauth$OAuthRequest(URL = url, params = params,
method = "POST", customHeader = NULL,
writefunction = topicWrite, cainfo = system.file("CurlSSL",
"cacert.pem", package = "RCurl")), error = function(e) e)
I tried removing the timeout argument, but that doesn't seem to work. Is there a way to maintain the stream forever (until I kill it) that dumps each tweet into the topic as it arrives?
P.S. I know of a Java implementation that uses the twitter4j API. I have no idea how to do the same in R, however.

The documentation for the streamR package mentions that the default value of the timeout argument in filterStream() is 0, which keeps the connection open permanently.
I quote:
"numeric, maximum length of time (in seconds) of connection to stream. The
connection will be automatically closed after this period. For example, setting
timeout to 10800 will keep the connection open for 3 hours. The default is 0,
which will keep the connection open permanently."
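Applied to the code in the question, a minimal sketch (reusing the objects already loaded there) would simply set timeout = 0. Note that the call then blocks until it is interrupted, so tweets have to be written out as they arrive (for example to a file) rather than collected and returned as a stack afterwards:
library(streamR)
library(ROAuth)
load("twitter_oAuth3.RData")
load("keywords3.RData")
# timeout = 0 keeps the connection open permanently (until the call is interrupted);
# tweets are appended to the file as they arrive
filterStream(file.name = "tweets.json",
             track = keywords,
             timeout = 0,
             oauth = twitter_oAuth)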
Hope this helps.

Related

HTTPRequest roblox

I'm currently making a Roblox whitelist system and it's almost finished, but I need one more thing. I scripted it and it doesn't work (code below), and I couldn't find anything to fix it (script and screenshot of the error below). Thanks.
local key = 1
local HttpService = game:GetService("HttpService")
local r = HttpService:RequestAsync({
    Url = "https://MyWebsiteUrl.com/check.php?key="..key,
    Method = "GET"
})
local i = HttpService:JSONDecode(r.Body)
for n, v in pairs(i) do
    print(tostring(n)..", "..tostring(v))
end
I assume the website you are using to validate the key returns the response as raw text; if so, then:
local key = 1
local HttpService = game:GetService("HttpService")
local r = HttpService:GetAsync("https://MyWebsiteUrl.com/check.php?key="..key)
local response = HttpService:JSONDecode(r)
print(response)
I think this is because you tried to concatenate a string (the URL) with a number (the key variable); try making the key a string.

API call from Call of Duty API in R - authentication problem

I am trying to retrieve the stats of a list of players from the Call of Duty API. This API first requires logging in at https://profile.callofduty.com/cod/login. Once logged in, the user can see a player's stats through the API. For example, the Warzone stats of the streamer savyultras90 can be seen through the following link: https://my.callofduty.com/api/papi-client/stats/cod/v1/title/mw/platform/psn/gamer/savyultras90/profile/type/wz.
If I log in on the website, I can view a player's stats and the related JSON in the browser. However, this doesn't seem straightforward in R.
I try to log in using the GET function from the httr package as follows:
respo <- GET('https://profile.callofduty.com/cod/login', authenticate('USER', 'PWD'))
But when I try to access the API and download the JSON using the fromJSON function from the jsonlite package as follows:
data <- fromJSON('https://my.callofduty.com/api/papi-client/stats/cod/v1/title/mw/platform/psn/gamer/savyultras90/profile/type/wz')
I get the error message "Not permitted: not authenticated".
How can I authenticate on one website and stay logged in so I can call the API that relies on that authentication?
Seeing as I've recently had to develop a PHP API for Warzone, I might be able to point you in the right direction on how to handle this. But first a few remarks:
You need to authenticate each user individually with the appropriate platform if you want to request that player's data
There is a throttle limit on the number of API requests
The Call of Duty API is under strict usage guidelines and should only be used by registered partners. Using the API could result in claims and eventually lawsuits: link
There is no public documentation of the API, and it has changed in the past, breaking several third-party tools.
Nevertheless, the process involves several steps as described below:
Register the device making the call
https://profile.callofduty.com/cod/mapp/registerDevice
with a json body in the form of {"deviceId":"INSERT_ID_HERE"}
This will return a response containing the authHeader, which we will use as the token in the next calls
Login with Activision credentials
https://profile.callofduty.com/cod/mapp/login
Set the following headers:
Authorization: "INSERT_AUTHHEADER_HERE"
x_cod_device_id: "INSERT_PREVIOUSLY_USED_DEVICEID_HERE"
This in turn returns a dataset from which we save the following values:
rtkn, ACT_SSO_COOKIE and atkn.
Make the wanted API call for data
We have all the data now required to make the API call.
For each request we will submit 3 headers:
Authorization: "INSERT_AUTHHEADER_HERE"
x_cod_device_id: "INSERT_PREVIOUSLY_USED_DEVICEID_HERE"
Cookie: ACT_SSO_LOCALE=en_GB;country=GB;API_CSRF_TOKEN=**GENERATE_CSRF_TOKEN**;rtkn=**RTKN_HERE**;ACT_SSO_COOKIE=**ACT_SSO_COOKIE_HERE**;atkn=**ATKN_HERE**
For more reference, you can always look through a Python or NodeJS library which has successfully implemented the API.
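Translated to R with httr, an untested sketch of those three steps might look roughly like this. The device id is just a placeholder, the exact shape of the JSON responses (e.g. where authHeader and the tokens sit) is an assumption, how the login credentials are passed is not described above (a JSON body is assumed), and the CSRF token part of the cookie is omitted:
library(httr)
device_id <- "some-random-device-id"   # placeholder

# step 1: register the device, which returns the authHeader
reg <- POST("https://profile.callofduty.com/cod/mapp/registerDevice",
            body = list(deviceId = device_id), encode = "json")
auth_header <- content(reg)$data$authHeader   # field path is an assumption

# step 2: log in with Activision credentials (JSON body assumed here)
login <- POST("https://profile.callofduty.com/cod/mapp/login",
              add_headers(Authorization = auth_header,
                          x_cod_device_id = device_id),
              body = list(email = "USER", password = "PWD"), encode = "json")
tokens <- content(login)   # should contain rtkn, ACT_SSO_COOKIE and atkn

# step 3: request the player data with the three headers described above
stats <- GET("https://my.callofduty.com/api/papi-client/stats/cod/v1/title/mw/platform/psn/gamer/savyultras90/profile/type/wz",
             add_headers(Authorization = auth_header,
                         x_cod_device_id = device_id,
                         Cookie = paste0("ACT_SSO_LOCALE=en_GB;country=GB;",
                                         "rtkn=", tokens$rtkn, ";",
                                         "ACT_SSO_COOKIE=", tokens$ACT_SSO_COOKIE, ";",
                                         "atkn=", tokens$atkn)))
content(stats)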
I struggled with this yesterday but finally made some progress. The issue is that you have to obtain an authentication token. The steps can be followed here: https://documenter.getpostman.com/view/7896975/SW7aXSo5#a37a2e5b-84bb-441d-b978-0fd8d42ffd29, though they are not shown in R.
My code works at first, as long as you don't authenticate again (I'm still trying to figure out why). Basically, I translated the steps in that link and extracted the content from the GET response:
# Get token ---------------------------------------------------------------
resp <- GET('https://profile.callofduty.com/cod/login')
cookies = c(
'XSRF-TOKEN' = resp$cookies$value[1]
,'new_SiteId' = resp$cookies$value[2]
,'comid' = resp$cookies$value[3]
,'bm_sz' = resp$cookies$value[4]
,'_abck' = resp$cookies$value[5]
# ,'ACT_SSO_COOKIE' = resp$cookies$value[6]
# ,'ACT_SSO_COOKIE_EXPIRY' = resp$cookies$value[7]
# ,'atkn' = resp$cookies$value[8]
# ,'ACT_SSO_REMEMBER_ME' = resp$cookies$value[9]
# ,'ACT_SSO_EVENT' = resp$cookies$value[10]
# ,'pgacct' = resp$cookies$value[11]
# ,'CRM_BLOB' = resp$cookies$value[12]
# ,'tfa_enrollment_seen' = resp$cookies$value[13]
)
headers = c(
)
params = list(
`new_SiteId` = 'cod',
`username` = 'USER',
`password` = 'PWD',
`remember_me` = 'true',
`_csrf` = resp$cookies$value[1]
)
# Authenticate ------------------------------------------------------------
resp_post <- POST('https://profile.callofduty.com/do_login?new_SiteId=cod',
httr::add_headers(.headers=headers),
query = params,
httr::set_cookies(.cookies = cookies))
cookies = c(
'XSRF-TOKEN' = resp_post$cookies$value[1]
,'new_SiteId' = resp_post$cookies$value[2]
,'comid' = resp_post$cookies$value[3]
,'bm_sz' = resp_post$cookies$value[4]
,'_abck' = resp_post$cookies$value[5]
,'ACT_SSO_COOKIE' = resp_post$cookies$value[6]
,'ACT_SSO_COOKIE_EXPIRY' = resp_post$cookies$value[7]
,'atkn' = resp_post$cookies$value[8]
,'ACT_SSO_REMEMBER_ME' = resp_post$cookies$value[9]
,'ACT_SSO_EVENT' = resp_post$cookies$value[10]
,'pgacct' = resp_post$cookies$value[11]
,'CRM_BLOB' = resp_post$cookies$value[12]
,'tfa_enrollment_seen' = resp_post$cookies$value[13]
)
headers = c(
)
params = list(
`new_SiteId` = 'cod',
`username` = 'USER',
`password` = 'PWD',
`remember_me` = 'true',
`_csrf` = resp_post$cookies$value[1]
)
# Get data:
resp_psn <- httr::GET(url = 'https://my.callofduty.com/api/papi-client/stats/cod/v1/title/mw/platform/psn/gamer/savyultras90/profile/type/wz',
httr::add_headers(.headers=headers),
query = params,
httr::set_cookies(.cookies = cookies))
resp_psn_json <- content(resp_psn)
Let me know if you've already managed to resolve this!

failed to authenticate google translate in R

So I'm trying to use the gl_translate function on 500,000 characters in RStudio, which means I have to authenticate with the Google Translate API. The problem is that I did this about two months ago with my old Google account, and now I'm using a new one.
So when I tried to authenticate a new client_id with my new Google account, I got an error message that my API hadn't been enabled yet, even though I had enabled it. I restarted RStudio and now I get this error message:
2020-01-22 19:01:24 -- Translating html: 147 characters -
2020-01-22 19:01:24> Request Status Code: 403
Error: API returned: Request had insufficient authentication scopes.
It is very frustrating because I then tried to re-enable the old Google account, which required me to enter my credit card number. I did that as well, and now they've asked me to wait several days.
Can anyone figure out what the problem is?
Here is my R code for authentication:
install.packages("googleAnalyticsR", dependencies = TRUE)
library(googleAnalyticsR)
install.packages("googleLanguageR")
library(googleLanguageR)
install.packages("dplyr")
library(dplyr)
library(tidyverse)
install.packages("googleAuthR")
library(googleAuthR)
client_id <- "107033903887214478396"
private_key <- "-----BEGIN PRIVATE KEY-----\nMIIEvAIBADANBgkqhkiG9w0BAQEFAASCBKYwggSiAgEAAoIBAQChPmvib1v9/CFA\nX7fG8b8iXyS370ivkufMmX30C6/rUNOttA+zMhamS/EO0uaYtPw44B4QdNzRsSGq\nm+0fQ5Sp1SHJaVPkoImuUXZdMlLO73pvY48nMmEFg7deoOZI+CNWZYgIvPY8whMo\nk4vKE2iyuG+pl9MT7dT6dwWNmXDWr8kgxAfryfVEUIeqaf+57Z9g3FfVPLARz4iH\nCaTu55hhbmo/XknUx0hPsrwBMPaNLGl2+o5MU1ZuIkl47EJvdL8CdUuEmb9qJDtv\nUyKqANlwFa7cVW8ij2tjFpjJ7bigRVJsI8odbsEbwmx1b5SLckDOQ7t4l8zERtmo\nUv5fxyNNAgMBAAECggEAApeLyWyL2IXcjPnc7OxG68kGwJQuoW/lnQLcpPcpIUm/\n1Vt/IxzLg2nWGqxmO48xPMLRiOcwA4jq5yCxi56c/avo6qFwUU0JWY2CrxXXge8U\nk0TQ8MrdB2cqI/HHMeYXP1TLfoR3GtvtzemtRhbQyIqxdNL1eC0LDump47BTQYg0\nVPuCxU3zXVIj+Qt0FZa+Pa/nAOGHf5b4ye56L7vxL2BCeRncuHdDcE6Ilkpz79Gv\nkXP1K5j22uEVCmobe1qRlq3BLx2Qimj4h8MI8CKiNS40tGR/oewJ5uMgmeCePYKv\nqSFOwCDvRkw9V2KdGu40WQFEq21mczlv9gMWhp2/EQKBgQDRmBZZM7ugIpo64wx6\nDFYhZo05LmMiwymIfWs2CibzKDeXPuy3OSytvTPVFCkG+RlcYthxAIAn1Z/qJ4UI\n+8c8Zwfg+toYtEa2gTYM2185vmnqQwqmAsaK+4xKZzgfqxie/CBuPzUOZO41q6P8\ni7A2KqXHcDb4SMqnkdGGLk/7+QKBgQDE8dBesgx4DsHFYg1sJyIqKO4d2pnLPkDS\nAzx5xvQuUcVCNTbugOC7e0vGxWmQ/Eqal5b3nitH590m8WHnU9UiE4HciVLe+JDe\nDe5CWzBslnncBjpgiDudeeEubhO7MHv/qZyZXMh73H2NBdO8j0uiGTNbBFoOSTYq\nsFACiCZu9QKBgE2KjMoXn5SQ+KpMkbMdmUfmHt1G0hpsRZNfgyiM/Pf8qwRjnUPz\n/RmR4/ky6jLQOZe6YgT8gG08VVtVn5xBOeaY34tWgxWcrIScrRh4mHROg/TNNMVS\nRY3pnm9wXI0qyYMYGA9xhvl6Ub69b3/hViHUCV0NoOieVYtFIVUZETJRAoGAW/Y2\nQCGPpPfvD0Xr0parY1hdZ99NdRQKnIYaVRrLpl1UaMgEcHYJekHmblh8JNFJ3Mnw\nGovm1dq075xDBQumOBU3zEzrP2Z97tI+cQm3oNza5hyaYbz7aVsiBNYtrHjFTepb\nT1l93ChnD9SqvB+FR5nQ2y07B/SzsFdH5QbCO4kCgYBEdRFzRLvjdnUcxoXRcUpf\nfVMZ6fnRYeV1+apRSiaEDHCO5dyQP8vnW4ewISnAKdjKv/AtaMdzJ5L3asGRWDKU\n1kP/KDBlJkOsOvTkmJ4TxbIhgcSI62/wqDBi5Xqw1ljR2mh8njzRwqDRKs12EtQ0\n9VaUDm7LCNTAskn2SR/o4Q==\n-----END PRIVATE KEY-----\n"
options(googleAuthR.client_id = client_id)
options(googleAuthR.client_secret = private_key)
devtools::reload(pkg = devtools::inst("googleAnalyticsR"))
ga_auth()
In case you need to see what my translation code looks like:
translate <- function(tibble) {
  count <- data.frame(nchar = 0, cumsum = 0) # create count file to stay within API limits
  for (i in 1:nrow(tibble)) {
    des <- pull(tibble[i, 2]) # extract description as single character string
    if (count$cumsum[nrow(count)] >= 80000) { # API limit check
      print("nearing 100000 character per 100 seconds limit, pausing for 100 seconds")
      Sys.sleep(100)
      count <- count[1, ] # reset count file
    }
    if (grepl("^\\s*$", des) == TRUE) { # if description is only whitespace then skip
      trns <- tibble(translatedText = "", detectedSourceLanguage = "", text = "")
    } else { # else request translation from API
      trns <- gl_translate(des, target = 'en', format = 'html') # request in html format to anticipate html descriptions
    }
    tibble[i, 3:4] <- trns[, 1:2] # add to tibble
    nchar = nchar(pull(tibble[i, 2])) # count number of characters
    req <- data.frame(nchar = nchar, cumsum = nchar + sum(count$nchar))
    count <- rbind(count, req) # add to count file
    if (nchar > 20000) { # additional API request limit safeguard for large descriptions
      print("large description (>20,000), pausing to manage API limit")
      Sys.sleep(100)
      count <- count[1, ] # reset count file
    }
  }
  return(tibble)
}
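A hypothetical call, assuming descriptions is a tibble whose second column holds the text to translate (the results are written into columns 3 and 4):
translated <- translate(descriptions)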
I figured it out after 24 hours.
Apparently it is really easy. I just followed the steps from this link.
But yesterday I made a mistake: the JSON file I downloaded was the one for the OAuth client ID, while I actually needed the JSON file for the service account.
Then I installed the googleLanguageR package with this code:
remotes::install_github("ropensci/googleLanguageR")
library(googleLanguageR)
and then just passed the file location of my downloaded Google project JSON file to gl_auth() like this:
gl_auth("G:/My Drive/0. Thesis/R-Script/ZakiServiceAccou***************kjadjib****.json")
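For reference, once gl_auth() has succeeded, translation calls like the one in the question work directly; the input string below is just a placeholder:
gl_translate("placeholder text to translate", target = 'en')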
and now I'm happy :)

Pause/Stop AjaxDataSource Bokeh Stream

I am able to set up my graph for streaming just fine. Here's the initialization:
self.data_source = AjaxDataSource(data_url='my_route',
                                  polling_interval=1000, mode='append', max_size=300)
Now I want to 'pause' the polling of the AjaxDataSource. I couldn't find a way to do this in the documentation. I'm NOT running a Bokeh server, so I'm unable to use Bokeh server solutions.
I came up with one possible solution: just return an empty data set from the endpoint that the AjaxDataSource polls. So in the example above, the my_route function would look something like this:
def my_route():
    if not self.is_paused:
        data = normal_data_to_graph
    else:
        data = []
    return data
Once you set polling_interval = None on the Python side, the data source will not poll. From CustomJS you can then start the paused polling again; here, source is the AjaxDataSource instance:
source.polling_interval = 1000; // the interval you want
source.initialized = false;
source.setup();

Posting to and Receiving data from API using httr in R

I'm trying to access the US Census Geocoder batch address API, found here: http://geocoding.geo.census.gov/geocoder/
I've also gone through the documentation for the API here: http://geocoding.geo.census.gov/geocoder/Geocoding_Services_API.pdf
I'm trying to use the httr package in R to post a batch CSV file of addresses formatted as: Unique ID, Street address, City, State, ZIP.
I've tried the single-address request version using getURL from RCurl and that works fine, but postForm doesn't seem to submit the file in the right way. The code I'm using right now seems to post the request correctly, but I'm not getting any geocoded data back.
curlhandle <- handle("http://geocoding.geo.census.gov/geocoder/geographies/addressbatch",
                     cookies = TRUE)
# using fileUpload from RCurl instead of upload_file from httr
upload3 <- fileUpload(contents = address100, contentType = "file")
test <- POST(url = "http://geocoding.geo.census.gov/geocoder/geographies/addressbatch",
             body = list(addressFile = upload3,
                         benchmark = "Public_AR_Census2010",
                         vintage = "Census2010_Census2010"),
             encode = "multipart",
             handle = curlhandle,
             followLocation = TRUE,
             verbose = TRUE)
Is my request missing something? I'm not sure if I should be using writefunction & writedata in this case. Any help would be appreciated!
You seem to have a weird mix of RCurl and httr. Here's how I'd write the request:
req <- POST(
  "http://geocoding.geo.census.gov/geocoder/geographies/addressbatch",
  body = list(
    addressFile = upload_file(address100),
    benchmark = "Public_AR_Census2010",
    vintage = "Census2010_Census2010"
  ),
  encode = "multipart",
  verbose()
)
stop_for_status(req)
content(req)
(also "file" is not a valid mime type)
