How to change Google Accounts using library(googlesheets)?

I am running into a problem. An R script works fine on one machine, but the same script fails on a different computer:
# Here I register the sheet
browser <- gs_title("Funnel Daily")
browser <- gs_edit_cells(browser, ws = "Classic Browser", input = ClassicBrowser,
                         anchor = "A1", byrow = FALSE, col_names = NULL,
                         trim = FALSE, verbose = TRUE)
Auto-refreshing stale OAuth token.
Error in gs_lookup(., "sheet_title", verbose) :
"Funnel Daily" doesn't match sheet_title of any sheet returned by gs_ls() (which should reflect user's Google Sheets home screen).
> browser <- gs_title("Funnel Daily")
Error in gs_lookup(., "sheet_title", verbose) :
"Funnel Daily" doesn't match sheet_title of any sheet returned by gs_ls() (which should reflect user's Google Sheets home screen).`
When calling gs_ls() I get results for a different Google account, one that I also use frequently. So is there a way, perhaps via a token, to differentiate between accounts? In other words, how can I force googlesheets to access one specific account?
Currently I'm using the token of the account that owns Funnel Daily. The only cause I can think of is that the browser authentication was done with the account that does not contain Funnel Daily; I simply mixed them up.
I tried removing googlesheets as well as httr with all dependencies, but after loading library(googlesheets) again, gs_user() still reports the account that does not contain the sheet.

Include your credentials and confirm the browser authentication via your Funnel Daily Google Account:
options(googlesheets.client_id = "",
        googlesheets.client_secret = "",
        googlesheets.httr_oauth_cache = FALSE)
gs_auth(token = NULL, new_user = FALSE,
        key = getOption("googlesheets.client_id"),
        secret = getOption("googlesheets.client_secret"),
        cache = getOption("googlesheets.httr_oauth_cache"),
        verbose = TRUE)
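If the wrong account keeps being picked up from a stale cached token, another option (a sketch, assuming the default `.httr-oauth` cache file in the working directory) is to delete the cache and force a fresh browser login:

```r
# Remove the cached OAuth token so the next auth call starts clean
if (file.exists(".httr-oauth")) file.remove(".httr-oauth")

# new_user = TRUE forces the browser flow; sign in with the account
# that actually owns "Funnel Daily"
gs_auth(new_user = TRUE)

# Verify which account is active before registering the sheet
gs_user()
```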
Cheers

Related

academictwitteR get_user_timeline Error 400 API v2

I am trying to get the full timeline of twitter user through the academictwitteR package and the twitter academic track API (v2). However, I get an error 400. Unfortunately, the explanation here https://developer.twitter.com/en/support/twitter-api/error-troubleshooting
says "The request was invalid or cannot be otherwise served. An accompanying error message will explain further. Requests without authentication or with invalid query parameters are considered invalid and will yield this response.", but the error code does not contain any explanation. I do not know what I am doing wrong. I tried other R packages as well, same error. I have access to all other functions of the API, but cannot adjust the timeline to download either more than 3200 tweets at a time (as my access should allow me to do). I want to full timeline of this user, not just the newest 3200 tweets.
The errorcode looks like that:
Error in make_query(url = endpoint_url, params = params, bearer_token = bearer_token, :
something went wrong. Status code: 400
This is my request (I tried using the user ID, it does not change anything).
tmlne <- get_user_timeline("25390350",
                           start_tweets = "2009-03-19T00.00.00Z",
                           end_tweets = "2021-11-27T00.00.00Z",
                           bearer_token = get_bearer(),
                           data_path = "twitter/data/timelines/",
                           n = 100000,
                           bind_tweets = F,
                           file = "tmlnw")
I tried revoking and renewing the bearer token. I tested the token by downloading all tweets from a user (not what I am after, but just to see whether my access works), and that works, but the timelines do not. I can get 3,200 tweets with the get_timelines_v2 function of the TwitterV2 package, but I cannot circumvent the 3,200-tweet limit, I do not know how to change the request to get tweets older than the most recent 3,200, and I cannot get the academictwitteR package to run the timeline function (with it, I at least know how to make multiple requests).
What would help me is either information regarding the Error 400 or a way to adjust the get_timelines_v2 function to include older tweets.
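One detail worth checking (my observation on the snippet above, not something confirmed in the thread): the timestamps use periods in the time-of-day part, while the Twitter API expects ISO 8601 with colons (`YYYY-MM-DDTHH:MM:SSZ`). A malformed query parameter is exactly the kind of thing that produces a 400. The corrected call would look like:

```r
# Same request, but with colons instead of periods in the timestamps
tmlne <- get_user_timeline("25390350",
                           start_tweets = "2009-03-19T00:00:00Z",
                           end_tweets   = "2021-11-27T00:00:00Z",
                           bearer_token = get_bearer(),
                           data_path    = "twitter/data/timelines/",
                           n            = 100000,
                           bind_tweets  = FALSE,
                           file         = "tmlnw")
```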

Gmapsdistance package in R error "You must use an API key to authenticate each request to Google Maps Platform APIs."

I have an R script that runs a bunch of different points for commute time and distance.
I have a Google API key with billing enabled on the account, but the run gets shut down at line 262 every time. I have tried:
restructuring the code,
refreshing the API,
setting the API key in different places.
The code works perfectly up to that point, and it is connecting with Google, since the requests show up in the API console.
emp_commute$CommuteTime[i] <- gmapsdistance(origin = emp_commute$HomeComplete[i],
                                            destination = emp_commute$WorkComplete[i],
                                            mode = "driving",
                                            key = "",
                                            arr_date = "2019-11-13",
                                            arr_time = emp_commute$ArrivalTime[i])$Time[1]
Error in gmapsdistance(origin = emp_commute$HomeComplete[i],
destination = emp_commute$WorkComplete[i], : Google API returned
an error: You must use an API key to authenticate each request to
Google Maps Platform APIs. For additional information, please refer to
http://g.co/dev/maps-no-account
Googled and googled, just would love some advice!
This package works without problems for me (I used RStudio Cloud). Here's what I did.
First I installed gmapsdistance:
install.packages("gmapsdistance")
Then I ran the following code with my API key (set in the key= parameter):
library("gmapsdistance")
origin <- c("40.431478+-80.0505401", "33.7678359+-84.4906438")
destination <- c("43.0995629+-79.0437609", "41.7096483+-86.9093986")
results <- gmapsdistance(origin, destination, mode = "driving", key = "abcd",
                         arr_date = "2019-11-13", arr_time = "09:00:00")
results
The response was:
$Time
                         Time.43.0995629+-79.0437609 Time.41.7096483+-86.9093986
1 40.431478+-80.0505401                        13878                       23071
2 33.7678359+-84.4906438                       49402                       38351
[...]
Using set.api.key("") also worked fine. For testing purposes I recommend that you try my exact steps and code above. Also double-check the following:
Billing and the Distance Matrix API are enabled on your project (go through Google's setup guide).
Your values for origin, destination and arr_time are valid (try hard-coding them first).
Hope this helps.

Rcrawler - How to crawl account/password protected sites?

I am trying to crawl and scrape a website's tables. I have an account with the website, and I found out that Rcrawler could help me get parts of the table based on specific keywords, etc. The problem is that the GitHub page makes no mention of how to crawl a site with account/password protection.
An example for signing in would be below:
login <- list(username = "username", password = "password")
Do you have any idea if Rcrawler has this functionality? For example something like:
Rcrawler(Website = "http://www.glofile.com" +
list (username = "username", password = "password" + no_cores = 4, no_conn = 4, ExtractCSSPat = c(".entry-title",".entry-content"), PatternsNames = c("Title","Content"))
I'm confident my code above is wrong, but I hope it gives you an idea of what I want to do.
To crawl or scrape password-protected websites in R (more precisely, HTML-based authentication), you need a web driver to simulate a login session. Fortunately, this is possible since Rcrawler v0.1.9, which implements the phantomjs web driver (a browser without a graphical interface).
The following example tries to log in to a blog website.
library(Rcrawler)
Download and install the web driver:
install_browser()
Start a browser session:
br<- run_browser()
If you get an error, disable your antivirus temporarily or allow the program in your system settings.
Run an automated login action; it returns a logged-in session if successful:
br <- LoginSession(Browser = br, LoginURL = 'http://glofile.com/wp-login.php',
                   LoginCredentials = c('demo', 'rc#pass#r'),
                   cssLoginFields = c('#user_login', '#user_pass'),
                   cssLoginButton = '#wp-submit')
Finally, if you already know the private pages you want to scrape/download, use:
DATA <- ContentScraper(... , browser =br)
Or simply crawl/scrape/download all pages:
Rcrawler(Website = "http://glofile.com/",no_cores = 1 ,no_conn = 1,LoggedSession = br ,...)
Don't use many parallel workers (no_cores/no_conn), as many websites reject multiple sessions from one user.
Stay legit and honor robots.txt by setting Obeyrobots = TRUE.
You can also access the browser's functions directly, for example:
br$session$getUrl()
br$session$getTitle()
br$session$takeScreenshot(file = "image.png")
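When you are done, it is good practice to close the headless browser process (a sketch assuming the `stop_browser()` helper that pairs with `run_browser()` in recent Rcrawler versions):

```r
# Shut down the phantomjs process started by run_browser()
stop_browser(br)
```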

Rfacebook package: "Unsupported get request" for the getPage method

I am trying to extract all the posts from this year from a Facebook page using the Rfacebook package.
However, for the page that I need I get this error:
"Error in callAPI(url = url, token = token) :
Unsupported get request. Object with ID 'XXXXX' does not exist,
cannot be loaded due to missing permissions, or does not support this operation.
Please read the Graph API documentation at https://developers.facebook.com/docs/graph-api"
This is the command that I used:
datafb <- getPage('XXXXX', token, n = 1000, since = '2017/01/01', until = '2017/04/01',
feed = TRUE)
I am sure the page exists, because I can access it from my Facebook account.
Also, the token is valid because it works when I try for other pages.
I really can't see what's wrong. Does anyone have any idea?

How can I allow new R users to send information to a Google Form?

(RSelenium requires a bit of setup, at least for headless browsing, so in my opinion it's not the best candidate, but I may be missing something that makes it the best choice.)
I have some new R users I want to get responses from interactively and send to a secure location. I have chosen Google Forms to pass the information to, as it allows one way sends of the info and doesn't allow the user access to the spreadsheet that is created from the form.
Here's a url of this form:
url <- "https://docs.google.com/forms/d/1tz2RPftOLRCQrGSvgJTRELrd9sdIrSZ_kxfoFdHiqD4/viewform"
To give context here's how I'm using R to interact with the user:
question <- function(message, opts = c("Yes", "No")){
  message(message)
  ans <- menu(opts)
  if (ans == 2) FALSE else TRUE
}
question("Was this information helpful?")
I want to then send that TRUE/FALSE to the Google form above. How can I send a response to the Google Form above from within R in a way that I can embed in code the user will interact with and doesn't require difficult set up by the user?
Add on R packages are fine if they accomplish the task.
You can send a POST request to the form's formResponse endpoint. Here is an example using the httr package:
library(httr)
send_response <- function(response){
  form_url <- "https://docs.google.com/forms/d/1tz2RPftOLRCQrGSvgJTRELrd9sdIrSZ_kxfoFdHiqD4/formResponse"
  POST(form_url,
       query = list(`entry.1651773982` = response))
}
Then you can call it:
send_response(question("Was this information helpful?"))
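As a small extension (the status check is my addition, not part of the original answer), you can inspect the HTTP status code returned by `POST()` to confirm that the submission was accepted:

```r
library(httr)

resp <- send_response(question("Was this information helpful?"))

# Google Forms answers accepted submissions with HTTP 200
if (status_code(resp) == 200) {
  message("Response recorded")
} else {
  warning("Submission may have failed; status ", status_code(resp))
}
```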
