I've been trying to authenticate with Reddit from R using RCurl, based on this example from Reddit's GitHub:
curl -X POST -d 'grant_type=password&username=reddit_bot&password=snoo' --user 'p-jcoLKBynTLew:gko_LXELoV07ZBNUXrvWZfzE3aI' https://ssl.reddit.com/api/v1/access_token
I've tried to convert it to an RCurl command like so:
postForm("https://ssl.reddit.com/api/v1/access_token?grant_type=password",
username = "MyUserName",
password = "MyPassword",
.opts = list(userpwd = "MyClientid:MySecret")
)
But I get an error: Error: Unauthorized
I'm not really sure what I'm doing with the conversion of the curl command to RCurl. Thanks for any help you can provide!
Try this httr code:
library(httr)
POST("https://ssl.reddit.com/api/v1/access_token",
body = list(
grant_type = "password",
username = "MyUserName",
password = "MyPassword"
),
encode = "form",
authenticate("p-jcoLKBynTLew", "gko_LXELoV07ZBNUXrvWZfzE3aI")
)
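If the credentials are accepted, Reddit sends the token back as JSON. A minimal sketch of pulling the access_token field out of the response, assuming the result of the POST() call above has been saved in a variable, say res:
# res <- POST(...) as above, with the call's result assigned to a variable
stop_for_status(res)                # raises an R error if Reddit rejected the request
token <- content(res)$access_token  # the OAuth token is in the JSON body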
Related
I'm trying to get some appointment data out of practice management software. I have an API key, but I have no experience in this area.
I have tried to convert the curl code with little success. The API documentation is here: https://github.com/redguava/cliniko-api
I am trying to convert this curl code:
curl https://api.cliniko.com/v1/appointments \
-u API_KEY: \
-H 'Accept: application/json' \
-H 'User-Agent: APP_VENDOR_NAME (APP_VENDOR_EMAIL)'
What I've tried (yes, this is from a curl-to-R converter):
require(httr)
headers = c(
  `Accept` = 'application/json',
  `User-Agent` = 'APP_VENDOR_NAME (APP_VENDOR_EMAIL)'
)
res <- httr::GET(url = 'https://api.cliniko.com/v1/appointments',
                 httr::add_headers(.headers = headers),
                 httr::authenticate('API_KEY', 'INSERTED MY API KEY'))
Any ideas would be greatly appreciated.
httr::authenticate() takes the username and password as two separate arguments, in the form httr::authenticate(username, password).
curl's -u option takes the username and password joined by a :, i.e. username:password.
In the example from the API documentation, the curl command authenticates with the username:password combination API_KEY:. Looking closely, the part after the : is blank, so the username should be your API key and the password should be an empty string ''.
So you should change your R code to:
require(httr)
headers = c(
  `Accept` = 'application/json',
  `User-Agent` = 'APP_VENDOR_NAME (APP_VENDOR_EMAIL)'
)
res <- httr::GET(url = 'https://api.cliniko.com/v1/appointments',
                 httr::add_headers(.headers = headers),
                 httr::authenticate('API_KEY', ''))
Here 'API_KEY' is the API key you were provided.
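If the key is accepted, the appointments come back as JSON. A rough sketch of checking and parsing the result, reusing the res object from the code above:
httr::stop_for_status(res)          # raise an R error if the API returned 401/403/etc.
appointments <- httr::content(res)  # parse the JSON body into an R list
str(appointments, max.level = 1)    # quick look at the top-level structure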
I'm trying to make the following request:
library("RCurl")
getURL("https://api.example.com/resource",
userpwd ="username:password",param="ApplicationID")
But I get this error:
The required parameter 'Username' was not found.
Any ideas?
Try this:
opts = curlOptions(userpwd = "user:pwd", param = "AppID")
getURL("url", .opts = opts)
I am trying to log in to a website ("dkurl" below) and then download a zip file ("url" below). Following other answers using RCurl, I have attempted to use the code below; however, I cannot get the file to download. Are there other parameters or commands I am missing?
url <- 'http://www.draftkings.com/contest/exportfullstandingscsv/40827113'
dkurl <- 'https://www.draftkings.com/account/sitelogin/'
pars = list(username = "xxx", password = "xxx")
agent = "Mozilla/5.0"
curl = getCurlHandle()
curlSetOpt(cookiejar="", useragent = agent, followlocation = TRUE, curl=curl)
html=postForm(dkurl, .params=pars, curl=curl)
html=getURL(url, curl=curl)
It's quite convenient to download files using the httr package, like this:
library(httr)
GET(fileUrl, authenticate(user, password),
    write_disk(filename), timeout(60))
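For the login-then-download case in the question, the same idea extends to posting the credentials first and then fetching the file with the cookies kept on a shared handle. This is only a rough sketch; the real field names on the DraftKings login form may well differ, and the login may require more than a plain form post:
library(httr)

h <- handle("https://www.draftkings.com")  # one handle so cookies persist across requests

# field names "username"/"password" are assumptions; inspect the login form to confirm
POST("https://www.draftkings.com/account/sitelogin/",
     body = list(username = "xxx", password = "xxx"),
     encode = "form",
     user_agent("Mozilla/5.0"),
     handle = h)

GET("http://www.draftkings.com/contest/exportfullstandingscsv/40827113",
    write_disk("standings.csv", overwrite = TRUE),
    user_agent("Mozilla/5.0"),
    handle = h)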
I am trying to write code that will allow me to download a .xls file from a secured https website that requires a login. This is very difficult for me, as I have no experience with web coding; all my R experience comes from econometric work with readily available datasets.
I followed this thread to help write some code, but I think I'm running into trouble because the example is http and I need https.
This is my code:
install.packages("RCurl")
library(RCurl)
curl = getCurlHandle()
curlSetOpt(cookiejar = 'cookies.txt', followlocation = TRUE, autoreferer = TRUE, curl = curl)
html <- getURL('https://jump.valueline.com/login.aspx', curl = curl)
viewstate <- as.character(sub('.*id="_VIEWSTATE" value="([0-9a-zA-Z+/=]*).*', '\\1', html))
params <- list(
  'ct100$ContentPlaceHolder$LoginControl$txtUserID' = 'MY USERNAME',
  'ct100$ContentPlaceHolder$LoginControl$txtUserPw' = 'MY PASSWORD',
  'ct100$ContentPlaceHolder$LoginControl$btnLogin' = 'Sign In',
  '_VIEWSTATE' = viewstate)
html <- postForm('https://jump.valueline.com/login.aspx', .params = params, curl = curl)
When I get to running the piece that starts with html <- getURL(...), I get:
> html <- getURL('https://jump.valueline.com/login.aspx', curl = curl)
Error in function (type, msg, asError = TRUE) :
SSL certificate problem: unable to get local issuer certificate
Is there a workaround for this? How am I able to access the local issuer certificate?
I read that adding .opts = list(ssl.verifypeer = FALSE) to the curlSetOpt call would remedy this, and when I add that, the getURL runs, but then the postForm line gives me:
> html <- postForm('https://jump.valueline.com/login.aspx', .params = params, curl = curl)
Error: Internal Server Error
Besides that, does this code look like it will work given the website I am trying to access? I went into the inspector and changed all the params to be correct for my webpage, but since I'm not well versed in web coding I'm not 100% sure I caught the right things (particularly the VIEWSTATE). Also, is there a better, more efficient way I could approach this?
Automating this process would be huge for me, so your help is greatly appreciated.
Try httr:
library(httr)
html <- content(GET('https://jump.valueline.com/login.aspx'), "text")
viewstate <- as.character(sub('.*id="_VIEWSTATE" value="([0-9a-zA-Z+/=]*).*', '\\1', html))
params <- list(
  'ct100$ContentPlaceHolder$LoginControl$txtUserID' = 'MY USERNAME',
  'ct100$ContentPlaceHolder$LoginControl$txtUserPw' = 'MY PASSWORD',
  'ct100$ContentPlaceHolder$LoginControl$btnLogin' = 'Sign In',
  '_VIEWSTATE' = viewstate
)
POST('https://jump.valueline.com/login.aspx', body = params)
That still gives me a server error, but that's probably because you're not sending the right fields in the body.
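One other thing that may be worth trying: ASP.NET login pages generally expect a URL-encoded body rather than httr's default multipart encoding, so adding encode = "form" could help (the field names are still the ones assumed in the question):
POST('https://jump.valueline.com/login.aspx', body = params, encode = "form")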
html <- getURL('https://jump.valueline.com/login.aspx', curl = curl, ssl.verifypeer = FALSE)
This should work for you. The error you're getting is probably because libcurl doesn't know where to look for a CA certificate to verify the SSL connection.
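Rather than switching verification off entirely, you can also point RCurl at a CA bundle so the certificate can be verified; a minimal sketch using the bundle that ships with RCurl:
library(RCurl)

# point libcurl at a CA certificate bundle instead of disabling verification
cainfo <- system.file("CurlSSL", "cacert.pem", package = "RCurl")
html <- getURL('https://jump.valueline.com/login.aspx', curl = curl, cainfo = cainfo)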
I am new to using R to post forms and then download data off the web. Someone out there can probably easily spot what I am doing wrong, so I appreciate your patience. I have a Win7 PC, and Firefox 23.x is my typical browser.
I am trying to post the main form that shows up on
http://www.aplia.com/
I have the following R script:
your.username <- 'username'
your.password <- 'password'
setwd( "C:/Users/Desktop/Aplia/data" )
require(SAScii)
require(RCurl)
require(XML)
agent="Firefox/23.0"
options(RCurlOptions = list(cainfo = system.file("CurlSSL", "cacert.pem", package = "RCurl")))
curl = getCurlHandle()
curlSetOpt(
  cookiejar = 'cookies.txt',
  useragent = agent,
  followlocation = TRUE,
  autoreferer = TRUE,
  curl = curl
)
# list parameters to pass to the website (pulled from the source html)
params <-
  list(
    'userAgent' = agent,
    'screenWidth' = "",
    'screenHeight' = "",
    'flashMajor' = "",
    'flashMinor' = "",
    'flashBuild' = "",
    'flashPatch' = "",
    'redirect' = "",
    'referrer' = "http://www.aplia.com",
    'txtEmail' = your.username,
    'txtPassword' = your.password
  )
# logs into the form
html = postForm('https://courses.aplia.com/', .params = params, curl = curl)
html
# download a file once form is posted
html <-
  getURL(
    "http://courses.aplia.com/af/servlet/mngstudents?ctx=filename",
    curl = curl
  )
html
But from there I can tell that I am not getting the page I want; what is returned into html is a redirect message that appears to be asking me to log in again (?):
"\r\n\r\n<html>\r\n<head>\r\n <title>Aplia</title>\r\n\t<script language=\"JavaScript\" type=\"text/javascript\">\r\n\r\n top.location.href = \"https://courses.aplia.com/af/servlet/login?action=form&redirect=%2Fservlet%2Fmngstudents%3Fctx%3Dfilename\";\r\n \r\n\t</script>\r\n</head>\r\n<body>\r\n Click here to continue.\r\n</body>\r\n</html>\r\n"
I do believe there is a series of redirects that occur once the form is posted successfully (manually, in a browser). How can I tell whether the form was posted correctly?
I am quite sure that once I can get the post working correctly, I won't have a problem directing R to download the files I need (online activity reports for each of my 500 students this semester). But I have spent several hours working on this and got stuck. Maybe I need to set more options in the RCurl package that have to do with cookies (the site does use cookies)?
Any help is much appreciated! I typically use R to handle statistical data, so I am new to these packages and functions.
The answer ends up being very simple. For some reason, I didn't see one option that needs to be included in postForm:
html = postForm('https://courses.aplia.com/', .params = params, curl = curl, style="POST")
And that's it. Without style = "POST", postForm submits the fields as a multipart/form-data request; style = "POST" sends an ordinary application/x-www-form-urlencoded body, which is what this login form expects.
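As for telling whether the login worked (the question above asked exactly that), one rough check is whether a page fetched afterwards still comes back as the redirect to the login servlet seen earlier; a sketch:
html <- getURL("http://courses.aplia.com/af/servlet/mngstudents?ctx=filename", curl = curl)
# if the login succeeded, the response should no longer be the tiny page that
# redirects to .../servlet/login
grepl("servlet/login", html, fixed = TRUE)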