I can access API data using curl in a terminal, but I get an error when I use R. Looking for some advice.
Curl command:
curl -H "token:jQyrbzexeCEaWFIDBAwCWqbrkrVQTVhM" "https://www.ncdc.noaa.gov/cdo-web/api/v2/datasets"
R command:
library(jsonlite)  # for fromJSON()
books_key <- "&token=jQyrbzexeCEaWFIDBAwCWqbrkrVQTVhM"
url <- "https://www.ncdc.noaa.gov/cdo-web/api/v2/datasets"
req <- fromJSON(paste0(url, books_key))
Error in open.connection(con, "rb") : HTTP error 400.
From other similar questions, this error usually appears when there are spaces in the URL, but in my case there are none.
There is some info on using the token on the website: https://www.ncdc.noaa.gov/cdo-web/webservices/v2#gettingStarted
It's not an R issue, because the following example against another website's API works:
movie_key <- "&api-key=b75da00e12d54774a2d362adddcc9bef"
url <- "http://api.nytimes.com/svc/movies/v2/reviews/dvd-picks.json?order=by-date"
req <- fromJSON(paste0(url, movie_key))
You're supposed to pass the token as a header, not a query parameter, e.g. with the crul package:
cli <- crul::HttpClient$new(
  url = "https://www.ncdc.noaa.gov",
  headers = list(token = "yourtoken")
)
cli$get(path = "cdo-web/api/v2/datasets")
A quick look at your curl command and your R call: in curl you're passing the token as a header, while in the R call it appears as a query variable.
The NYT call, by contrast, already has a query parameter in the URI, so appending with & is appropriate there.
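If you'd rather keep using fromJSON(), a minimal sketch of the same header-based approach with the httr package ("yourtoken" is a placeholder):
library(httr)
library(jsonlite)
# send the token as a header, mirroring curl -H "token:..."
res <- GET("https://www.ncdc.noaa.gov/cdo-web/api/v2/datasets",
           add_headers(token = "yourtoken"))
stop_for_status(res)  # fail loudly on a 4xx/5xx status
req <- fromJSON(content(res, as = "text", encoding = "UTF-8"))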
I'm trying to implement R in the workplace and save a bit of time from all the data churning we do.
A lot of files we receive are sent to us via SFTP as they contain sensitive information.
I've looked around on StackOverflow and Google, but nothing seems to work for me. I tried using the RCurl library from an example I found online, but it doesn't let me include the port (22) as part of the login details.
library(RCurl)
protocol <- "sftp"
server <- "hostname"
userpwd <- "user:password"
tsfrFilename <- "/Reports/Excelfile.xlsx"  # leading slash so paste0 builds a valid URL
outFilename <- "~/Test.xlsx"
url <- paste0(protocol, "://", server, tsfrFilename)
data <- getURL(url = url, userpwd = userpwd)
I end up getting the error code
Error in curlPerform(curl = curl, .opts = opts, .encoding = .encoding) :
embedded nul in string:
Any help would be greatly appreciated as this will save us loads of time!
Thanks,
Shan
Looks like a similar situation here: Using R to download SAS file from ftp-server
I'm no expert in R, but there it looks like getBinaryURL() worked instead of getURL() in the example given.
Hope that helps
M
Note that there are two similarly named packages, RCurl and curl; the following uses RCurl. With RCurl, I successfully used key files to connect via SFTP:
opts <- list(
  ssh.public.keyfile = pubkey,       # file name of the public key
  ssh.private.keyfile = privatekey,  # file name of the private key
  keypasswd = keypasswd              # optional passphrase
)
RCurl::getURL(url = uri, .opts = opts, curl = RCurl::getCurlHandle())
For this to work, you need to create the key files first, e.g. via PuTTY or similar.
I too was having problems specifying the port options when using the getURI() and getURL() functions.
In order to specify the port, you simply add the port as port = #### instead of port(####). For example:
data <- getURI(url = url,
               userpwd = userpwd,
               port = 22)
Now, as @MarkThomas pointed out, whenever you get an encoding error, try getBinaryURL() instead of getURI(). In most cases this will let you download SAS files as well as .csv files encoded in UTF-8 or Latin-1.
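Putting the pieces together, a minimal sketch of the binary SFTP download (hostname, path, and credentials are the placeholders from the question; writeBin() then saves the raw bytes to disk):
library(RCurl)
url <- "sftp://hostname/Reports/Excelfile.xlsx"
userpwd <- "user:password"
# getBinaryURL() avoids the "embedded nul in string" error that
# getURL() raises on binary content such as .xlsx files
raw_data <- getBinaryURL(url = url, userpwd = userpwd, port = 22)
writeBin(raw_data, "~/Test.xlsx")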
The webpage I am trying to access downloads an attachment, but the HTTP status is always 500, which I have checked through Postman.
I am using download.file with the wget method in R to download that file, but it stops and doesn't download because of the 500 status. How can I download the file irrespective of the status code?
When I access this link in browser, the file is downloaded just fine.
Edit:
Here's how I use the download.file function in R:
download.file(get(url), destfile = tmpFile, method = "wget")
I tried to pass the extra argument like:
download.file(get(url), destfile = tmpFile, method = "wget", extra = getOption("content-on-error"))
and
download.file(get(url), destfile = tmpFile, method = "wget", extra = getOption("--content-on-error=0"))
but it doesn't work
Here, get(url) points to this link (I have replaced the actual key with KEY):
https://app.adroll.com/api/v1/export/all_campaigns_report?advertisable=__KEY__&reports=AllCampaignsSummary,AllCampaignsChart,AllAds,AllCampaignsSites&start_date=01-01-2013&end_date=10-05-2016&format=csv&currency=USD
message on trying to connect:
Resolving app.adroll.com... 52.11.56.178, 52.89.249.63
Connecting to app.adroll.com|52.11.56.178|:443... connected.
HTTP request sent, awaiting response... 500 Internal Server Error
2016-10-06 16:22:21 ERROR 500: Internal Server Error.
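(For what it's worth, download.file()'s extra argument takes literal command-line flags for the chosen method as a string, not getOption(); a sketch that may save the response body despite the 500, assuming a GNU wget build that supports --content-on-error:)
download.file(get(url), destfile = tmpFile, method = "wget",
              extra = "--content-on-error")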
Hi, I'm trying to write an R script where I POST data to a specific web link and get the response back. I'm not really sure how to set the curl headers, such as Content-Type, for the request.
This is what I have so far:
library("RCurl")
httpPOST("http://localhost", '{"title":"ello World!"}')
I get the following error:
Error in curlOptions(..., .opts = .opts) : unnamed curl option(s):
....
If I use curl from the command line, the data does get posted.
Is there another R library that handles curl and JSON POSTs better?
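One option is the httr package, sketched below; the URL and JSON body are the placeholders from the question, and content_type_json() sets the Content-Type header:
library(httr)
res <- POST("http://localhost",
            body = '{"title":"ello World!"}',
            content_type_json())  # sends Content-Type: application/json
content(res)  # the parsed response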
I just started playing around with the Twitter Streaming API and, using the command line, redirect the raw JSON responses to a file with the command below:
curl https://stream.twitter.com/1/statuses/sample.json -u USER:PASSWORD -o "somefile.txt"
Is it possible to stay completely within R and leverage RCurl to do the same thing? Instead of just saving the output to a file, I would like to parse each response that is returned. I have parsed twitter search results in the past, but I would like to do this as each response is received. Essentially, apply a function to each JSON response.
Thanks in advance.
EDIT: Here is the code that I have tried in R (I am on Windows, unfortunately). I need to include the reference to the .pem file to avoid the error. However, the code just "runs" and I cannot seem to see what is returned. I have tried print, cat, etc.
download.file(url="http://curl.haxx.se/ca/cacert.pem", destfile="cacert.pem")
getURL("https://stream.twitter.com/1/statuses/sample.json",
userpwd="USER:PWD",
cainfo = "cacert.pem")
I was able to figure out the basics, hopefully this helps.
#==============================================================================
# Streaming twitter using RCURL
#==============================================================================
library(RCurl)
library(rjson)
# set the directory
setwd("C:\\")
#### redirects output to a file
WRITE_TO_FILE <- function(x) {
if (nchar(x) >0 ) {
write.table(x, file="Twitter Stream Capture.txt", append=T, row.names=F, col.names=F)
}
}
### windows users will need to get this certificate to authenticate
download.file(url="http://curl.haxx.se/ca/cacert.pem", destfile="cacert.pem")
### write the raw JSON data from the Twitter firehose to a text file
getURL("https://stream.twitter.com/1/statuses/sample.json",
userpwd=USER:PASSWORD,
cainfo = "cacert.pem",
write=WRITE_TO_FILE)
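To apply a function to each response rather than writing it to disk, swap in a different callback. A rough sketch (it assumes each chunk handed to the callback is one complete JSON object, which the streaming API does not strictly guarantee, so real code would need to buffer partial lines):
### parse each chunk as it arrives and apply a function to it
PARSE_TWEET <- function(x) {
  if (nchar(x) > 0) {
    tweet <- try(rjson::fromJSON(x), silent = TRUE)
    if (!inherits(tweet, "try-error") && !is.null(tweet$text)) {
      print(tweet$text)  # replace print() with any per-tweet function
    }
  }
}
getURL("https://stream.twitter.com/1/statuses/sample.json",
       userpwd = "USER:PASSWORD",
       cainfo = "cacert.pem",
       write = PARSE_TWEET)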
Try the twitteR package, an R client for the Twitter API.
install.packages('twitteR')
library(twitteR)
I think this is what you need.
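For example, a quick sketch (note that twitteR wraps the REST API rather than the streaming API, and current versions need OAuth set up first; the search term is just an illustration):
# setup_twitter_oauth(consumer_key, consumer_secret, access_token, access_secret)
tweets <- searchTwitter("#rstats", n = 25)  # returns a list of status objects
sapply(tweets, function(t) t$getText())     # apply a function to each tweet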
I am trying to use the getURL function of the RCurl package to access an ASP webpage:
my_url <- "http://www.my_site.org/my_site/main.asp?ID=11&REFID=33"
webpage <- getURL(my_url)
but I get an "Object Moved" redirection error message like:
"<head><title>Object moved</title></head>\n<body><h1>Object Moved</h1>
This object may be found here.</body>\n"
I followed various suggestions, like using the curlEscape URL-encoding function or setting the CURLOPT_FOLLOWLOCATION and CURLOPT_SSL_VERIFYHOST parameters via the curlSetOpt function as listed in the php ssl curl : object moved error link, but the latter two were not recognized as valid RCurl options.
Any suggestions on how to overcome this issue?
Use the followlocation curl option:
getURL(u, .opts = curlOptions(followlocation = TRUE))
With added cookiefile goodness; it's supposed to be a file that doesn't exist, but I'm not sure how you can be sure of that:
w <- getURL(u, .opts = curlOptions(followlocation = TRUE, cookiefile = "nosuchfile"))
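(One way to guarantee a file that doesn't exist, sketched here: tempfile() returns a path that hasn't been created yet, so curl gets an empty cookie jar without touching a real file:)
w <- getURL(u, .opts = curlOptions(followlocation = TRUE, cookiefile = tempfile()))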