I have been using the WHO package in R to pull data from the WHO database without problems for the last several weeks. Yesterday I found that I could no longer do it. I reproduced the error on different machines with different R versions, running from R itself and from RStudio, on Mac and Windows alike...
Here is an example with two of the variables I want to request:
library(WHO)
socio_econ <- c("WHS7_143", "WHS9_95")
SECON <- lapply(socio_econ, get_data)
Here's the error:
Error in get_result(url) : Internal Server Error (HTTP 500).
Not an answer, just showing a debugging method for the OP:
httr::with_verbose(get_data("WHS7_143"))
-> GET /gho/athena/api/GHO/WHS7_143?format=json&profile=simple HTTP/1.1
-> Host: apps.who.int
-> User-Agent: libcurl/7.51.0 r-curl/2.0 httr/1.2.1
-> Accept-Encoding: gzip, deflate
-> Cookie: TS01ac0ef4=015dd60f3e63259629be28ff562fb98a7b99c500697d6a49e2671ad07b50034231788b7dd97944f7f6fd363c9ef2b32a1a34c37a22
-> Accept: application/json, text/xml, application/xml, */*
->
<- HTTP/1.1 500 Internal Server Error
<- Date: Mon, 01 May 2017 20:46:40 GMT
<- Content-Type: text/html;charset=utf-8
<- Content-Length: 1298
<- Via: 1.1 ghodata.who.int
<- Connection: close
<- Set-Cookie: TS01ac0ef4=015dd60f3e63259629be28ff562fb98a7b99c500697d6a49e2671ad07b50034231788b7dd97944f7f6fd363c9ef2b32a1a34c37a22; Path=/
<-
Error in get_result(url) : Internal Server Error (HTTP 500).
It's definitely something happening server-side, but it could be that the API changed and the package has not been updated yet.
Hitting the inferred URL directly suggests it's a Java server error on their end.
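Until the server is fixed, one workaround is to keep the batch going when a single indicator fails. This is a hedged sketch (the `safe_call` wrapper is a generic helper introduced here, not part of the WHO package): errors from `get_data()` are converted to `NULL` and the failed indicators are dropped afterwards.

```r
# Generic error-tolerant wrapper: returns NULL instead of aborting on error.
safe_call <- function(f, ...) {
  tryCatch(f(...), error = function(e) {
    message("Request failed: ", conditionMessage(e))
    NULL
  })
}

# With the WHO package loaded, the original batch becomes:
#   SECON <- lapply(socio_econ, function(code) safe_call(get_data, code))
#   SECON <- Filter(Negate(is.null), SECON)  # keep only the indicators that worked
```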
I'm trying to upload large files to SharePoint Online directly through R using the Microsoft Graph API. To this end, I'm using the API's createUploadSession, documentation here.
My code looks like this:
httr::PUT(url = "https://graph.microsoft.com/v1.0/sites/.../createUploadSession",
          headers = add_headers(.headers = headers),
          body = httr::upload_file(file),
          encode = mime::guess_type(file),
          verbose())
(where headers includes authentication and the host name, here graph.microsoft.com)
And the resultant request looks like this:
-> PUT /v1.0/sites/.../createUploadSession HTTP/1.1
-> Host: graph.microsoft.com
-> User-Agent: libcurl/7.64.1 r-curl/4.3 httr/1.4.2
-> Accept-Encoding: deflate, gzip
-> Accept: application/json, text/xml, application/xml, */*
-> Content-Type: text/plain
-> Content-Length: 4543954542
Of course, this fails:
<- HTTP/1.1 413 Request Entity Too Large
<- Content-Type: text/html
<- Server: Microsoft-IIS/10.0
<- Strict-Transport-Security: max-age=31536000
<- Date: Fri, 02 Oct 2020 12:32:29 GMT
<- Connection: close
<- Content-Length: 67
<-
since, as the documentation says, we need to upload in 327,680-byte chunks. However, I was under the assumption that httr's upload_file allows for streaming. This is where I'm stuck: it looks like my request is still trying to upload everything at once, so how do I invoke this streaming behavior? And is some kind of while loop required to keep sending the next chunk of data?
This functionality is now available in the Microsoft365R package. Full disclosure: I'm the author of this package.
library(Microsoft365R)
site <- get_sharepoint_site("sitename")
site$get_drive()$upload_file("path/to/file", "destfilename")
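For anyone who still wants to drive the upload session by hand with httr: createUploadSession returns an uploadUrl, and each chunk is then PUT to that URL with a Content-Range header, in multiples of 327,680 bytes as the Graph documentation requires. This is a hedged sketch, not tested against Graph; `chunk_ranges()` and `upload_in_chunks()` are helpers introduced here.

```r
# Compute the Content-Range header value for each chunk of a file_size-byte file.
chunk_ranges <- function(file_size, chunk_size = 327680 * 32) {
  starts <- seq(0, file_size - 1, by = chunk_size)
  ends   <- pmin(starts + chunk_size - 1, file_size - 1)
  sprintf("bytes %.0f-%.0f/%.0f", starts, ends, file_size)
}

# PUT the file chunk by chunk to the uploadUrl returned by createUploadSession.
upload_in_chunks <- function(upload_url, file, chunk_size = 327680 * 32) {
  size <- file.size(file)
  con  <- file(file, "rb")
  on.exit(close(con))
  resp <- NULL
  for (range in chunk_ranges(size, chunk_size)) {
    chunk <- readBin(con, "raw", n = chunk_size)
    resp  <- httr::PUT(upload_url,
                       httr::add_headers(`Content-Range` = range),
                       body = chunk)
  }
  resp  # on success, the final PUT should return the driveItem metadata
}
```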
Closed. This question is not reproducible or was caused by typos. It is not currently accepting answers.
Closed 5 years ago.
I am running the following code to send a batch file through the Census Geocoder API. I looked at this question and the documentation for the API, but neither helped. I expect to get back data with information about the addresses, not a 404 error. I seem to be sending the data okay but am unable to retrieve the result. Please help me figure out why I am getting the error. Thank you!
require(httr)
req <- POST("http://geocoding.geo.census.gov/geocoder/geographies/addressbatch",
            body = list(
              addressFile = upload_file("addresses.csv"),
              benchmark = "Public_AR_Census2010",
              vintage = "Census2010_Census2010"
            ),
            encode = "multipart",
            verbose())
stop_for_status(req)
content(req)
I am getting the following output
-> POST /geocoder/geographies/addressbatch HTTP/1.1
-> Host: geocoding.geo.census.gov
-> User-Agent: libcurl/7.54.1 r-curl/2.8.1 httr/1.3.1
-> Accept-Encoding: gzip, deflate
-> Accept: application/json, text/xml, application/xml, */*
-> Content-Length: 615
-> Content-Type: multipart/form-data; boundary=------------------------c0a7880f53fb0ca4
->
>> --------------------------c0a7880f53fb0ca4
>> Content-Disposition: form-data; name="addressFile"; filename="addresses.csv"
>> Content-Type: text/csv
>>
>> "Unique_ID","Street address","City","State","ZIP"
>> 1,"125 Worth Street","New York","NY","10013"
>> 2,"258 Broadway","New York","NY","10007"
>> 3,"8 Centre Street","New York","NY","10007"
>>
>> --------------------------c0a7880f53fb0ca4
>> Content-Disposition: form-data; name="benchmark"
>>
>> Public_AR_Census2010
>> --------------------------c0a7880f53fb0ca4
>> Content-Disposition: form-data; name="vintage"
>>
>> Census2010_Census2010
>> --------------------------c0a7880f53fb0ca4--
<- HTTP/1.0 302 Found
<- Location: https://geocoding.geo.census.gov/geocoder/geographies/addressbatch
<- Server: BigIP
<- Connection: Keep-Alive
<- Content-Length: 0
<-
-> GET /geocoder/geographies/addressbatch HTTP/1.0
-> Host: geocoding.geo.census.gov
-> User-Agent: libcurl/7.54.1 r-curl/2.8.1 httr/1.3.1
-> Accept-Encoding: gzip, deflate
-> Accept: application/json, text/xml, application/xml, */*
->
<- HTTP/1.1 404 Not Found
<- Cache-Control: no-cache, no-store, max-age=0
<- Connection: close
<- Date: Thu, 25 Jan 2018 22:03:01 GMT
<- Pragma: no-cache
<- Content-Type: application/json
<- Expires: Wed, 31 Dec 1969 23:59:59 GMT
<- Content-Language: en-US
<- Vary: Origin
<-
> stop_for_status(req)
Error: Not Found (HTTP 404).
> content(req)
named list()
Actually, if you change from http to https, your code will work.
You might also want to check out the censusr package.
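A follow-up note on reading the result: the batch endpoint returns CSV with no header row, not JSON, so `content(req)` alone won't give a useful list. Here is a hedged sketch of parsing it; the column names follow the geocoder documentation and may need adjusting for your benchmark.

```r
# Parse the headerless CSV text returned by the addressbatch endpoint.
parse_geocode_response <- function(txt) {
  read.csv(text = txt, header = FALSE, stringsAsFactors = FALSE,
           col.names = c("id", "input_address", "match", "match_type",
                         "matched_address", "lon_lat", "tiger_line_id", "side"))
}

# Usage (after switching the request to https):
#   results <- parse_geocode_response(content(req, as = "text"))
```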
Very strange issue. I am trying to connect to an internal API which requires first POSTing a key and code to a URL and then using the returned cookie to get the desired data (JSON).
Running the POST request returns status 200, which is good, but no cookie is returned. Running the same request in Firefox using HttpRequester returns a cookie as expected and works fine.
url <- "https://some_url"
login <- list(
  Key = "some_key",
  Code = "some_code"
)
try_temp <- POST(url = url, body = login, encode = "form", verbose())
Result is:
-> POST /api/Service/Login HTTP/1.1
-> Host: **************
-> User-Agent: libcurl/7.53.1 r-curl/2.5 httr/1.2.1
-> Accept-Encoding: gzip, deflate
-> Accept: application/json, text/xml, application/xml, */*
-> Content-Type: application/x-www-form-urlencoded
-> Content-Length: 43
->
>> Key=*****&Code=*******
<- HTTP/1.1 200 OK
<- Content-Type: text/html; charset="utf-8"
<- Content-Length: 6908
<- Connection: Close
<-
Thing is, the same request works when done in the browser.
For the GET request (after I know the cookie), I use GET in httr, passing the cookie I've got, and I get the same log as above.
BTW, when I use BROWSE instead of GET, R opens the default browser and I see the expected data returned.
I suspect that some of the settings for R are not the same as for Firefox (or any other browser). We don't use a proxy but rather an automatic configuration script.
Thanks!
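One thing worth trying from the R side: reuse a single httr handle for both requests. A handle wraps one curl session, and libcurl's cookie engine then carries the login cookie over to the follow-up GET even when Set-Cookie is not surfaced in the parsed response. A hedged sketch; the paths and field names are placeholders taken from the question.

```r
# Log in and fetch data over one curl session so cookies persist automatically.
fetch_with_session <- function(base_url, key, code) {
  h <- httr::handle(base_url)          # one handle = one session + cookie jar
  httr::POST(handle = h, path = "/api/Service/Login",
             body = list(Key = key, Code = code), encode = "form")
  httr::GET(handle = h, path = "/api/Service/Data")  # cookie sent automatically
}
```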
I am going through this documentation from Twitter to follow someone. I have authorized the account using the twitteR package with api_key, access_token, etc. As this is a POST operation, I decided to use the httr package in R. One of the examples provided in the documentation is
https://api.twitter.com/1.1/friendships/create.json?user_id=1401881&follow=true
So accordingly , just changed the user_id to that of the account which I want to follow.
library(httr)
POST("https://api.twitter.com/1.1/friendships/create.json?user_id=1401881&follow=true",verbose())
where 1401881 is the id which I want to follow.
This gives me
-> POST /1.1/friendships/create.json?user_id=1401881&follow=true HTTP/1.1
-> User-Agent: libcurl/7.39.0 r-curl/0.9.1 httr/1.1.0
-> Host: api.twitter.com
-> Accept-Encoding: gzip, deflate
-> Cookie: guest_id=v1%3A146475568975546263
-> Accept: application/json, text/xml, application/xml, */*
-> Content-Length: 0
->
<- HTTP/1.1 400 Bad Request
<- content-encoding: gzip
<- content-length: 87
<- content-type: application/json; charset=utf-8
<- date: Wed, 01 Jun 2016 05:15:42 GMT
<- server: tsa_b
<- strict-transport-security: max-age=631138519
<- x-connection-hash: 6abd7db7f4c47058bf9d96e9ae23fb83
<- x-response-time: 5
<-
Response [https://api.twitter.com/1.1/friendships/create.json?user_id=1401881&follow=true]
Date: 2016-06-01 05:15
Status: 400
Content-Type: application/json; charset=utf-8
Size: 62 B
As can be seen in the response, it says Bad Request, from which I believe the URL I have generated is wrong. I also tried
POST("https://api.twitter.com/1.1/friendships/create", verbose(),
body = list(user_id = "101311381"), encode = "json")
I have tried various other ways and tried googling as well but cannot find a solution to this. Any help would be appreciated.
Try adding your oauth_token (generated with the twitteR package) to the POST request:
library(httr)
POST("https://api.twitter.com/1.1/friendships/create.json?user_id=1401881&follow=true",
config(token = oauth_token))
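If you don't have a token object handy, httr can create one itself. The 400 above happens because friendships/create requires a signed (OAuth) request, and the bare POST is unsigned. A hedged sketch; the key and secret are placeholders for your own app credentials.

```r
# Build an OAuth 1.0 token for the Twitter API with httr itself.
make_twitter_token <- function(api_key, api_secret) {
  app <- httr::oauth_app("twitter", key = api_key, secret = api_secret)
  httr::oauth1.0_token(httr::oauth_endpoints("twitter"), app)
}

# Sign the follow request with that token (query form avoids hand-built URLs).
follow_user <- function(token, user_id) {
  httr::POST("https://api.twitter.com/1.1/friendships/create.json",
             query = list(user_id = user_id, follow = "true"),
             httr::config(token = token))
}
```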
I'm using the {httr} package to log into an internal web application (Theradoc on IIS 7.5) in order to scrape some HTML (infection) data.
library(httr)
POST("http://ahdc372n2", authenticate("foo", "bar"), encode = "multipart", verbose())
The verbose console output says,
-> POST /theradoc/login/index.cfm HTTP/1.1
-> Authorization: Basic Y2xpbmludGVsbDowMWRFbmdsaXNo
-> User-Agent: curl/7.19.6 Rcurl/1.95.4.3 httr/0.4
-> Host: ahdc372n2.phs-sfalls.amck.net
-> Accept: */*
-> Accept-Encoding: gzip
-> Cookie: JSESSIONID=843052421c871dec2ac3a263b136d475a4a6
->
<- HTTP/1.1 411 Length Required
<- Content-Type: text/html; charset=us-ascii
<- Server: Microsoft-HTTPAPI/2.0
<- Date: Mon, 08 Sep 2014 15:53:02 GMT
<- Connection: close
<- Content-Length: 344
<-
* Closing connection #0
And ultimately I get "HTTP Error 411. The request must be chunked or have a content length."
I've reviewed this older post without any useful pointers.
Is there a way to force the Content-Length in the httr POST request?
UPDATE: Manually installing httr 0.5 from the zip archive seems to have solved the problem. Thank you hrbmstr.
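Besides upgrading httr, the 411 itself comes from sending a POST with no body at all: IIS insists on either a Content-Length header or chunked encoding. Giving the request any explicit body, even an empty string, makes libcurl emit that header. A hedged sketch; the form field names are placeholders for whatever the Theradoc login form actually expects.

```r
# A POST whose body is an empty string carries "Content-Length: 0",
# which satisfies the IIS check that produced the 411.
login_empty <- function(url, user, pass) {
  httr::POST(url, httr::authenticate(user, pass), body = "")
}

# Or send the credentials as a urlencoded form, which also sets Content-Length.
login_form <- function(url, user, pass) {
  httr::POST(url, body = list(username = user, password = pass), encode = "form")
}
```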