Following the advice from this question and the API documentation (e.g., the Blob service REST API), I have an x-ms-version specified in the header. My code works against Azurite and is authenticated by Azure, but returns:
HTTP/1.1 400 The value for one of the HTTP headers is not in the correct format.
The XML body has more details:
<Error>
<Code>InvalidHeaderValue</Code>
<Message>The value for one of the HTTP headers is not in the correct format.
RequestId:d39d2cad-301e-009e-1546-3940de000000
Time:2018-08-21T11:58:14.2369850Z</Message>
<HeaderName>x-ms-version</HeaderName>
<HeaderValue>2017-01-19</HeaderValue>
</Error>
My guess is that it is not the format but the value that is wrong. How do I find the right value, or a list of all possible values that I could try one at a time? Or is this a misleading error, and do I need to be looking somewhere else?
While playing with this a little more, x-ms-version=2015-02-21 generates:
HTTP/1.1 400 One of the request inputs is out of range.
Here is the request:
-> GET /mike-ecu-test?restype=container&comp=list HTTP/1.1
-> Host: mikeecutest.blob.core.windows.net
-> User-Agent: libcurl/7.54.0 r-curl/3.1 httr/1.3.1
-> Accept-Encoding: gzip, deflate
-> Accept: application/json, text/xml, application/xml, */*
-> Authorization: SharedKey mikeecutest/mike-ecu-test:9j5XodD9OIslzMnzHXiU7c76EpOXFi5jeQITbHk/Y8g=
-> x-ms-date: Wed, 22 Aug 2018 01:35:06 GMT
-> x-ms-version: 2018-03-28
Here is the R code that generates it; credit for the code goes to answer 4 to this question:
library("httr")
azureout <- function(){
url <- "http://mikeecutest.blob.core.windows.net/mike-ecu-test?restype=container&comp=list"
sak <- "dfgwhsfhsfg.....hjdkfgs=="
requestdate<-format(Sys.time(),"%a, %d %b %Y %H:%M:%S %Z", tz="GMT")
msapiversion<- "2018-03-28"
signaturestring<-paste0("GET",paste(rep("\n",12),collapse=""),
"x-ms-date:",requestdate,
"x-ms-version:",msapiversion,"\n",
"mikeecutest", "\n",
"comp:list","\n",
"restype:container")
headerstuff<-add_headers(Authorization=paste0("SharedKey mikeecutest/mike-ecu-test:",
RCurl::base64(digest::hmac(key=RCurl::base64Decode(sak, mode="raw"),
object=enc2utf8(signaturestring),
algo= "sha256", raw=TRUE))),
`x-ms-date`=requestdate,
`x-ms-version`=msapiversion)
content(GET(url,config = headerstuff, verbose() ))
}
Storage API version 2018-03-28 generates:
<- HTTP/1.1 400 One of the request inputs is out of range.
Storage API version 2018-02-01 generates:
<- HTTP/1.1 400 The value for one of the HTTP headers is not in the correct format.
You are right: 2017-01-19 is not a valid Storage service version; see all versions here. The documentation article Versioning for the Azure Storage services also lists the latest API version at the top.
I recommend using the latest version unless you have a specific requirement.
Update
There are three points to fix:
signaturestring <- paste0("GET", paste(rep("\n", 12), collapse = ""),
                          "x-ms-date:", requestdate, "\n",    # the "\n" was missing here
                          "x-ms-version:", msapiversion, "\n",
                          "/mikeecutest/mike-ecu-test", "\n", # should be /accountname/containername
                          "comp:list", "\n",
                          "restype:container")
headerstuff <- add_headers(Authorization = paste0("SharedKey mikeecutest:",  # only the account name goes here
                                                  RCurl::base64(digest::hmac(key = RCurl::base64Decode(sak, mode = "raw"),
                                                                             object = enc2utf8(signaturestring),
                                                                             algo = "sha256", raw = TRUE))),
                           `x-ms-date` = requestdate,
                           `x-ms-version` = msapiversion)
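Putting the three fixes together, the whole function would look roughly like this (a sketch only; the account key is still a placeholder and everything else follows the code above):
library(httr)

azureout <- function(){
  url <- "http://mikeecutest.blob.core.windows.net/mike-ecu-test?restype=container&comp=list"
  sak <- "dfgwhsfhsfg.....hjdkfgs=="   # placeholder storage account key
  requestdate <- format(Sys.time(), "%a, %d %b %Y %H:%M:%S %Z", tz = "GMT")
  msapiversion <- "2018-03-28"

  # VERB + 12 newlines for the empty standard headers, then the canonicalized
  # headers and the canonicalized resource (/accountname/containername)
  signaturestring <- paste0("GET", paste(rep("\n", 12), collapse = ""),
                            "x-ms-date:", requestdate, "\n",
                            "x-ms-version:", msapiversion, "\n",
                            "/mikeecutest/mike-ecu-test", "\n",
                            "comp:list", "\n",
                            "restype:container")

  headerstuff <- add_headers(
    Authorization = paste0("SharedKey mikeecutest:",
                           RCurl::base64(digest::hmac(key = RCurl::base64Decode(sak, mode = "raw"),
                                                      object = enc2utf8(signaturestring),
                                                      algo = "sha256", raw = TRUE))),
    `x-ms-date` = requestdate,
    `x-ms-version` = msapiversion)

  content(GET(url, config = headerstuff, verbose()))
}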
Using a package that wraps the REST API is always preferable; you can try the one provided by #Hong.
Consider using my AzureStor package which is an interface to file and blob storage on Azure. It handles authentication, including both access key and SAS, and admin details like getting the API version right.
install.packages("AzureStor")
library(AzureStor)
bl <- blob_endpoint("http://mikeecutest.blob.core.windows.net",
key="your_key")
cont <- blob_container(bl, "mike-ecu-test")
list_blobs(cont)
upload_blob(cont, "srcfile", "destblob") # blocked upload is supported for blob storage
download_blob(cont, "srcblob", "destfile")
newcontainer <- create_blob_container(bl, "newcontainer")
delete_blob_container(newcontainer)
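If you only have a shared access signature (SAS) rather than the account key, the same endpoint object can be created with the sas argument instead (a sketch, with a placeholder SAS):
bl <- blob_endpoint("http://mikeecutest.blob.core.windows.net",
                    sas="your_sas")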
Related
I'm trying to upload large files to SharePoint Online directly through R using the Microsoft Graph API. To this end, I'm using the API's createUploadSession, documentation here.
My code looks like this:
httr::PUT(url = "https://graph.microsoft.com/v1.0/sites/.../createUploadSession",
          headers = add_headers(.headers = headers),
          body = httr::upload_file(file),
          encode = mime::guess_type(file),
          verbose())
(where 'headers' includes authentication and the host name, here graph.microsoft.com)
And the resultant request looks like this:
-> PUT /v1.0/sites/.../createUploadSession HTTP/1.1
-> Host: graph.microsoft.com
-> User-Agent: libcurl/7.64.1 r-curl/4.3 httr/1.4.2
-> Accept-Encoding: deflate, gzip
-> Accept: application/json, text/xml, application/xml, */*
-> Content-Type: text/plain
-> Content-Length: 4543954542
Of course, this fails:
<- HTTP/1.1 413 Request Entity Too Large
<- Content-Type: text/html
<- Server: Microsoft-IIS/10.0
<- Strict-Transport-Security: max-age=31536000
<- Date: Fri, 02 Oct 2020 12:32:29 GMT
<- Connection: close
<- Content-Length: 67
<-
since, as the documentation says, we need to upload in 327,680-byte chunks. However, I was under the impression that httr's upload_file allows for streaming. This is where I'm stuck: it looks like my request is still trying to upload everything at once, so how do I invoke this streaming behavior? And is some kind of while loop required to keep sending the next chunk of data?
This functionality is now available in the Microsoft365R package. Full disclosure: I'm the author of this package.
library(Microsoft365R)
site <- get_sharepoint_site("sitename")
site$get_drive()$upload_file("path/to/file", "destfilename")
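If you would rather stay with httr, the chunked-upload flow that the Graph documentation describes can be sketched roughly as below. This is only a sketch under assumptions: access_token is a placeholder, the pre-authenticated upload URL comes back in the uploadUrl field of the createUploadSession response, and every chunk except the last must be a multiple of 327,680 bytes.
library(httr)

# Sketch only: placeholder token and elided site/item path.
access_token <- "your_access_token"
sess <- POST("https://graph.microsoft.com/v1.0/sites/.../createUploadSession",
             add_headers(Authorization = paste("Bearer", access_token)))
upload_url <- content(sess)$uploadUrl      # pre-authenticated URL for the chunks

path <- "path/to/largefile"
size <- file.info(path)$size
chunk_size <- 327680 * 10                  # must be a multiple of 327,680 bytes
con <- file(path, open = "rb")
pos <- 0
while (pos < size) {
  chunk <- readBin(con, what = "raw", n = min(chunk_size, size - pos))
  resp <- PUT(upload_url,
              add_headers(`Content-Range` =
                sprintf("bytes %.0f-%.0f/%.0f", pos, pos + length(chunk) - 1, size)),
              body = chunk)
  stop_for_status(resp)
  pos <- pos + length(chunk)
}
close(con)
content(resp)                              # the last response describes the uploaded file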
Request to the DailySalesReport service in the development environment:
<ns6:DailySalesReportRQ Version="2.0.0"
xmlns:ns1="http://www.ebxml.org/namespaces/messageHeader"
xmlns:ns2="http://www.w3.org/1999/xlink"
xmlns:ns3="http://schemas.xmlsoap.org/soap/envelope/"
xmlns:ns4="http://schemas.xmlsoap.org/ws/2002/12/secext"
xmlns:ns6="http://webservices.sabre.com/sabreXML/2011/10"
xmlns:ns7="http://services.sabre.com/STL/v01"
xmlns:ns8="http://services.sabre.com/STL_Header/v120" xmlns:ns9="http://www.w3.org/2000/09/xmldsig#">
<ns6:SalesReport PseudoCityCode="5VYJ" StartDate="2020-02-13"/>
</ns6:DailySalesReportRQ>
I receive ERR.SWS.HOST.ERROR_IN_RESPONSE with "TICKETING DATABASE ERROR".
What does the TICKETING DATABASE ERROR mean and how can I fix it?
First of all, you must check whether the date you are requesting has any documents issued.
Below is the raw request for service="DailySalesReportLLSRQ" that I am using now:
POST https://webservices.havail.sabre.com/websvc HTTP/1.1
Accept-Encoding: gzip,deflate
Content-Type: text/xml;charset=UTF-8
SOAPAction: "DailySalesReportLLSRQ"
Content-Length: 1503
Host: webservices.havail.sabre.com
Connection: Keep-Alive
User-Agent: Apache-HttpClient/4.1.1 (java 1.5)
[SOAP envelope stripped from the original post; the remaining values were the session token Shared/IDL:IceSess/SessMgr:1.0.IDL/Common/!ICESMS/RESA!ICESMSLB/RES.LB!-2917635922269276274!1343176!0!1, the user soapui.gwaereo#cvccorp.com.br, the endpoint https://webservices.havail.sabre.com/websvc, the action DailySalesReport / DailySalesReportLLSRQ, message id 3912336541697520620, and the timestamp 2020-02-21T15:02:49.]
https://developer.sabre.com/eticketcouponllsrq mentions:
Common errors
TICKETING DATABASE ERROR:
The ticket has been issued from PCC XXX and It can be viewed only from this PCC.
Please remove the conjunctive ticket number in your request and you should be able to get a good response.
Carrier does not seem to have the image of this ticket available.
It seems the service is orchestrated, so it is impossible to identify which PNR / ticket causes the trouble. You need to contact Sabre support and provide the request body plus the date of the error.
I am trying to use R to access a web page in our organization using httr's GET.
However, I get a message that "Access is denied due to invalid credentials".
I can do the desired action manually.
It seems that authorization happens automatically when I use Internet Explorer to reach the web site, but access is blocked when I try the same action through R.
This is how I'm trying to do it:
(I can't supply the exact address because it's an intranet address that can only be used inside the organization; the same goes for the proxy address.)
library(httr)
r <- GET(myurl, use_proxy(myproxyid, 80), verbose())
-> GET http://myurl
-> host: xxx
-> User-Agent: libcurl...
-> Accept-Encoding: gzip, deflate
-> Proxy-Connection: Keep-Alive
-> Accept: application/json, text/xml, application/xml, */*
<- HTTP/1.1 401 Unauthorized
<- Content-Type: text/html
<- Server: Microsoft-IIS/8.5
<- WWW-Authenticate: Negotiate
<- WWW-Authenticate: NTLM
<- X-Powered-By: ASP.NET
r
Response [myurl]
Date
Status: 401
...
<title>401 - Unauthorized: Access is denied due to invalid credentials.</title>
....
I understand that I somehow have to send my credentials with my request.
Is it possible to somehow use automatic authentication?
Thanks
Rafael
OK, I finally made it work.
I had to supply authentication data to my GET command, like this:
library(httr)
r <- GET(myurl,
         use_proxy(myproxyid, 80),
         verbose(),
         authenticate(user = "myuserid", password = "mypassword", type = "ntlm"))
I hope this helps somebody.
Thanks
Rafael
Very strange issue. I am trying to connect to an API (inside the organization) which requires first POSTing a key and code to a URL and then using the returned cookie to get the desired data (JSON).
Running the POST request returns status 200, which is good, but no cookie is returned.
Running the same request in Firefox using "httprequester" returns a cookie as expected and works fine.
url<-"https://some_url"
login <- list(
Key="some_key",
Code="some_code"
)
try_temp<-POST(url = url,body=login,encode="form",verbose())
Result is:
-> POST /api/Service/Login HTTP/1.1
-> Host: **************
-> User-Agent: libcurl/7.53.1 r-curl/2.5 httr/1.2.1
-> Accept-Encoding: gzip, deflate
-> Accept: application/json, text/xml, application/xml, */*
-> Content-Type: application/x-www-form-urlencoded
-> Content-Length: 43
->
>> Key=*****&Code=*******
<- HTTP/1.1 200 OK
<- Content-Type: text/html; charset="utf-8"
<- Content-Length: 6908
<- Connection: Close
<-
The thing is, the same request works when done in a browser.
For the GET request (after I know the cookie), I use GET in httr, passing the cookie I've got. I get the same log as above.
BTW, when I use BROWSE instead of GET, R opens the default browser and I see the expected data returned.
I suspect that some of the settings for R are not the same as for Firefox (or any other browser). We don't use a proxy but rather an automatic configuration script.
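For reference, the two-step flow can be sketched in httr like this (placeholder paths); reusing a single handle should let libcurl's cookie jar carry any cookie from the login over to the GET:
library(httr)

h <- handle("https://some_url")            # one handle, so cookies are shared
login <- list(Key = "some_key", Code = "some_code")

resp_login <- POST(handle = h, path = "/api/Service/Login",
                   body = login, encode = "form")
cookies(resp_login)                        # inspect what the server actually set

resp_data <- GET(handle = h, path = "/api/Service/Data")   # placeholder path
content(resp_data)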
Thanks
I'm trying to establish a connection using Kerberos authentication. I think my question does not depend on the type of server (in my case a Cognos TM1 server) nor on the language (in my case R, using the httr package (or RCurl)), since it's more of a general HTTP(S) matter.
I do not have much experience with Kerberos. According to my understanding, there is a negotiation between the client and the server that follows these steps (here GET requests). The only thing I need to pass is a username; no password is needed.
1. get(url) -> returns a "WWW-Authenticate: Kerberos" header, telling the client this auth method is supported.
2. get(url, header = Authorization: "Negotiate" + token) -> second request, this time with the header "Negotiate" plus a token.
3. The server returns some authentication details.
4. The received details can be sent in the header again and the requested data is sent back.
httr (type = "gssnegotiate") or RCurl (httpauth = 4, i.e. CURLAUTH_NEGOTIATE) allow specifying the negotiation type. I thought this would carry out the negotiation process described above and return the requested data straight away. This does not seem to be the case:
library(httr)
httr::set_config(config(ssl_verifypeer = 0L))
httr::set_config(config(ssl_verifyhost = 0L))
GET(url, authenticate(user = "user", password = "", type = "gssnegotiate"), verbose())
does not return the desired result. The log says:
-> GET /api/v1/Dimensions('Time')/Hierarchies('Time')/Subsets('Yesterday')/Elements HTTP/1.1
-> Host: myhostaddress.com:20049
-> User-Agent: libcurl/7.47.1 r-curl/1.2 httr/1.2.1
-> Accept-Encoding: gzip, deflate
-> Cookie: TM1SessionId=tbQcdXh4PsIHUQdkW_UyNQ
-> Accept: application/json, text/xml, application/xml, */*
->
<- HTTP/1.1 401 Unauthorized
<- Content-Type: text/plain
<- Content-Length: 0
<- Connection: keep-alive
<- OData-Version: 4.0
<- WWW-Authenticate: Kerberos
<-
* Connection #0 to host myhostaddress.com left intact
I tried the same using (R)curl
library(RCurl)
getURL(url, user = "username", userpwd="", httpauth = 4, verbose = TRUE, ssl.verifypeer = FALSE, ssl.verifyhost = FALSE)
Unfortunately, this wasn't successful either:
< HTTP/1.1 401 Unauthorized
< Content-Type: text/plain
< Content-Length: 0
< Connection: keep-alive
< OData-Version: 4.0
< Set-Cookie: TM1SessionId=WMSrJHGTps0RIbmjCCaW5w; Path=/api/; HttpOnly; Secure
< WWW-Authenticate: Kerberos
Do you have any hints on how I could get the desired data? I was also thinking about implementing the steps described above manually, but I'm stuck at step 2, because I do not have a token to send in the negotiation header (and also do not know where to get one).
This won't work because the server requires WWW-Authenticate: Kerberos, but curl only talks SPNEGO. Modify your server to request WWW-Authenticate: Negotiate and it will work.
Note: no major browser supports pure Kerberos over HTTP, so don't expect any other library to do so.
On Windows, you can use
library(httr)
GET(url, authenticate(user=":", password="", type="gssnegotiate"), verbose = TRUE)
Or, if no proxy is required for an internal website, state no proxy explicitly, as follows:
library(httr)
GET(url, use_proxy(""), authenticate(user=":", password="", type="gssnegotiate"), verbose = TRUE)