How do I switch off JSON outgoing/incoming header messages in R?

I have an issue using an R script as a data source in Microsoft PowerBi. I think this is fundamentally an issue with PowerBi, but in the short term I'll need to find a solution in R.
Essentially, PowerBi doesn't appear to be able to handle the messages that would be sent to the console if I were using RStudio.
Within the R script I'm using a REST API to request data from a URL. The JSON message that is received is converted into an R data frame. When using the script as a data source in PowerBi, this only works if I set the verbose settings to FALSE, i.e. so that no messages (in particular "data in") would be sent to the console if I were running it in RStudio.
response <- GET(<url>,
                body = list(),
                add_headers(.headers = c('<identity token>' = ID_to_use)),
                verbose(data_out = FALSE,
                        data_in = FALSE,
                        info = FALSE,
                        ssl = FALSE),
                encode = "json")
However, I do not have the option to switch off the incoming/outgoing JSON header messages (which is going to come back to bite me!):
<< {"identity":" <token>"}
* Connection #54 to <host> left intact
No encoding supplied: defaulting to UTF-8.
-> GET <URL request> HTTP/1.1
-> Host: <host>
-> User-Agent: libcurl/7.64.1 r-curl/4.3 httr/1.4.1
-> Accept-Encoding: deflate, gzip
-> Accept: application/json, text/xml, application/xml, */*
-> <Identity>: <Identity>
->
<- HTTP/1.1 200 OK
<- X-Session-Expiry: 3599
<- Content-Type: application/json
<- Transfer-Encoding: chunked
<- Date: Thu, 06 Aug 2020 16:14:26 GMT
<- Server: <Server>
<-
No encoding supplied: defaulting to UTF-8.
No encoding supplied: defaulting to UTF-8.
No encoding supplied: defaulting to UTF-8.
From the R help for verbose():
verbose() uses the following prefixes to distinguish between different components of the http messages:
* informative curl messages
-> headers sent (out)
>> data sent (out)
*> ssl data sent (out)
<- headers received (in)
<< data received (in)
<* ssl data received (in)
Switching the verbose settings to FALSE works for a single request. However, I need to put the request into a loop and keep requesting more data until the API gateway indicates there is no more data to be received. PowerBi appears to fail when five or more request/replies are sent/received within the script.
Just from observation, I assume this is due to the JSON header messages piling up.
I've tried a number of approaches but nothing seems to work: sink('NUL'), invisible(), capture.output().
Any help would be appreciated.

I found a hacky solution, which at least solved the problem I had in R, but not in PowerBi: a "wrapper" R script (see below) that calls my main script THE_SCRIPT.R via a shell command. THE_SCRIPT.R dumps out a CSV file, which I then read back in the wrapper script:
# Required by PowerBi
library(mice)
# Set the working directory; spaces and quotes in paths are a pain to pass between R and the shell
setwd("C:/Program Files/R/R-3.6.2/bin/")
system("Rscript.exe C:\\Users\\<USER>\\Documents\\THE_SCRIPT.R > Nul 2>&1")
A_DATA_TABLE <- read.csv("C:\\Users\\<USER>\\Documents\\THE_FILE.csv")
However, this still didn't resolve the issue when running it in PowerBi.
Note: I also tried sink('Nul 2>&1') in R; it didn't work.
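As an aside, the request loop itself is simple; here is an illustrative sketch (in Python, with a stub standing in for the real GET-and-parse step, since the real URL and token are not shown) of requesting until the gateway signals there is no more data:

```python
def fetch_page(cursor):
    # Stub for the real request/parse step; returns (rows, next_cursor),
    # with next_cursor None when the gateway has no more data.
    pages = {0: ([1, 2], 1), 1: ([3, 4], 2), 2: ([5], None)}
    return pages[cursor]

rows, cursor = [], 0
while cursor is not None:
    page, cursor = fetch_page(cursor)
    rows.extend(page)
# rows == [1, 2, 3, 4, 5]
```

The shape of the loop is the same whatever the client library; the PowerBi failure is about what each iteration prints, not about the looping itself.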

Related

http.ResponseWriter returns carriage return in body

I'm doing a coding exercise developing an HTTP server using Go's net/http library. The server is supposed to pass a series of tests in a GitLab pipeline. I have no access to these tests and can't see how they are implemented.
The problem is that one test for an expected HTTP 204 No Content response fails as follows:
Expected an empty response body "", got "\n"
The way I build the response in my code is:
// w is the http.ResponseWriter of the handler function.
w.WriteHeader(http.StatusNoContent)
w.Header().Del("Content-Type")
w.Write(nil)
I also tried w.Write(make([]byte, 0)) with the same result.
I'm testing it locally with curl but I can't really see the characters that are being returned in the body:
$ curl -i --header "Content-Type: application/x-www-form-urlencoded" --request POST --data "PARAMETER=1" host:9000/path
HTTP/1.1 204 No Content
Date: Thu, 10 Sep 2020 16:12:21 GMT
$
Is the net/http server actually returning a carriage return, and how can I prevent this? Thank you.
Sorry, I was looking at the wrong piece of code. Because I don't have any details about the tests, I don't really know what exact case is being tested. The comments above are correct: just using w.WriteHeader(http.StatusNoContent) doesn't produce any carriage return in the body, and there is no need to delete Content-Type. My mistake was that I was using http.Error(w, "", http.StatusNoContent) instead, which appends a newline to the (empty) message.
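The takeaway generalizes beyond Go: a 204 response must carry no body bytes at all, and writing even a single "\n" fails such a check. An illustrative round trip using Python's stdlib http.server (not the asker's Go code):

```python
import http.client
import http.server
import threading

class Handler(http.server.BaseHTTPRequestHandler):
    def do_POST(self):
        self.send_response(204)
        self.end_headers()
        # Writing nothing after the headers is the whole fix: any write,
        # even b"\n", would become the response body.

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("POST", "/path", body="PARAMETER=1")
resp = conn.getresponse()
body = resp.read()  # b"" for a well-formed 204
server.shutdown()
```

The same curl invocation from the question, pointed at this server, would show a 204 status line and no body.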

MWS API Signature does not match R

I am trying to get data from Amazon MWS API using GetMatchingProductForId operation.
When I use Amazon MWS Scratchpad it works perfectly fine.
I am now trying to replicate the urls that are sent in the HTTP POST request but I get a Signature error message.
I need to understand how the url request should be structured.
Below is the detail of the request in Amazon MWS Scratchpad, I ANONYMIZED private identifiers but that is the only thing I changed:
HTTP POST
POST /Products/2011-10-01?AWSAccessKeyId=ANONYMIZED
&Action=GetMatchingProductForId
&SellerId=ANONYMIZED
&SignatureVersion=2
&Timestamp=2018-09-28T05%3A45%3A43Z
&Version=2011-10-01
&Signature=ANONYMIZED
&SignatureMethod=HmacSHA256
&MarketplaceId=A13V1IB3VIYZZH
&IdType=EAN
&IdList.Id.1=9781933988665 HTTP/1.1
Host: mws.amazonservices.fr
x-amazon-user-agent: AmazonJavascriptScratchpad/1.0 (Language=Javascript)
Content-Type: text/xml
String to Sign
POST
mws.amazonservices.fr
/Products/2011-10-01
AWSAccessKeyId=ANONYMIZED&Action=GetMatchingProductForId&IdList.Id.1=9781933988665&IdType=EAN&MarketplaceId=A13V1IB3VIYZZH&SellerId=ANONYMIZED&SignatureMethod=HmacSHA256&SignatureVersion=2&Timestamp=2018-09-28T05%3A45%3A43Z&Version=2011-10-01
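(For reference: the Signature parameter is the base64-encoded HMAC-SHA256 of exactly this string to sign, with the result percent-encoded for the query string. A hedged Python sketch of that signing step, using a made-up secret key; the real secret comes from your MWS credentials:)

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

secret = "FAKE_SECRET_KEY"  # placeholder, not a real credential
params = {
    "AWSAccessKeyId": "ANONYMIZED",
    "Action": "GetMatchingProductForId",
    "IdList.Id.1": "9781933988665",
    "IdType": "EAN",
    "MarketplaceId": "A13V1IB3VIYZZH",
    "SellerId": "ANONYMIZED",
    "SignatureMethod": "HmacSHA256",
    "SignatureVersion": "2",
    "Timestamp": "2018-09-28T05:45:43Z",
    "Version": "2011-10-01",
}
# Canonical query string: parameters sorted by name, values percent-encoded.
query = "&".join(f"{k}={quote(v, safe='')}" for k, v in sorted(params.items()))
string_to_sign = "POST\nmws.amazonservices.fr\n/Products/2011-10-01\n" + query
digest = hmac.new(secret.encode(), string_to_sign.encode(), hashlib.sha256).digest()
# The base64 signature must itself be percent-encoded when placed in the URL.
signature = quote(base64.b64encode(digest).decode(), safe="")
```

If the canonical query string differs from the one the server reconstructs in even one byte (parameter order, encoding of ':' as %3A, etc.), the signatures won't match, which is the usual cause of the error in the question.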
=======
Now my question is (and let's imagine my signature was created correctly): from the HTTP POST, what should the request look like?
Here is my attempt:
https://mws.amazonservices.fr/Products/2011-10-01?AWSAccessKeyId=ANONYMIZED&Action=GetMatchingProductForId&SellerId=ANONYMIZED&SignatureVersion=2&Timestamp=2018-09-28T05%3A52%3A33Z&Version=2011-10-01&Signature=ANONYMIZED&SignatureMethod=HmacSHA256&MarketplaceId=A13V1IB3VIYZZH&IdType=EAN&IdList.Id.1=9781933988665
But what about the '\n' escape characters that are in the scratchpad? And what about 'HTTP/1.1' at the end, should I include that as well?
Thanks for your help.
I don't have an MWS account so I can't test the following, but this is one way you can do it:
# set this to your python2 binary; you'll need to do
# pip2 install boto
# from a command-line before using this code
Sys.setenv("RETICULATE_PYTHON"="/usr/bin/python2.7")
library(reticulate)
boto_mws_connection <- import("boto.mws.connection")
con <- boto_mws_connection$MWSConnection(
  aws_access_key_id = ACCESS_KEY,
  aws_secret_access_key = AWS_SECRET,
  Merchant = MERCHANT_ID
)
con$get_matching_product_for_id(
  MarketplaceId = "A13V1IB3VIYZZH",
  IdType = "EAN",
  IdList = c("9781933988665")
)
The HTTP/1.1 part is normally added by your HTTP client library. I am not familiar with R, but a quick search suggests there is a curl package for R. libcurl is the standard HTTP library for a lot of languages, including PHP. My PHP code to send an XML feed through curl looks like this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://mws.amazonservices.fr/Products/2011-10-01?.....your data and signature here...');
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $xmlcontent);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    "Content-Type: text/xml",
    "Content-MD5: ".base64_encode(md5($xmlcontent, true)),
    "x-amazon-user-agent: TestScript/0.01"
));
$result = curl_exec($ch);
curl_close($ch);
By looking at this, it seems to me this should be easily translatable to the R interface for CURL.

AT commands Quectel MC60

I've just started working with the Quectel MC60 and I am having some issues with the HTTP GET method. I send the following commands:
AT+QIFGCNT=0
AT+QICSGP=1,"my_apn"
AT+QIREGAPP
AT+QIACT
AT+QSSLCFG="https",1
AT+QHTTPURL=39,40
my_url_39_bytes_long
AT+QHTTPGET=60
AT+QHTTPREAD=30
AT+QIDEACT
When using the QCOM software, I run a script with all the above commands sequentially. When it comes to the AT+QHTTPREAD command, the response is always "+CME ERROR: 3822" (HTTP response failed). What could it be? I'm sure the HTTP server is working properly.
The answer is that it is necessary to configure the request header:
AT+QIFGCNT=0
AT+QICSGP=1,"my_apn"
AT+QIREGAPP
AT+QIACT
AT+QHTTPURL=39,40
my_url_39_bytes_long
AT+QHTTPCFG="requestheader",1
AT+QHTTPPOST=77
GET path HTTP/1.1
User-Agent: Fiddler
Host: www.my_host.com
AT+QHTTPREAD=30
AT+QIDEACT
NOTE: in AT+QHTTPPOST=77, 77 is the size of the POST message (the last two \r\n are required and count).
NOTE2: after GET you're supposed to write the path of the URL inserted in AT+QHTTPURL. For example, if you specified your URL as https://www.my_host.com/debug/main/port, your AT+QHTTPPOST request should look like this (don't forget the last two \r\n):
GET /debug/main/port HTTP/1.1
User-Agent: Fiddler
Host: www.my_host.com
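The 77 in AT+QHTTPPOST=77 can be verified mechanically: every header line ends in \r\n, and the blank line terminating the headers counts too. A quick Python check of the example request above:

```python
# The request from the answer, with explicit \r\n line endings and the
# final blank line that terminates the HTTP headers.
request = (
    "GET /debug/main/port HTTP/1.1\r\n"
    "User-Agent: Fiddler\r\n"
    "Host: www.my_host.com\r\n"
    "\r\n"
)
print(len(request))  # → 77, the value passed in AT+QHTTPPOST=77
```

If the byte count given to the module does not match the bytes actually sent, the module will either wait for more data or truncate the request.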

R - httr - Twitter Error 401 using GET providing Authorization Error

This is the code that provides the error, along with the output from it.
I'm positive my access keys and tokens are correct. I triple checked them.
I'm guessing my query may be wrong somehow? My guess was that defaulting since_id=0 for my first run was the problem, but removing it produces the same error.
mentions = GET(final_url, sig)
mentions
Response [https://api.twitter.com/1.1/search/tweets.json?q=#lolhauntzer&until=2016-01-20&since_id=0&result_type=recent&lang=en&count=100]
Date: 2016-01-19 05:09
Status: 401
Content-Type: application/json; charset=utf-8
Size: 64 B
Whoops, brain lapse. I need to replace the "#" in the URL with "%40". The "#" works on my other workstation though, which is kind of baffling right now.
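What likely happens with an unencoded "#": it starts the URL fragment, so everything after it is cut from the query string that is actually sent and signed, and the OAuth signature no longer matches, hence the 401. An illustrative Python check (note that "%40" is the encoding of "@", which is what a mentions search uses; "#" itself encodes to "%23"):

```python
from urllib.parse import quote

# "#" would otherwise start a URL fragment and truncate the query.
assert quote("#", safe="") == "%23"
# "@" is what a mentions search needs, and it encodes to %40.
assert quote("@", safe="") == "%40"

query = "q=" + quote("@lolhauntzer", safe="")
```

Encoding every query parameter value this way before building the URL avoids the problem regardless of which workstation or client library is in use.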

.Net S3 Client hang while attempting to CopyObject

Have an issue with S3 object copying in ASP.NET, using the AWS S3 library.
I'm attempting to run the following code:
CopyObjectRequest copyObjectRequest = new CopyObjectRequest()
{
SourceBucket = Bucket,
SourceKey = s3FileKey,
DestinationBucket = Bucket,
DestinationKey = archiveS3FileKey
};
CopyObjectResponse copyObjectResponse = s3Client.CopyObject(copyObjectRequest);
The request just hangs on the s3Client.CopyObject(copyObjectRequest); line.
Using Fiddler/Wireshark I extracted the HTTP headers seen below (real bucket and auth keys removed):
PUT https://bucket-name.s3-ap-southeast-2.amazonaws.com/Archive/test.zip HTTP/1.1
x-amz-copy-source: /bucket-name/App/test.zip
x-amz-metadata-directive: COPY
User-Agent: MyDotNetConsoleApp
Content-Type: application/x-amz-json-1.0
x-amz-date: Mon, 30 Jun 2014 05:34:53 GMT
Authorization: AWS <authorization code>
Host: bucket-name.s3-ap-southeast-2.amazonaws.com
Transfer-Encoding: chunked
When I alter the request by removing the Transfer-Encoding: chunked header, the request runs fine and the object is copied to the new key (see the alteration to the AWS S3 library below).
To get around this I added the following to the S3 client library as a temporary measure; obviously, though, I would like to understand properly whether I'm failing to set something in the client request.
(line 281 in AmazonWebClientService.cs)
if (state.WebRequest.Headers.Get("x-amz-copy-source") != null)
{
    state.WebRequest.SendChunked = false;
}
// The if-block above is the addition; the following line already existed.
httpResponse = state.WebRequest.GetResponse() as HttpWebResponse;
I can delete objects with the same s3Client instance leading up to this CopyObject call, and full permissions have been granted to this access key on the bucket.
Does anyone have any idea what would cause this, or how to get around it without altering the standard library supplied by AWS?
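This matches what the capture shows: a CopyObject request carries no body, but it went out with Transfer-Encoding: chunked, so the server waits for chunk data that never arrives. A hedged sketch (a plain Python dict, not the SDK's API; the header names mirror the captured request) of the header shape that avoids the hang:

```python
def build_copy_headers(source_bucket: str, source_key: str) -> dict:
    """Headers for an S3 server-side copy; the request body is empty by design."""
    return {
        "x-amz-copy-source": f"/{source_bucket}/{source_key}",
        "x-amz-metadata-directive": "COPY",
        # An explicit zero Content-Length, instead of Transfer-Encoding: chunked,
        # tells the server not to wait for body data.
        "Content-Length": "0",
    }

headers = build_copy_headers("bucket-name", "App/test.zip")
```

This is the same effect as the SendChunked = false patch above, expressed as the headers the wire should carry.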
