Send request with body with openresty lua-resty-http module - nginx

I am trying to send a request with the lua-resty-http module. How can I send a request with body data?
I have tried this:
local http = require "resty.http"

local hc = http:new()
hc:connect("127.0.0.1", 82)

ngx.req.read_body() -- needed before reading the POSTed arguments
local dates = ngx.req.get_post_args()

local result, errors = hc:request{
    path = requrl,
    method = "POST",
    body = dates,
    headers = {
        ["Host"] = "localhost",
    },
}
Basically I am trying to send a Lua table to another server location, and I also need to know how to read that table at the receiving location.
I'd appreciate a detailed explanation.

ngx.req.get_post_args() returns a table of key/value pairs. The body argument for the http client's request function must be in a format supported by OpenResty's cosocket send API, which means either a string or an array-like table holding strings.
If you want to send a Lua table with an HTTP request then you'll need a way to encode it to a string. A common approach is JSON, which you can do with the bundled cjson library:
local json = require "cjson"

local dates = ngx.req.get_post_args()
hc:request {
    body = json.encode(dates),
    ...
}
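On the receiving location you can read the raw request body back and decode it into a table again. A minimal sketch of the receiving handler, assuming the body was JSON-encoded as above (the "id" field is just an illustrative example):

local json = require "cjson"

-- make sure nginx has read the request body before we access it
ngx.req.read_body()

local body = ngx.req.get_body_data() -- nil if the body was buffered to a temp file
if body then
    local dates = json.decode(body) -- back to a Lua table
    ngx.log(ngx.INFO, "received id: ", tostring(dates.id))
end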

Related

Python requests for Google Book API. How to create url?

I'm using this API to search through books. I need to create a request with the given parameters. When I use the requests library and its params argument, it creates a bad URL which gives me the wrong response. Let's look at the examples:
import requests
params = {'q': "", 'inauthor': 'keyes', 'intitle': 'algernon'}
r = requests.get('https://www.googleapis.com/books/v1/volumes?', params=params)
print('URL', r.url)
The URL is https://www.googleapis.com/books/v1/volumes?q=&inauthor=keyes&intitle=algernon
which works, but gives a different response than the working volumes example in the API documentation.
It should be: https://www.googleapis.com/books/v1/volumes?q=inauthor:keyes+intitle:algernon
The requests documentation only describes params, which it joins with &.
I'm looking for a library or any other solution; hopefully I don't have to build the URLs myself using e.g. f-strings.
You need to build a single q parameter containing the whole search expression; the way you are doing it now does not produce the URL you want.
In this code you are sending 3 separate query parameters, but that is not what you want. You actually want to send 1 parameter (q) whose value combines the search terms.
import requests
params = {'q': "", 'inauthor': 'keyes', 'intitle': 'algernon'}
r = requests.get('https://www.googleapis.com/books/v1/volumes?', params=params)
print('URL', r.url)
Try the code below instead, which does what you require:
import requests
params = {'inauthor': 'keyes', 'intitle': 'algernon'}
new_params = 'q='
new_params += '+'.join('{}:{}'.format(key, value) for key, value in params.items())
print(new_params)
r = requests.get('https://www.googleapis.com/books/v1/volumes?', params=new_params)
print('URL', r.url)
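More generally, if you need to build this kind of fielded query in several places, a small helper keeps it in one spot. A sketch (build_volumes_query is just an illustrative name, not part of requests or the API):

def build_volumes_query(fields, free_text=""):
    # turns {'inauthor': 'keyes', 'intitle': 'algernon'} into
    # "q=inauthor:keyes+intitle:algernon", optionally prefixed by free text
    terms = ['{}:{}'.format(key, value) for key, value in fields.items()]
    if free_text:
        terms.insert(0, free_text)
    return 'q=' + '+'.join(terms)

r = requests.get('https://www.googleapis.com/books/v1/volumes',
                 params=build_volumes_query({'inauthor': 'keyes', 'intitle': 'algernon'}))
print('URL', r.url)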

Microsoft Cognitive Service Internal Server Error for /vision/v1.0/analyze

I made a small program to test Microsoft Cognitive Services, but it always returns
{
    "code": "InternalServerError",
    "requestId": "6d6dd4ec-9840-4db3-9849-a6497094fa4c",
    "message": "Internal server error."
}
The code I'm using is:
#!/usr/bin/env python
import httplib, urllib, base64

headers = {
    # Request headers
    'Content-Type': 'application/json',
    'Ocp-Apim-Subscription-Key': '53403359628e420ab85a516a79ba1bd0',
}

params = urllib.urlencode({
    # Request parameters
    'visualFeatures': 'Categories,Tags,Adult,Description,Faces',
    'details': '{string}',
})

try:
    conn = httplib.HTTPSConnection('api.projectoxford.ai')
    conn.request("POST", "/vision/v1.0/analyze?%s" % params,
                 '{"url":"http://static5.netshoes.net/Produtos/bola-umbro-neo-liga-futsal/28/D21-0232-028/D21-0232-028_zoom1.jpg?resize=54g:*"}', headers)
    response = conn.getresponse()
    data = response.read()
    print(data)
    conn.close()
except Exception as e:
    print("[Errno {0}] {1}".format(e.errno, e.strerror))
Am I doing something wrong, or is it a generalized server problem?
The problem is in the params variable. When defining which visual features you would like to extract, you can also request specific details from the image, as described in the documentation. The details field, if used, must be set to one of the valid string options (currently only "Celebrities" is supported, which identifies which celebrity appears in the image). In this case, you initialized the details field with literally the placeholder noted in the documentation ('{string}'). That caused the service to return an internal error.
To correct that, you should try:
params = urllib.urlencode({
    # Request parameters
    'visualFeatures': 'Categories,Tags,Adult,Description,Faces',
    'details': 'Celebrities',
})
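Alternatively, since details is an optional parameter, it should also be fine to leave it out entirely if you don't need celebrity recognition:

params = urllib.urlencode({
    # Request parameters, with the optional 'details' field omitted
    'visualFeatures': 'Categories,Tags,Adult,Description,Faces',
})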
(PS: I have already reported this behavior to Microsoft Cognitive Services.)

How do I inject new request header with json data in proxy request flow?

I am trying to inject a new request header in the proxy request flow using JS policy to be sent to the backend server. When I look at the debug trace, I see that the json data in the request header is distorted.
I am trying to inject some string like
{"scope":"","time_till":2264,"id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM","custom_data":{"c_id":"test_data"}}
But when I look at the trace window I see this
{"scope":"","time_till":2264,id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM,"custom_data":{"c_id":"test_data"}}
what am I doing wrong?
var obj = {"scope":"","time_till":2264,"id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM","custom_data":{"c_id":"test_data"}};
var header_str = JSON.stringify(obj);
context.setVariable('json-header',header_str);
request.headers['x-json-header'] = header_str;
I tested your code and it seems to work. Here's an example response where I set the same header string as a response header:
HTTP/1.1 200 OK
User-Agent: curl/7.30.0
Accept: */*
x-json-header: {"scope":"","time_till":2264,"id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM","custom_data":{"c_id":"test_data"}}
Content-Length: 0
It appears this is only an issue with the Apigee debug session / trace tool, as the header value itself was set correctly. Here is the JSON download of the debug session showing this header value:
{
    "name": "x-json-header",
    "value": "{\"scope\":\"\",\"time_till\":2264,id_1\":\"hUXLXVqpA1J4vA9sayk2UttWNdM,\"custom_data\":{\"c_id\":\"test_data\"}}"
}
You can see that the value passed to the UI for displaying the debug info has the malformed json:
id_1\":\"hUXLXVqpA1J4vA9sayk2UttWNdM,
This does not appear to be just a problem with the Apigee debug/trace UI: I see the malformed JSON trickle down to my backend service.
Here is the header I'm trying to send -
{"timeStamp":"2349218349381274","latitude":"34.589","longitude":"-37.343","clientIp":"127.0.0.0","deviceId":"MOBILE_TEST_DEVICE_AGAIN","macAddress":"23:45:345:345","deviceType":"phone","deviceOS":"iOS","deviceModel":"iPhone 5S","connection":"5G","carrier":"Vodafone","refererURL":"http://www.google.com","xforwardedFor":"129.0.0.0","sessionId":"kfkls498327ksdjf","application":"mobile-app","appVersion":"7.6.5","serviceVersion":"1.0","userAgent":"Gecko"}
But Apigee reads the header as below. Note the missing start quotes from some fields.
{"timeStamp":"2349218349381274",latitude":"34.589,longitude":"-37.343,clientIp":"127.0.0.0,deviceId":"MOBILE_TEST_DEVICE_AGAIN,macAddress":"23:45:345:345,deviceType":"phone,deviceOS":"iOS,deviceModel":"iPhone 5S,connection":"5G,carrier":"Vodafone,refererURL":"http://www.google.com,xforwardedFor":"129.0.0.0,sessionId":"kfkls498327ksdjf,application":"mobile-app,appVersion":"7.6.5,serviceVersion":"1.0,"userAgent":"Gecko"}
The header is used in a service callout to a backend service which parses it. And rightly so, I get the below error -
com.fasterxml.jackson.core.JsonParseException: Unexpected character ('l' (code 108)): was expecting double-quote to start field name
at [Source: java.io.StringReader#22549cdc; line: 1, column: 35]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1378)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:599)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:520)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleUnusualFieldName(ReaderBasedJsonParser.java:1275)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._parseFieldName(ReaderBasedJsonParser.java:1170)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:611)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:301)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:121)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2796)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:1942)
I have encountered strange behaviour when adding JSON to a context variable, for example like the following:
var header_str = JSON.stringify(obj);
context.setVariable('json-header',header_str);
I appreciate this is an example, so you may not have included the full extent of the problem, but the following normally works (note that here it is not added to a variable first):
request.headers['x-json-header'] = JSON.stringify(obj);
Code like this also works, if you can send the request from JavaScript:
var headers = {"Accept": "application/json", "Accept-Language": "en"};
var sessionRequest = new Request(url, 'POST', headers, body);
var exchange = httpClient.send(sessionRequest);
exchange.waitForComplete()
if (exchange.isSuccess()){
var responseObj = exchange.getResponse().content.asJSON;
if (responseObj.error){
request.content += JSON.stringify(responseObj);
}
}
Also, I have had success with using an AssignMessage policy to build a request, followed by a Callout policy to read the stored request and then make that request and store the result in a response object which can then be read by an Extract Variables policy.
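If you go the AssignMessage route, a sketch of such a policy is shown below (the policy name is arbitrary); it copies the json-header context variable set in the JavaScript above into the outgoing request header:

<AssignMessage async="false" continueOnError="false" enabled="true" name="AM-SetJsonHeader">
    <!-- {json-header} references the context variable set via context.setVariable -->
    <Set>
        <Headers>
            <Header name="x-json-header">{json-header}</Header>
        </Headers>
    </Set>
    <AssignTo createNew="false" transport="http" type="request"/>
</AssignMessage>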

How to safely handle raw (file) data in Java?

An image gets corrupted while being retrieved (through HTTP) and then sent (through HTTP) to a database. The image's raw data is handled in String form.
The service sends a GET for an image file and receives a response with the raw image data (the response's body) and the Content-Type. Then a PUT request is sent with that same body and Content-Type header. (The PUT request is constructed by providing the body as a String.) This PUT request is sent to a RESTful database (CouchDB), creating an attachment (for those unfamiliar with CouchDB, an attachment acts like a static file).
Now I have the original image, which my service GETs and PUTs to a database, and a 'copy' of the original image, which I can then GET from the database. If I then run curl --head -v "[copy's url]", it has the Content-Type of the original image, but the Content-Length has changed, going from 200kb to about 400kb. If I GET the 'copy' image with a browser, it is not rendered, whereas the original renders fine. It is corrupted.
What might be the cause? My guess is that while handling the raw data as a string, my framework guesses the encoding wrong and corrupts it. I have not been able to confirm or deny this. How could I handle this raw data/request body in a safe manner, or how could I properly handle the encoding (if that proves to be the problem)?
Details: the Play 2 Framework's HTTP client, Scala. Below is a test to reproduce:
"able to copy an image" in {
def waitFor[T](future:Future[T]):T = { // to bypass futures
Await.result(future, Duration(10000, "millis"))
}
val originalImageUrl = "http://laughingsquid.com/wp-content/uploads/grumpy-cat.jpg"
val couchdbUrl = "http://admin:admin#localhost:5984/testdb"
val getOriginal:ws.Response = waitFor(WS.url(originalImageUrl).get)
getOriginal.status mustEqual 200
val rawImage:String = getOriginal.body
val originalContentType = getOriginal.header("Content-Type").get
// need an empty doc to have something to attach the attachment to
val emptyDocUrl = couchdbUrl + "/empty_doc"
val putEmptyDoc:ws.Response = waitFor(WS.url(emptyDocUrl).put("{}"))
putEmptyDoc.status mustEqual 201
//uploading an attachment will require the doc's revision
val emptyDocRev = (putEmptyDoc.json \ "rev").as[String]
// create actual attachment/static file
val attachmentUrl = emptyDocUrl + "/0"
val putAttachment:ws.Response = waitFor(WS.url(attachmentUrl)
.withHeaders(("If-Match", emptyDocRev), ("Content-Type", originalContentType))
.put(rawImage))
putAttachment.status mustEqual 201
// retrieve attachment
val getAttachment:ws.Response = waitFor(WS.url(attachmentUrl).get)
getAttachment.status mustEqual 200
val attachmentContentType = getAttachment.header("Content-Type").get
originalContentType mustEqual attachmentContentType
val originalAndCopyMatch = getOriginal.body == getAttachment.body
originalAndCopyMatch aka "original matches copy" must beTrue // << false
}
Fails at the last 'must':
[error] x able to copy an image
[error] original matches copy is false (ApplicationSpec.scala:112)
The conversion to String is definitely going to cause problems; you need to work with the bytes, as Daniel mentioned.
Looking at the source, it looks like ws.Response is just a wrapper. If you get to the underlying class, there are some methods that may help you. On the Java side, someone made a commit on GitHub to expose more ways of getting the response data other than as a String.
I'm not familiar with Scala, but something like this may work:
getOriginal.getAHCResponse.getResponseBodyAsBytes
// instead of getOriginal.body
WS.scala
https://github.com/playframework/playframework/blob/master/framework/src/play/src/main/scala/play/api/libs/ws/WS.scala
WS.java
Here you can see that Response has some new methods, getBodyAsStream() and asByteArray.
https://github.com/playframework/playframework/blob/master/framework/src/play-java/src/main/java/play/libs/WS.java
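Putting that together, the fix is to keep the image as bytes end to end instead of going through getOriginal.body. A rough sketch of the relevant part of the test, under the assumption that this Play version's put also accepts an Array[Byte] body (otherwise fall back to the stream/byte-array accessors mentioned above):

// read the response body as raw bytes instead of letting it be decoded into a String
val rawImage: Array[Byte] = getOriginal.getAHCResponse.getResponseBodyAsBytes

// PUT the same bytes to CouchDB, preserving the original Content-Type
val putAttachment: ws.Response = waitFor(WS.url(attachmentUrl)
    .withHeaders(("If-Match", emptyDocRev), ("Content-Type", originalContentType))
    .put(rawImage))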

Easy HTTP requests with gzip/deflate compression

I'm trying to figure out the best way to easily send HTTP/HTTPS requests and to handle gzip/deflate-compressed responses, along with cookies.
The best I found was https://github.com/mikeal/request which handles everything except compression. Is there a module or method that will do everything I ask?
If not, can I combine request and zlib in some manner? I tried to combine zlib and http.ServerRequest, and it failed miserably.
For anyone coming across this in recent times, the request library supports gzip decompression out of the box now. Use as follows:
request(
{ method: 'GET'
, uri: 'http://www.google.com'
, gzip: true
}
, function (error, response, body) {
// body is the decompressed response body
console.log('server encoded the data as: ' + (response.headers['content-encoding'] || 'identity'))
console.log('the decoded data is: ' + body)
}
)
From the github readme https://github.com/request/request
gzip - If true, add an Accept-Encoding header to request compressed
content encodings from the server (if not already present) and decode
supported content encodings in the response. Note: Automatic decoding
of the response content is performed on the body data returned through
request (both through the request stream and passed to the callback
function) but is not performed on the response stream (available from
the response event) which is the unmodified http.IncomingMessage
object which may contain compressed data. See example below.
Note: as of 2019, request has gzip decompression built in. You can still decompress responses manually using the method below.
You can simply combine request and zlib with streams.
Here is an example, assuming you have a server listening on port 8000:
var request = require('request'), zlib = require('zlib');
var headers = {
'Accept-Encoding': 'gzip'
};
request({url:'http://localhost:8000/', 'headers': headers})
.pipe(zlib.createGunzip()) // unzip
.pipe(process.stdout); // do whatever you want with the stream
Here's a working example that gunzips the response:
function gunzipJSON(response) {
    var gunzip = zlib.createGunzip();
    var json = "";

    gunzip.on('data', function (data) {
        json += data.toString();
    });

    gunzip.on('end', function () {
        parseJSON(json); // parseJSON is defined in the full gist linked below
    });

    response.pipe(gunzip);
}
Full code: https://gist.github.com/0xPr0xy/5002984
Check out the examples at http://nodejs.org/docs/v0.6.0/api/zlib.html#examples
zlib is now built into node.
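For reference, here is a minimal sketch that uses only the built-in http and zlib modules (example.com is just a placeholder host), picking the decompressor from the Content-Encoding header:

var http = require('http');
var zlib = require('zlib');

var options = {
    host: 'example.com', // placeholder host
    path: '/',
    headers: { 'Accept-Encoding': 'gzip, deflate' }
};

http.get(options, function (res) {
    var encoding = res.headers['content-encoding'];
    var output = res;

    // pick the matching decompression stream, or pass the response through untouched
    if (encoding === 'gzip') {
        output = res.pipe(zlib.createGunzip());
    } else if (encoding === 'deflate') {
        output = res.pipe(zlib.createInflate());
    }

    var body = '';
    output.on('data', function (chunk) { body += chunk; });
    output.on('end', function () { console.log(body); });
});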
Looking inside the source code - you must set the gzip param on the request lib itself for gzip to work. Not sure if this was intentional or not, but this is the current implementation. No extra headers are needed.
var request = require('request');
request.gzip = true;
request({url: 'https://...'}, // use encoding:null for buffer instead of UTF8
function(error, response, body) { ... }
);
None of the answers here worked for me: I was getting raw bytes back, and the gzip flag was not working either. As it turns out, you need to set encoding to null to stop request from converting the response to a UTF-8 string, so that you keep the binary response instead.
const request = require("request-promise-native");
const zlib = require("zlib");
const url = getURL("index.txt");
const dataByteBuffer = await request(url, { encoding: null });
const dataString = zlib.gunzipSync(response);
