Easy HTTP requests with gzip/deflate compression - http

I'm trying to figure out the best way to easily send HTTP/HTTPS requests and handle gzip/deflate-compressed responses along with cookies.
The best I found was https://github.com/mikeal/request which handles everything except compression. Is there a module or method that will do everything I ask?
If not, can I combine request and zlib in some manner? I tried to combine zlib and http.ServerRequest, and it failed miserably.

For anyone coming across this more recently: the request library now supports gzip decompression out of the box. Use it as follows:
request(
  { method: 'GET'
  , uri: 'http://www.google.com'
  , gzip: true
  }
, function (error, response, body) {
    // body is the decompressed response body
    console.log('server encoded the data as: ' + (response.headers['content-encoding'] || 'identity'))
    console.log('the decoded data is: ' + body)
  }
)
From the GitHub readme (https://github.com/request/request):
gzip - If true, add an Accept-Encoding header to request compressed
content encodings from the server (if not already present) and decode
supported content encodings in the response. Note: Automatic decoding
of the response content is performed on the body data returned through
request (both through the request stream and passed to the callback
function) but is not performed on the response stream (available from
the response event) which is the unmodified http.IncomingMessage
object which may contain compressed data. See example below.
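To make that last distinction concrete, here is a small sketch (not the readme's own example) of the difference between the decoded body and the raw response stream:

var request = require('request');

request({ uri: 'http://www.google.com', gzip: true })
  .on('response', function (response) {
    // raw http.IncomingMessage: the headers show how the server encoded the body,
    // and the stream itself may still carry compressed bytes
    console.log(response.headers['content-encoding']); // e.g. 'gzip'
  })
  .on('data', function (chunk) {
    // 'data' events on the request stream itself are already decompressed
  });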

Note: as of 2019, request has gzip decompression built in. You can still decompress responses manually using the method below.
You can simply combine request and zlib with streams.
Here is an example, assuming you have a server listening on port 8000:
var request = require('request'), zlib = require('zlib');

var headers = {
  'Accept-Encoding': 'gzip'
};

request({ url: 'http://localhost:8000/', headers: headers })
  .pipe(zlib.createGunzip())   // unzip
  .pipe(process.stdout);       // do whatever you want with the stream
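Since the original question also asks about deflate: zlib.createUnzip() auto-detects gzip or deflate from the incoming data, so a slightly more general variant of the same pipe (a sketch, assuming the same local server and that the server actually compresses its response) is:

var request = require('request'), zlib = require('zlib');

request({ url: 'http://localhost:8000/', headers: { 'Accept-Encoding': 'gzip, deflate' } })
  .pipe(zlib.createUnzip())   // handles both gzip and deflate
  .pipe(process.stdout);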

Here's a working example that gunzips the response:
var zlib = require('zlib');

function gunzipJSON(response) {
  var gunzip = zlib.createGunzip();
  var json = "";
  gunzip.on('data', function (data) {
    json += data.toString();
  });
  gunzip.on('end', function () {
    parseJSON(json); // parseJSON is defined in the full gist linked below
  });
  response.pipe(gunzip);
}
Full code: https://gist.github.com/0xPr0xy/5002984
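For context, a hedged sketch of how gunzipJSON might be called with Node's core http module (the host and path are placeholders, not from the gist):

var http = require('http');

http.get({
  host: 'example.com',
  path: '/data.json',
  headers: { 'Accept-Encoding': 'gzip' }
}, function (response) {
  gunzipJSON(response); // pipe the compressed response into the gunzip stream above
});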

Check out the examples at http://nodejs.org/docs/v0.6.0/api/zlib.html#examples
zlib is now built into node.
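The pattern from those docs, using only core modules and branching on the Content-Encoding response header, looks roughly like this (a sketch; the host is a placeholder):

var http = require('http'), zlib = require('zlib');

http.get({ host: 'example.com', path: '/', headers: { 'accept-encoding': 'gzip,deflate' } }, function (res) {
  var output = process.stdout;
  switch (res.headers['content-encoding']) {
    case 'gzip':
      res.pipe(zlib.createGunzip()).pipe(output);
      break;
    case 'deflate':
      res.pipe(zlib.createInflate()).pipe(output);
      break;
    default:
      res.pipe(output);
      break;
  }
});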

Looking inside the source code, you must set the gzip param on the request lib itself for gzip to work. I'm not sure whether this was intentional, but it is the current implementation. No extra headers are needed.
var request = require('request');
request.gzip = true;
request({url: 'https://...'}, // use encoding:null for buffer instead of UTF8
function(error, response, body) { ... }
);

None of the answers here worked for me; I was getting raw bytes back, and the gzip flag wasn't working either. As it turns out, you need to set the encoding to null to prevent request from transforming the response to UTF-8, so that it keeps the binary response instead.
const request = require("request-promise-native");
const zlib = require("zlib");
const url = getURL("index.txt");
const dataByteBuffer = await request(url, { encoding: null });
const dataString = zlib.gunzipSync(response);
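Since await can't be used at the top level in a CommonJS script, a hedged sketch of the same idea wrapped in an async function (the function name is mine):

const request = require("request-promise-native");
const zlib = require("zlib");

async function fetchAndGunzip(url) {
  // encoding: null keeps the raw Buffer instead of decoding it as UTF-8
  const dataByteBuffer = await request(url, { encoding: null });
  return zlib.gunzipSync(dataByteBuffer).toString();
}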

Related

Returning NextResponse with modified responses and modified NextRequest

As per the changelists, it's now possible to modify requests in middleware via:
const headers = new Headers(request.headers);
// Add a new request header
headers.set('x-hello-from-middleware', 'foo');
// Delete a request header from the client
headers.delete('x-from-client');
const resp = NextResponse.next({
  // New option `request.headers`, which accepts a Headers object,
  // overrides the request headers with the specified new ones.
  request: {
    headers
  }
});
This works fine.
It's also possible to add to the response, such as adding cookies, via:
let response = NextResponse.next();
response.cookies.set({
  name: 'access_token',
  value: newAccessToken,
  expires: expiresString,
  path: '/',
});
return response
This also works fine.
Question
I'm having trouble understanding the technique to combine both.
If I modify the request, then ideally I would create the headers object as in the first example, and then add it to the response object from the second example before returning the response. That seems straightforward, since the response is an object.
But if I specify:
response.request = headers
or attempt to build an object of response parameters and add them alongside the request headers, like:
const resp = NextResponse.next({
  response: responseParams,
  request: {
    headers
  }
});
Neither works. I'm missing something obvious that I can't find in the documentation. Any ideas?
I didn't find a way to directly set response.request headers after declaring the response (perhaps it's not possible), but I had overlooked that I could change the order I was doing things in: I can declare a new NextResponse with the request headers, assign it to a variable, and then add cookies to it.
e.g.
let response = NextResponse.next({
  request: {
    headers: requestHeaders,
  }
});
response.cookies.set({
  name: 'access_token',
  value: newAccessToken,
  expires: expiresString,
  path: '/',
});
return response
This solved the problem.
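For completeness, here is a hedged sketch of how the whole thing might sit inside a middleware file (the header name and cookie values are placeholders, not from the original question):

// middleware.js
import { NextResponse } from 'next/server';

export function middleware(request) {
  // override the request headers seen downstream
  const requestHeaders = new Headers(request.headers);
  requestHeaders.set('x-hello-from-middleware', 'foo');

  // build the response with the modified request headers...
  const response = NextResponse.next({
    request: {
      headers: requestHeaders,
    },
  });

  // ...and then modify the response itself, e.g. by setting a cookie
  response.cookies.set({
    name: 'access_token',
    value: 'placeholder-token', // placeholder value
    path: '/',
  });

  return response;
}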

'Access-Control-Allow-Origin' missing using actix-web

I'm stuck on a problem where I receive this error every time I make a POST request to my actix-web server.
CORS header 'Access-Control-Allow-Origin' missing
My JavaScript (Vue.js running on localhost:3000):
let data = //some json data
let xhr = new XMLHttpRequest();
xhr.open("POST", "http://localhost:8080/abc");
xhr.setRequestHeader("Content-Type", "application/json");
xhr.onload = () => {
console.log(xhr.responseText);
}
xhr.send(JSON.stringify(data));
My actix-web server (running on localhost:8080):
#[actix_web::main]
async fn main() {
    HttpServer::new(move || {
        let cors = Cors::default()
            .allowed_origin("http://localhost:3000/")
            .allowed_methods(vec!["GET", "POST"])
            .allowed_header(actix_web::http::header::ACCEPT)
            .allowed_header(actix_web::http::header::CONTENT_TYPE)
            .max_age(3600);
        App::new()
            .wrap(cors)
            .service(myfunc)
    })
    .bind(("0.0.0.0", 8080))
    .unwrap()
    .run()
    .await
    .unwrap();
}
My Cargo.toml dependencies:
[dependencies]
actix-web = "4"
actix-cors = "0.6.1"
...
Got any ideas?
Okay, so I've done some testing. If you're writing a public API, you probably want to allow all origins. For that you may use the following code:
HttpServer::new(|| {
    let cors = Cors::default().allow_any_origin().send_wildcard();
    App::new().wrap(cors).service(greet)
})
If you're not writing a public API... well, I'm not sure what they want you to do. I've not figured out how to tell the library to send that header. I guess I will look at the code.
UPDATE:
So funny story, this is how you allow specific origins:
let cors = Cors::default()
    .allowed_origin("localhost:3000")
    .allowed_origin("localhost:2020");
BUT, and oh boy, is that but juicy. The Access-Control-Allow-Origin response header is only set when there is an Origin request header. That header is normally added by the browser in certain cases. So I added it myself (using the developer tools in the browser). What did I get? "Origin is not allowed to make this request". I had set my Origin header to localhost:3000. Turns out, the actix library simply discards that header if no protocol was provided (e.g. http://); I assume it discards it if it deems the format invalid. That internally results in the header being treated as the string "null". Which is, checks notes, not in the list of allowed origins.
And now the grand finale:
Your Origin header needs to be set (by either you or the browser) to "http://localhost:3000"; the browser adds it automatically for cross-origin requests (see the sketch below).
Your configuration needs to include: .allowed_origin("http://localhost:3000").
After doing that, the server will happily echo back your origin header in the Access-Control-Allow-Origin header. And it will only send that one.
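For reference, when the Vue app at http://localhost:3000 makes the cross-origin call, the browser attaches the Origin header itself; a hedged sketch using fetch (data is assumed to be defined as in the question):

fetch('http://localhost:8080/abc', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(data),
});
// The browser adds: Origin: http://localhost:3000
// and the server, configured as above, echoes it back in Access-Control-Allow-Origin.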
I've no idea if any of that is what the standard specifies (or not). I encourage you to read through it, and if it doesn't comply, please open an issue on GitHub. I would do it myself, but I'm done with programming for today.
Cheers!

File upload : who is responsible for setting HTTP headers

I'm trying to understand how HTTP file uploads work.
For instance, my VueJS app is calling a REST API (with Axios). When calling axios.request, no headers are set. There is just a FormData object containing the file to upload.
When the request arrives to the backend, I see that a Content-Type: multipart/form-data; ... header has been added to the request.
At which moment is this header created? Who is responsible for creating the header?
For a file upload Ajax request in Axios, it's the browser that sets the Content-Type: multipart/form-data;... header.
In the Axios source code, lib/adapters/xhr.js (the adapter that handles XMLHttpRequest) checks the HTTP request data. If it is an instance of FormData, the Content-Type header is deleted so the browser can do the job.
In lib/adapters/xhr.js (look at the comment in the source code):
if (utils.isFormData(requestData)) {
  delete requestHeaders['Content-Type']; // Let the browser set it
}
For utils.isFormData(), the logic is:
// code in lib/utils.js
function isFormData(val) {
  return (typeof FormData !== 'undefined') && (val instanceof FormData);
}
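So in practice you just pass the FormData to Axios and let the browser fill in the boundary. A minimal sketch (the field name and URL are placeholders):

const form = new FormData();
form.append('file', fileInput.files[0]); // fileInput is assumed to be an <input type="file">

// No Content-Type is set here; the browser adds something like
// "Content-Type: multipart/form-data; boundary=----WebKitFormBoundary..." on its own.
axios.post('/api/upload', form);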

Send request with body with openresty lua-resty-http module

I am trying to send a request via the lua-resty-http module. How can I send a request with body data?
I have tried this:
hc:connect("127.0.0.1", 82)
dates = ngx.req.get_post_args()
local hc = http:new()
result, errors = hc:request{
path = requrl,
method = "POST",
body = dates,
headers = {
["Host"] = "localhost",
},
}
Basically I am trying to send a Lua table to another server location, and I'd also like to know how to capture that Lua table at the other location.
I'd appreciate a detailed explanation.
ngx.req.get_post_args() returns a table of key/value pairs. The body argument for the HTTP client's request function must be in a format supported by OpenResty's cosocket send API, which means either a string or an array-like table holding strings.
If you want to send a Lua table with an HTTP request, you'll need a way to encode it to a string. A common approach is JSON, which you can do with the bundled cjson library:
local json = require "cjson"
local dates = ngx.req.get_post_args()
hc:request {
body = json.encode(dates),
...
}

How do I inject new request header with json data in proxy request flow?

I am trying to inject a new request header into the proxy request flow using a JavaScript policy, to be sent to the backend server. When I look at the debug trace, I see that the JSON data in the request header is distorted.
I am trying to inject a string like:
{"scope":"","time_till":2264,"id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM","custom_data":{"c_id":"test_data"}}
But when I look at the trace window I see this
{"scope":"","time_till":2264,id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM,"custom_data":{"c_id":"test_data"}}
What am I doing wrong?
var obj = {"scope":"","time_till":2264,"id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM","custom_data":{"c_id":"test_data"}};
var header_str = JSON.stringify(obj);
context.setVariable('json-header',header_str);
request.headers['x-json-hedar']= header_str;
I tested your code and it seems to work. Here's an example response where I set the header string as a response header:
HTTP/1.1 200 OK
User-Agent: curl/7.30.0
Accept: */*
x-json-header: {"scope":"","time_till":2264,"id_1":"hUXLXVqpA1J4vA9sayk2UttWNdM","custom_data":{"c_id":"test_data"}}
Content-Length: 0
It appears this is only an issue with the Apigee debug session/trace tool, as the header value was set correctly. Here is the JSON download of the debug session showing this header value:
{
"name": "x-json-header",
"value": "{\"scope\":\"\",\"time_till\":2264,id_1\":\"hUXLXVqpA1J4vA9sayk2UttWNdM,\"custom_data\":{\"c_id\":\"test_data\"}}"
}
You can see that the value passed to the UI for displaying the debug info has the malformed json:
id_1\":\"hUXLXVqpA1J4vA9sayk2UttWNdM,
This does not appear to be just a problem with the Apigee debug/trace UI; I see the malformed JSON trickle down to my backend service.
Here is the header I'm trying to send:
{"timeStamp":"2349218349381274","latitude":"34.589","longitude":"-37.343","clientIp":"127.0.0.0","deviceId":"MOBILE_TEST_DEVICE_AGAIN","macAddress":"23:45:345:345","deviceType":"phone","deviceOS":"iOS","deviceModel":"iPhone 5S","connection":"5G","carrier":"Vodafone","refererURL":"http://www.google.com","xforwardedFor":"129.0.0.0","sessionId":"kfkls498327ksdjf","application":"mobile-app","appVersion":"7.6.5","serviceVersion":"1.0","userAgent":"Gecko"}
But Apigee reads the header as below. Note the missing start quotes from some fields.
{"timeStamp":"2349218349381274",latitude":"34.589,longitude":"-37.343,clientIp":"127.0.0.0,deviceId":"MOBILE_TEST_DEVICE_AGAIN,macAddress":"23:45:345:345,deviceType":"phone,deviceOS":"iOS,deviceModel":"iPhone 5S,connection":"5G,carrier":"Vodafone,refererURL":"http://www.google.com,xforwardedFor":"129.0.0.0,sessionId":"kfkls498327ksdjf,application":"mobile-app,appVersion":"7.6.5,serviceVersion":"1.0,"userAgent":"Gecko"}
The header is used in a service callout to a backend service, which parses it, and rightly so I get the error below:
com.fasterxml.jackson.core.JsonParseException: Unexpected character ('l' (code 108)): was expecting double-quote to start field name
at [Source: java.io.StringReader#22549cdc; line: 1, column: 35]
at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1378)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportError(ParserMinimalBase.java:599)
at com.fasterxml.jackson.core.base.ParserMinimalBase._reportUnexpectedChar(ParserMinimalBase.java:520)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._handleUnusualFieldName(ReaderBasedJsonParser.java:1275)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._parseFieldName(ReaderBasedJsonParser.java:1170)
at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:611)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:301)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:121)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:2796)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:1942)
I've encountered strange behaviour when adding JSON to a context variable, for example like the following:
var header_str = JSON.stringify(obj);
context.setVariable('json-header',header_str);
I appreciate this is an example, so you may not have included the full extent of the problem, but this normally works (note that here it is not added to a variable first):
request.headers['x-json-header'] = JSON.stringify(obj);
Code like this also works, if you can send the request from JavaScript:
var headers = {"Accept": "application/json", "Accept-Language": "en"};
var sessionRequest = new Request(url, 'POST', headers, body);
var exchange = httpClient.send(sessionRequest);
exchange.waitForComplete()
if (exchange.isSuccess()){
var responseObj = exchange.getResponse().content.asJSON;
if (responseObj.error){
request.content += JSON.stringify(responseObj);
}
}
Also, I have had success using an AssignMessage policy to build a request, followed by a ServiceCallout policy that reads the stored request, makes the call, and stores the result in a response object, which can then be read by an ExtractVariables policy.
