I am writing an Nginx module; the client posts stream data to it. But I found the client gets blocked after sending about 64KB of data (the HTTP body's size). I tried changing the Nginx config directive client_body_buffer_size; after that the client can send more data.
r->request_body_no_buffering = 1;
ngx_http_read_client_request_body(r, body_handler);
Do I have to do something in the body_handler function to release request_body->bufs?
I tried using ngx_free_chain, but it doesn't work.
I am using pdfkit to generate a PDF at runtime and returning it in the HTTP response for download. I am able to download the file at the browser end, but the download dialog does not open immediately; instead it waits until doc.end is called. I guess pdfkit is unable to push the stream efficiently. Has anybody else faced this? If yes, please guide.
Here is the sample code I am trying:
const functions = require('firebase-functions');
const PDFDocument = require('pdfkit');

exports.testPdfKit = functions.https.onRequest((request, response) => {
  // create the pdf document
  const doc = new PDFDocument();
  // set the headers before writing any body data
  response.writeHead(200, {
    'Content-Type': 'application/pdf',
    'Content-Disposition': 'attachment; filename=testpdfstream.pdf'
  });
  doc.pipe(response);
  const bigText = 'some big text';
  for (let i = 0; i < 1000; i++) {
    console.log('inside iteration -', i);
    doc.text(bigText);
    doc.addPage();
  }
  doc.end();
});
I am implementing this functionality on Firebase Functions, which uses Express.js internally for processing HTTP requests. To generate bigger files at my end, streaming is a must for me.
HTTP functions cannot stream the input or output of the function. The entire request is delivered in one chunk of memory, and the response is collected and sent back to the client in one chunk. The maximum size of both is 10MB. There are no workarounds for this limitation of Cloud Functions (but it does help your system scale better).
If you need streaming or websockets, you'll need to use a different product, such as App Engine or Compute Engine.
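Since the platform buffers rather than streams, the effective behaviour is equivalent to collecting the whole payload in memory before sending it. A minimal sketch of that behaviour (bufferResponse is a hypothetical helper, and 10MB is the limit quoted above):

```javascript
// Sketch: Cloud Functions effectively collects every chunk, concatenates
// them once, and sends a single payload -- nothing reaches the client
// until the whole body (at most 10MB) is assembled.
const MAX_RESPONSE_BYTES = 10 * 1024 * 1024;

function bufferResponse(chunks) {
  // accept strings or Buffers, as a response stream would
  const body = Buffer.concat(chunks.map((c) => Buffer.from(c)));
  if (body.length > MAX_RESPONSE_BYTES) {
    throw new Error('response exceeds the 10MB Cloud Functions limit');
  }
  return body;
}
```

This is why the download dialog only appears after doc.end(): the client sees nothing until the whole body exists.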
I'm new to gRPC programming. I need to write a gRPC client which receives file content sent from the gRPC server and sends this content to a web page. As the file content is huge, I've set it up as a stream.
Below is my .proto file
service LogService {
  rpc fetchLogContent(LogRequest) returns (stream LogResponse);
}

message LogRequest {
  string ip = 1;
  string fileName = 2;
}

message LogResponse {
  string ip = 1;
  string logContent = 2;
}
Now, in the client, when I use the blockingStub to access fetchLogContent, it returns an Iterator. I understand from the examples provided at grpc.io - java that if there is a list of response objects (the list of Feature objects in the linked example), an Iterator is valid. But in my case, I need a single LogResponse which was sent as a stream. Please provide any suggestions/alternatives. Thanks in advance. :)
In your method definition:
rpc fetchLogContent(LogRequest) returns (stream LogResponse);
The stream keyword means that you will get 0 or more LogResponse messages. If the content is very large, what you can do is read chunks of the file (say, 4 KB at a time) and send multiple LogResponse messages, each with a part of the file. The client side can read the chunks repeatedly and piece them back together.
Since the ip field will likely not be changing each time, you can make your server set ip only on the first message. On the client, just use the very first ip received.
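The chunk-and-reassemble scheme described above is language-agnostic; here is a minimal sketch of it in JavaScript (chunkContent and reassemble are hypothetical helpers, not part of the gRPC API — the real server would put each chunk in a LogResponse's logContent field):

```javascript
// Server side: split the log content into fixed-size pieces,
// one piece per streamed LogResponse message.
function chunkContent(content, chunkSize) {
  const chunks = [];
  for (let i = 0; i < content.length; i += chunkSize) {
    chunks.push(content.slice(i, i + chunkSize));
  }
  return chunks;
}

// Client side: consume the stream of chunks in order and
// concatenate them back into the original content.
function reassemble(chunks) {
  return chunks.join('');
}
```

As long as chunks arrive in order (which gRPC guarantees within a single stream), the reassembled string is byte-for-byte identical to the original file content.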
I would like to write an HTTP server that answers requests made with a non-standard HTTP method (verb). For instance, the client would make a request like FOO / HTTP/1.1, and on the server side this request would be handled by something like:
var express = require('express');
var app = express.createServer();
app.configure(function(){
app.use(express.logger({ format: ':method :url' }));
app.use(express.methodOverride());
});
app.foo('/', function(req, res){
res.send('Hello World');
});
app.listen(3000);
I appended my non-standard method to the array exported in ExpressJS's lib/router/methods.js. This allows me to write my server code as expected. When using express.methodOverride() and a POST request with _method=foo, it works. But an actual FOO request doesn't work. As soon as the client sends the first line of the request, the connection is closed by the server:
$ telnet localhost 3000
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
FOO / HTTP/1.1
Connection closed by foreign host.
I would like to be able to implement this with ExpressJS, without hacking into its core files.
Any idea if this is possible, and how?
Short answer: No, it's not possible. Not without implementing your own HTTP module.
To test, start a barebones HTTP server ...
$ node
> require('http').createServer(function(req, res) {
... console.log(req.method);
... res.end();
... }).listen(8080);
Then (as you've already done) telnet to it and issue a GET and FOO request ...
$ telnet localhost 8080
Trying ::1...
telnet: connect to address ::1: Connection refused
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
GET / HTTP/1.1
HTTP/1.1 200 OK
Connection: keep-alive
Transfer-Encoding: chunked
0
FOO / HTTP/1.1
Connection closed by foreign host.
$
In node console you'll see
GET
... but no FOO. So, node's native HTTP module, which Express uses, does not make these requests available.
Node has a hard-coded whitelist of acceptable HTTP verbs in C.
In order to accept custom verbs, you must modify the HTTP parser and recompile node.
You mentioned that you're trying to implement PURGE, which was added to the whitelist in v0.7.5.
As others have said, Node.js' HTTP server library is configured to accept only specific verbs. Ben Noordius' suggestion of using Parsley doesn't work either, since that library accepts an even smaller whitelist of verbs. (It also hasn't been maintained in quite some time.)
At this stage, if we want to support oddball requests, we have to take more drastic measures. Here's a nice ugly hack for you that involves duck punching some internal behavior. This works on v0.10.x of Node.js, but test carefully on newer versions as they become available.
In my case, I needed to support not only a non-standard verb, but a non-standard protocol version identifier as well, and a missing Content-Length header for Icecast source streams:
SOURCE /live ICE/1.0
The following should get you started:
server.on('connection', function (socket) {
  var originalOnDataFunction = socket.ondata;
  var newLineOffset;
  var receiveBuffer = new Buffer(0);
  socket.ondata = function (d, start, end) {
    receiveBuffer = Buffer.concat([receiveBuffer, d.slice(start, end)]);
    if ((newLineOffset = receiveBuffer.toString('ascii').indexOf('\n')) > -1) {
      var firstLineParts = receiveBuffer.slice(0, newLineOffset).toString().split(' ');
      // Rewrite the request line so Node's parser will accept it
      firstLineParts[0] = firstLineParts[0].replace(/^SOURCE$/ig, 'PUT');
      firstLineParts[2] = firstLineParts[2].replace(/^ICE\//ig, 'HTTP/');
      receiveBuffer = Buffer.concat([
        new Buffer(
          firstLineParts.join(' ') + '\r\n' +
          'Content-Length: 9007199254740992\r\n'
        ),
        receiveBuffer.slice(newLineOffset + 1)
      ]);
      // Restore the original handler and replay the rewritten bytes
      socket.ondata = originalOnDataFunction;
      socket.ondata.apply(this, [receiveBuffer, 0, receiveBuffer.length]);
    }
  };
});
It's ugly, but works. I'm not particularly happy about it, but when choosing between a rough built-from-the-ground-up HTTP parser or tweaking an existing one, I choose to tweak in this instance.
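The crux of the hack is the request-line rewrite, which can be pulled out and sanity-checked on its own. A minimal sketch, assuming the same SOURCE/ICE-to-PUT/HTTP mapping as above (rewriteRequestLine is a hypothetical helper):

```javascript
// Rewrite "SOURCE /live ICE/1.0" into "PUT /live HTTP/1.0" so that
// Node's parser accepts it -- the same transformation the ondata
// hook above performs on the raw bytes. Lines that already look
// like standard HTTP pass through unchanged.
function rewriteRequestLine(firstLine) {
  const parts = firstLine.split(' ');
  if (parts.length < 3) return firstLine; // not a request line; leave it alone
  parts[0] = parts[0].replace(/^SOURCE$/i, 'PUT');
  parts[2] = parts[2].replace(/^ICE\//i, 'HTTP/');
  return parts.join(' ');
}
```

Testing this function in isolation is much easier than driving raw sockets at a patched server.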
For anyone who needs it, there is http-parser-js, which replaces Node's built-in HTTP parser.
Their README contains an example of monkey-patching the parser, though I find that it wasn't enough, as both the http-parser-js and the http modules have a hardcoded list of methods.
So, you have to replace the parser and edit the list of methods:
const { HTTPParser } = require('http-parser-js');
HTTPParser.methods.push('FOOBAR');
const binding = process.binding('http_parser');
binding.HTTPParser = HTTPParser;
binding.methods = HTTPParser.methods;
require('http').METHODS = HTTPParser.methods;
Later Node versions may not support process.binding, in which case, you can use the --expose-internals flag for Node (see this issue):
const { internalBinding } = require('internal/test/binding');
const binding = internalBinding('http_parser');
From the looks of it, the http2 module's parser accepts any method, in case that's an option. See this issue about invalid HTTP methods. Unfortunately, Express and the like do not use http2.
And for anyone who was in my shoes, proxying requests to a legacy server in Create React App, use the above snippet in webpack-dev-server, at the top of Server.js, in order to monkey-patch the parser. Hopefully everything switches to http2 soon...
I need to check if a file exists on a server using Delphi.
The idea is to send a request to the server (e.g. http://www.example.com/file.txt) and check the status code of the response.
How is this done in Delphi?
You can use the TIdHTTP class (included with Delphi, from the Indy components). Simply create an instance at run time and use its Head method to retrieve information about the server resource:
MyIdHTTP := TIdHTTP.Create(nil);
try
  MyIdHTTP.Head(TheURL);
  ResponseCode := MyIdHTTP.Response.ResponseCode; // 200 = OK, etc.
  ContentLength := MyIdHTTP.Response.ContentLength;
finally
  MyIdHTTP.Free;
end;
Note that Head will not download the whole resource, and the value in ContentLength is not guaranteed to be correct (for example, for dynamically created resources).
You can also use the FTP protocol for this purpose.