openresty on body_filter_by_lua send socket - nginx

I have a requirement: when the server returns a response, I need to send a request to another server. But OpenResty says the API is disabled in the context of body_filter_by_lua*. I am using the resty.http module.
thanks

You can change the main logic.
First, issue a subrequest to your upstream (ngx.location.capture or lua-resty-http).
Upon success, you may first send the response downstream from Lua code and then issue the next subrequest to your "other server" from Lua.
UPDATE: this first approach doesn't work.
As a second approach, you may treat your "other server" as the upstream and allow the request to reach this upstream only if a subrequest to the original server succeeds.
In both scenarios you can use access_by_lua* and content_by_lua*, where the cosocket API is available.
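A minimal sketch of the second approach, assuming hypothetical hostnames original-server.example.com and other-server.example.com; the internal location lets ngx.location.capture reach the original server from access_by_lua*, where subrequests are allowed:

```nginx
# Internal location fronting the original server, reachable only
# via ngx.location.capture.
location /original-internal {
    internal;
    proxy_pass http://original-server.example.com;
}

location / {
    access_by_lua_block {
        -- Subrequest APIs are available in access_by_lua*.
        local res = ngx.location.capture("/original-internal")
        if res.status >= 300 then
            -- The original server failed, so do not hit the other server.
            return ngx.exit(ngx.HTTP_BAD_GATEWAY)
        end
    }
    # Only reached if the subrequest above succeeded.
    proxy_pass http://other-server.example.com;
}
```

Note the client then receives the response of the "other server", which is the trade-off of this approach.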

Related

what's mean "the request cannot be passed to the next server if nginx already started sending the request body"

http://nginx.org/en/docs/http/ngx_http_proxy_module.html#proxy_request_buffering
I can't understand this part: "When buffering is disabled, the request body is sent to the proxied server immediately as it is received. In this case, the request cannot be passed to the next server if nginx already started sending the request body."
What does "the request cannot be passed to the next server if nginx already started sending the request body" mean?
If you have a service with multiple upstream servers, perhaps for load balancing or resilience, and one of the upstream servers fails while nginx is sending the request body to it, nginx may try another server.
But it can only try another server if it has a complete copy of the original request body (i.e. request buffering is enabled) or the client has not started sending the request body yet.
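The interaction can be sketched with the relevant directives; hostnames here are placeholders:

```nginx
upstream backend {
    server backend1.example.com;
    server backend2.example.com;
}

server {
    location / {
        # Default. nginx buffers the full request body before proxying,
        # so on error it still has a complete copy to replay to the
        # next server.
        proxy_request_buffering on;

        # Conditions under which nginx tries the next upstream server.
        proxy_next_upstream error timeout;

        proxy_pass http://backend;
    }
}
```

With `proxy_request_buffering off`, the body streams straight through, and once the first byte has been sent to backend1 there is nothing left to replay, so failover cannot happen.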

Dynamically redirect all requests with the same url path to the same upstream server

I'm trying to create a basic nginx load balancer that routes requests with the same URL path to the same upstream server, without relying on the sender's cookies or IP.
For example, let's say I have an LB called A.com and a target called A.com/target.
The first time a request is sent to A.com/target, the LB routes it to a random server in the upstream; so far so good. The problem is that on the second request with the same URL path (it doesn't matter who makes that request, therefore cookies are out of the question) I need the LB to route to the same server it chose last time.
I just can't get it to work properly, so I wanted to ask if anybody has a proper way to do it with nginx or with any other simple LB.
You can balance by URL with the help of the nginx upstream hash feature, introduced in version 1.7.2.
The official docs are here.
upstream backend {
    hash $scheme://$host$request_uri; # put any variables here
    server backend1.example.com;
    server backend2.example.com;
    server backend3.example.com;
}
You might also consider adding the "consistent" parameter at the end of the hash line in order to avoid massive rehashing when an upstream server changes (addition or removal). HAProxy is also an option, with its "hash-type consistent" parameter in the backend section.
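With the consistent parameter the block above becomes:

```nginx
upstream backend {
    # "consistent" enables ketama consistent hashing: adding or removing
    # a server remaps only a fraction of the keys instead of nearly all
    # of them.
    hash $scheme://$host$request_uri consistent;
    server backend1.example.com;
    server backend2.example.com;
    server backend3.example.com;
}
```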

Nginx - Reacting to Upstream Response

I am using nginx as reverse proxy for file storage upload with an external provider.
When I am processing a file upload, I need to record in my database whether the upload was successful before returning the response to the user. I would therefore like to use the ngx.location.capture method provided by the lua-nginx-module to tell my backend about the outcome of the request. Since I need to wait for the response of the upstream server, I can only issue the capture in header_filter_by_lua. Unfortunately, I cannot issue any outward communication in header_filter_by_lua: ngx.location.capture, ngx.socket.* and ngx.exec are only available before the response has arrived.
How can I react to an upstream response in nginx?
Other approaches I've thought about:
Have a script watch the access log and then issue a curl request. (It seems like there should be an easier way.)
Initially send the file via ngx.location.capture in content_by_lua. (I don't think this would handle file sizes of up to 5 GB.)
Help is appreciated :)
For the /upload location, use content_by_lua_file together with the resty.upload module.
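A rough sketch of that idea (inlined as content_by_lua_block for brevity): because the upload is handled in the content phase, cosockets are available after it finishes, so the outcome can be reported to the database backend. The hostnames and the forwarding step are placeholders.

```nginx
location /upload {
    content_by_lua_block {
        local upload = require "resty.upload"
        local http   = require "resty.http"

        -- Stream the request body in 4 KB chunks instead of buffering
        -- the whole (possibly multi-GB) file in memory.
        local form, err = upload:new(4096)
        if not form then
            ngx.log(ngx.ERR, "failed to init upload: ", err)
            return ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
        end

        while true do
            local typ, res, err = form:read()
            if not typ then
                ngx.log(ngx.ERR, "failed to read chunk: ", err)
                return ngx.exit(ngx.HTTP_BAD_REQUEST)
            end
            -- ... forward each chunk to the storage provider here ...
            if typ == "eof" then
                break
            end
        end

        -- Cosockets work in content_by_lua*, so report the outcome.
        local httpc = http.new()
        local res, err = httpc:request_uri("http://db-backend.internal/uploads", {
            method = "POST",
            body   = "status=ok",
        })
        if not res then
            ngx.log(ngx.ERR, "failed to notify backend: ", err)
        end

        ngx.say("upload complete")
    }
}
```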

What is the best way to redirect network requests?

I've written my own HTTP Server, but given certain criteria, I want to redirect some requests made to my server to another server running on the same machine. For example, I may want to redirect all requests to "/foo/*" to be handled by an apache server I also have running. What is the best way to do this?
The only way I can think of doing this is by running Apache on a different port, and then making a completely new network request from my server to localhost:1234 (assuming Apache is running on port 1234) with the exact same request headers and body, and then taking the response and having my server send it back to the client.
That seems like a kind of hacky, roundabout way of accomplishing this, though, and I'm sure this is a problem tackled by every major website. Is there a certain technology or protocol for doing this that I just haven't heard of?
Thanks a lot!
Edit: Just to be clear, the client should only make one network request for all this, rather than having my server return a 3xx response
HTTP runs over TCP. The Apache server can't just send the required response to a client who hasn't asked for it. The client has asked YOUR HTTP server for the data and so it must be the one to send a response. The client is probably behind a firewall and, as such, the Apache server can't even establish a TCP connection with it (incoming connections are usually blocked).
If your server takes the clients request, forwards it to the Apache server, gets the response from the Apache server and forwards it to the client, it's acting as a proxy server (a middleman). This won't be redirection.
The only sensible way to do an actual redirect would be to have the client make two network requests.
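The "certain technology" the question asks about is a reverse proxy, which is exactly the forwarding scheme described above. As a reference point for what that looks like in practice, here is a hedged sketch of how nginx expresses it, assuming Apache listens on port 1234 and paths chosen for illustration:

```nginx
server {
    listen 80;

    # All /foo/* requests are forwarded to Apache over a second TCP
    # connection; the client still makes only one request.
    location /foo/ {
        proxy_pass http://127.0.0.1:1234;
        proxy_set_header Host $host;
    }

    # Everything else is served directly.
    location / {
        root /var/www/html;
    }
}
```

A custom server would implement the same pattern by hand: open a connection to localhost:1234, replay the request, and stream the response back.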

Use timeouts in a HTTP Server?

Should I use timeouts in an HTTP server implementation?
E.g. if I get a request and create an HTTP connection that listens for requests in a separate thread, should this thread use timeouts?
Currently I don't use timeouts in debug code, only in production code, so as to find the lockups in the server.
As long as you adhere to the HTTP specification, I don't foresee problems.