Nginx Lua Scripting

One of the really cool things about Nginx is that you can take control of what it does by injecting Lua scripts at various phases of request processing. I have successfully used the rewrite_by_lua_file directive to examine the body of the incoming request and inject extra request headers for downstream processing by PHP, e.g.
location /api {
    lua_need_request_body on;
    rewrite_by_lua_file "/path/to/rewrite.lua";
}
and then in rewrite.lua
local uri = ngx.var.request_uri;
-- examine the URI and inject additional headers
ngx.req.set_header('headerName', 'headerValue');
What I can also do at this stage is inject response headers. For example
local cookieData = "cookieName=value;path=/;";
ngx.header['Set-Cookie'] = cookieData;
No issues thus far, but this is not quite what I want to do. The workflow I have in mind goes like this:
1. Examine the URI and inject extra request headers.
2. My PHP scripts examine the extra request headers, process the incoming data as required, and may inject extra response headers.
3. I then want to examine those response headers in another Nginx Lua script and inject cookies at that stage.
Injecting extra response headers via my PHP script is no problem at all. I thought that in order to examine those extra headers I would just need to set up
location /api {
    header_filter_by_lua_file "path/to/header.lua";
}
with header.lua doing things like
local cookieData = "cookieName=value;path=/;";
ngx.header['Set-Cookie'] = cookieData;
The principle sounds perfect. However, I find that my header_filter_by_lua_file directives just get ignored; no errors are reported in the Nginx log when I reload the configuration.
I must be doing something wrong here, but I cannot see what it might be. I am using nginx 1.6.2 on Ubuntu 14.10 (x64), with Nginx installed via apt-get install nginx-extras.
I'd be most grateful to anyone who might be able to explain how to get header_filter_by_* functioning correctly.

It's not a feature of vanilla nginx. You need to install openresty (instead of nginx).
See http://openresty.org/#Download and then http://openresty.org/#Installation
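For reference, once you are on an OpenResty build (so the header filter phase is actually available), a minimal sketch of the final step could look like the following; the response header name X-Set-Login-Cookie is only an illustrative placeholder for whatever your PHP script injects:

location /api {
    lua_need_request_body on;
    rewrite_by_lua_file "/path/to/rewrite.lua";
    header_filter_by_lua_file "/path/to/header.lua";
}

and in header.lua:

-- read a response header injected by the PHP backend (placeholder name)
local flag = ngx.resp.get_headers()["X-Set-Login-Cookie"]
if flag then
    -- turn it into a cookie and strip the internal header before it reaches the client
    ngx.header["Set-Cookie"] = "cookieName=value; Path=/"
    ngx.header["X-Set-Login-Cookie"] = nil
end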

Related

Logging Response body in nginx without installing any other tool like Lua etc

I am working on a project which uses nginx as the web server; nginx logs the request body by default. I need to log the response body (the response is itself in JSON format) as JSON, but I cannot use any supporting tool like Lua, etc., because my client does not permit it.
I did it using OpenResty, but that was not approved.
Is it possible to achieve this with a plain nginx installation?
If yes, please help me.

Why is my custom header not present sometimes?

How do I get my custom header all the way to my Rails application when running behind nginx and Phusion Passenger? It is possible (see details below), but when I just use the headers pane in Paw it is not passed through.
I am using Paw to test and develop some API endpoints in a Rails application. Everything works as expected in my development environment, which is a Rails 6 application running on macOS using Puma. For security, I use a custom header that contains a personal auth token. When I examine the Rails request object, specifically request.headers, I am able to see all the headers including my custom header and I can authenticate based on its value.
The problem comes when running on my staging system, where I have very little control over the environment. There, the same Rails application runs under Phusion Passenger behind nginx. When I hit the same endpoint with the same request, changing only the host, the custom header is not present. I verified this by writing all headers to a file for every request in staging.
Where has the header gone? Because the staging environment is different, I suspect that nginx or Phusion Passenger is receiving the header but not passing it through to my Rails application. I can't verify this, since I have no access to any logs other than my Rails application's own. The application is designed to receive requests from an external service, and when I send requests through that service the header is present. So some headers are being passed through and some are not, which is very strange.
Paw (header defined under headers pane).
I checked with cURL using:
curl -X POST 'https://example.com/ivr/main_menu' -H 'X_JSW_AUTH_TOKEN: my_tkn'
With Ruby's Net::HTTP:
uri = URI('https://example.com/ivr/main_menu')
http = Net::HTTP.new(uri.host, uri.port)
http.use_ssl = true
http.verify_mode = OpenSSL::SSL::VERIFY_PEER
req = Net::HTTP::Post.new(uri)
req.add_field "X_JSW_AUTH_TOKEN", "my_tkn"
res = http.request(req)
With HTTPie:
http POST 'https://example.com/ivr/main_menu' 'X_JSW_AUTH_TOKEN':'my_tkn'
With the http gem from httprb:
resp = HTTP.headers(X_JSW_AUTH_TOKEN: "my_tkn").post("https://example.com/ivr/main_menu")
It seems the answer has nothing to do with Paw. There's pretty solid evidence of this, since cURL, HTTPie, and Net::HTTP all give the same result as Paw.
The problem is how nginx treats headers with underscores by default: it ignores them. See "Why do HTTP servers forbid underscores in HTTP header names" for more information.
The http gem (httprb), which was able to deliver the header to my application, did so because it replaces underscores ("_") with hyphens ("-") in header names, so the header made it through to the Rails application. Rails then replaces hyphens in request header names with underscores. So both the sender (the http gem) and the receiver (Rails) were making substitutions behind the scenes, which made this much harder to troubleshoot.
Many thanks to Chris Oliver for the answer.
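For completeness, the nginx-side fix is a single directive, sketched below on the assumption that you (or your host) can edit the relevant server block; the alternative that needs no server change is simply to name the header with hyphens (e.g. X-JSW-Auth-Token):

server {
    server_name example.com;

    # nginx silently drops request headers whose names contain underscores
    # by default; this tells it to accept and forward them instead
    underscores_in_headers on;

    # the existing Passenger / proxy configuration stays as it is
}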

Building URLs in Go including server scheme

I am creating a REST API in Go, and I want to build URLs to other resources in my replies.
Based on the incoming http.Request I can get the Host and URL.
However, how would I go about getting the transport scheme used by the server? http or https?
I attempted to check whether server.TLSConfig is nil and, if so, assume it is using http, since the documentation for http.Server says:
TLSConfig *tls.Config // optional TLS config, used by ListenAndServeTLS
But it turns out this exists even when I do not run the server with ListenAndServeTLS.
Or is this way of building my URLs the wrong way of doing things? Is there some other normal way of doing this?
My preferred solution when running http and https is just to run a simple listener on :80 that redirects all traffic to https. Then any real traffic can be assumed to be https.
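A minimal sketch of that pattern is below; the certificate paths and the trivial handler body are placeholders, not something from the original question:

package main

import (
    "log"
    "net/http"
)

func main() {
    // plain-HTTP listener: answer every request with a permanent redirect
    // to the same host and path over HTTPS
    go func() {
        log.Fatal(http.ListenAndServe(":80", http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            target := "https://" + r.Host + r.URL.RequestURI()
            http.Redirect(w, r, target, http.StatusMovedPermanently)
        })))
    }()

    // the real API server: anything that reaches it came in over TLS,
    // so generated links can safely use the https scheme
    http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
        w.Write([]byte("https://" + r.Host + r.URL.Path))
    })
    log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", nil))
}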
Alternately, I believe you can access a request's URL at req.URL.Scheme to see the protocol, though in practice req.TLS being non-nil is the more reliable signal for a request a server receives.
Or do you mean for the entire application? If you accept configuration to switch between http and https, then can't you look at that and see which they chose? I guess I'm missing some context maybe.
It is also common practice for apps to take a base URL via a flag or config file and use it to generate external URLs.
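As a rough sketch of that config/flag approach (the flag name base-url and its default value are purely illustrative):

package main

import (
    "flag"
    "fmt"
)

func main() {
    // external base URL used when generating absolute links in API replies;
    // operators set it to match however the service is actually exposed
    baseURL := flag.String("base-url", "https://api.example.com", "external base URL for generated links")
    flag.Parse()

    fmt.Println(*baseURL + "/v1/things/42")
}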

How do I make meteor server HTTP.call through proxy?

My meteor server will fetch data from another source on Internet. The request has to go via a proxy. How can I specify the proxy server for server-side HTTP.call's?
You could easily make all HTTP.* calls go through a proxy if only the Meteor developers accepted my pull request to pass options like proxy through to the request module, on which the HTTP package is based.
Please comment on this GitHub issue to ask for that.
UPDATE: Since the Meteor devs declined to implement that change, I published an Atmosphere package that lets you pass any options you want through to Node (i.e. to the request module).
Check out http-more on Atmosphere.
Found a solution for my problem.
I'm using Windows and could not find a way to set a default proxy for the OS as Serkan suggested. Setting a proxy server in Internet Explorer's internet options LAN settings did not work, and setting a proxy in WinHTTP did not work either. Does anyone else know how to do it?
The most reasonable behaviour would be for Node to read an environment variable and use that. So I created an environment variable HTTP_PROXY and, to see whether Node would read it, I tried:
D:\Appl\.meteor\tools\a5dc07c9ab\bin>node -e "console.log(process.env.http_proxy)"
and it did output my variable. But when I tried to make an http.get() request directly within Node, it failed; Node is obviously not using that variable.
The conclusion is that I have to set the proxy explicitly in my app, but that is not possible with Meteor HTTP. Instead I can use the request module (which Meteor HTTP uses under the hood) and set the proxy there. Not the ideal solution, because my app has to know about the proxy, but okay for my purpose.
if (Meteor.isServer) {
    // include the request module that Meteor HTTP itself is built on
    var request = Npm.require("request");

    function thirdLibMakeRequest(options, callback) {
        // set the proxy property understood by the request module
        options.proxy = "http://myProxyServer:8080";
        request(options, callback);
    }

    // wrap the 3rd-lib async method so it can be called synchronously in Meteor
    var makeRequest = Meteor._wrapAsync(thirdLibMakeRequest);

    var response = makeRequest({ url: "http://UrlToSomeSite" });
}
1. Include the request module.
2. Wrap the 3rd-lib async method so we can use it in Meteor.
3. Set the proxy property of the request module.
4. Use makeRequest to make requests.
Since the platform your Meteor app runs on will be behind the proxy as a whole, you'll need proxy access in general anyway.
You can therefore set your platform (OS) up to connect to the proxy server by default; Meteor will then not need to know or care about the presence of a proxy, since it will be transparent to it.

Tamper with first line of URL request, in Firefox

I want to change the first line (the request line) of my HTTP request, modifying the method and/or URL.
The (excellent) TamperData Firefox plugin allows a developer to modify the headers of a request, but not the URL itself. The latter is what I want to be able to do.
So something like...
GET http://foo.com/?foo=foo HTTP/1.1
... could become ...
GET http://bar.com/?bar=bar HTTP/1.1
For context, I need to tamper with (make correct) an erroneous request from Flash, to see if an error can be corrected by fixing the url.
Any ideas? Sounds like something that may need to be done on a proxy level. In which case, suggestions?
Check out Charles Proxy (multiplatform) and/or Fiddler2 (Windows only) for more client-side solutions - both of these run as a proxy and can modify requests before they get sent out to the server.
If you have access to the webserver and it's running Apache, you can set up some rewrite rules that will modify the URL before it gets processed by the main HTTP engine.
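As a rough sketch of such a rule, reusing the example URLs from the question and assuming virtual-host context; note that this is a client-visible redirect rather than a silent rewrite, since the target is a different host:

# on foo.com: send requests for /?foo=foo to bar.com/?bar=bar instead
RewriteEngine On
RewriteCond %{QUERY_STRING} ^foo=foo$
RewriteRule ^/?$ http://bar.com/?bar=bar [R=302,L]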
For those coming to this page from a search engine, I would also recommend the Burp Proxy suite: http://www.portswigger.net/burp/proxy.html
Although more specifically targeted towards security testing, it's still an invaluable tool.
If you're trying to intercept the HTTP packets and modify them on the way out, then TamperData may be the route you want to take.
However, if you want minute control over these things, you'd be much better off simulating the entire browser session using a utility such as curl
Curl: http://curl.haxx.se/
