Is there a way to intercept a bad HEAD request in a Go HTTP server? A bad request here means sending a JSON payload with a HEAD request. I call it a Bad Request because when I attempt a HEAD request with a body via curl, I get a 400 Bad Request response. However, no logging occurs in Go.
package main

import (
    "fmt"
    "log"
    "net/http"
)

func handler(w http.ResponseWriter, r *http.Request) {
    log.Println(r.Method, r.URL)
    _, _ = fmt.Fprintf(w, "Hello")
}

func main() {
    http.HandleFunc("/", handler)
    log.Fatal(http.ListenAndServe(":8080", nil))
}
If I send a curl request without a body, it works as expected and a log entry is generated: 2019/11/28 10:58:59 HEAD /.
$ curl -i -X HEAD http://localhost:8080
Warning: Setting custom HTTP method to HEAD with -X/--request may not work the
Warning: way you want. Consider using -I/--head instead.
HTTP/1.1 200 OK
Date: Thu, 28 Nov 2019 16:03:22 GMT
Content-Length: 5
Content-Type: text/plain; charset=utf-8
However, if I send a curl request with a body, I get a Bad Request status but no log entry is generated.
$ curl -i -X HEAD http://localhost:8080 -d '{}'
Warning: Setting custom HTTP method to HEAD with -X/--request may not work the
Warning: way you want. Consider using -I/--head instead.
HTTP/1.1 400 Bad Request
Content-Type: text/plain; charset=utf-8
Connection: close
400 Bad Request
I want to catch this error so I can send my own custom error message back. How can I intercept this?
You can't. The standard library's HTTP server does not provide any interception point or callback for this case.
The invalid request is "killed" before your handler would be called. You can see this in server.go, in the conn.serve() method:
w, err := c.readRequest(ctx)
// ...
if err != nil {
    switch {
    // ...
    default:
        publicErr := "400 Bad Request"
        if v, ok := err.(badRequestError); ok {
            publicErr = publicErr + ": " + string(v)
        }
        fmt.Fprintf(c.rwc, "HTTP/1.1 "+publicErr+errorHeaders+publicErr)
        return
    }
}
// ...
serverHandler{c.server}.ServeHTTP(w, w.req)
Go's HTTP server provides an implementation that handles incoming requests from clients that adhere to the HTTP protocol. All browsers and notable clients follow the HTTP protocol. It is not the implementation's goal to provide a fully customizable server.
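You can see this behaviour for yourself without curl. Below is a minimal sketch (not part of the original answer, and assuming the example server above is listening on localhost:8080) that writes a raw HEAD request carrying a JSON body over a plain TCP connection and prints the status line that comes back; the handler is never invoked and nothing is logged.

package main

import (
    "bufio"
    "fmt"
    "log"
    "net"
)

func main() {
    // Assumes the example server above is running on localhost:8080.
    conn, err := net.Dial("tcp", "localhost:8080")
    if err != nil {
        log.Fatal(err)
    }
    defer conn.Close()

    // A HEAD request with a JSON payload, written by hand.
    fmt.Fprint(conn, "HEAD / HTTP/1.1\r\nHost: localhost:8080\r\nContent-Type: application/json\r\nContent-Length: 2\r\n\r\n{}")

    // Read and print the status line the server writes back.
    status, err := bufio.NewReader(conn).ReadString('\n')
    if err != nil {
        log.Fatal(err)
    }
    fmt.Print(status) // expected: HTTP/1.1 400 Bad Request
}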
I have a strange situation. I want to return the content type application/json; charset=utf-8 from an http handler.
func handleTest() http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        if r.Header.Get("Accept") != "application/json" {
            w.WriteHeader(http.StatusNotAcceptable)
            return
        }
        w.WriteHeader(http.StatusOK)
        w.Header().Set("Content-Type", "application/json; charset=utf-8")
        json.NewEncoder(w).Encode(map[string]string{"foo": "bar"})
    }
}
When I check for this in my unit tests it is correct. This test does not fail.
func TestTestHandler(t *testing.T) {
    request, _ := http.NewRequest(http.MethodGet, "/test", nil)
    request.Header.Set("Accept", "application/json")
    response := httptest.NewRecorder()

    handleTest().ServeHTTP(response, request)

    contentType := response.Header().Get("Content-Type")
    if contentType != "application/json; charset=utf-8" {
        t.Errorf("Expected Content-Type to be application/json; charset=utf-8, got %s", contentType)
        return
    }
}
But when I try with curl (and other clients) it comes out as text/plain; charset=utf-8.
$ curl -H 'Accept: application/json' localhost:8080/test -v
* Trying 127.0.0.1:8080...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 8080 (#0)
> GET /test HTTP/1.1
> Host: localhost:8080
> User-Agent: curl/7.68.0
> Accept: application/json
>
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Date: Tue, 28 Dec 2021 13:02:27 GMT
< Content-Length: 14
< Content-Type: text/plain; charset=utf-8
<
{"foo":"bar"}
* Connection #0 to host localhost left intact
I have tried this with curl, insomnia and python. In all 3 cases the content type came out as text/plain; charset=utf-8.
What is causing this problem and how can I fix it?
From the http package docs:
WriteHeader sends an HTTP response header with the provided status code.
and
Changing the header map after a call to WriteHeader (or Write) has no effect unless the modified headers are trailers.
So you are setting the "Content-Type" header after the headers have already been sent to the client. In the test this appears to work because httptest.ResponseRecorder just stores the headers in a map that can still be modified after the WriteHeader call, but over a real TCP connection you can't do this.
So simply move your w.WriteHeader(http.StatusOK) so that it happens after the w.Header().Set(...).
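For completeness, a minimal sketch of the corrected handler from the question, with the only change being that the Content-Type is set before WriteHeader is called:

func handleTest() http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        if r.Header.Get("Accept") != "application/json" {
            w.WriteHeader(http.StatusNotAcceptable)
            return
        }
        // Set the header before writing the status line; anything set
        // afterwards is ignored and the content type gets sniffed as
        // text/plain instead.
        w.Header().Set("Content-Type", "application/json; charset=utf-8")
        w.WriteHeader(http.StatusOK)
        json.NewEncoder(w).Encode(map[string]string{"foo": "bar"})
    }
}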
I can't understand what is wrong. ioutil.ReadAll should handle gzip as it does for other URLs.
I can reproduce it with this URL: romboutskorea.co.kr
Error:
gzip: invalid header
Code:
resp, err := http.Get("http://" + url)
if err == nil {
    defer resp.Body.Close()
    if resp.StatusCode == http.StatusOK {
        fmt.Printf("HTTP Response Status : %v\n", resp.StatusCode)
        bodyBytes, err := ioutil.ReadAll(resp.Body)
        if err != nil {
            fmt.Printf("HTTP Response Read error. Url: %v\n", url)
            log.Fatal(err)
        }
        bodyString := string(bodyBytes)
        fmt.Printf("HTTP Response Content Length : %v\n", len(bodyString))
    }
}
This site's response is broken: it claims gzip encoding but does not actually compress the content. The response looks something like this:
HTTP/1.1 200 OK
...
Content-Encoding: gzip
...
Transfer-Encoding: chunked
Content-Type: text/html; charset=euc-kr
8000
<html>
<head>
...
The "8000" comes from the chunked transfer encoding but the "..." is the beginning of the unchunked response body. Obviously this is not compressed even though it is claimed so.
It looks like browsers simply work around this broken site by ignoring the wrong encoding specification. Browsers actually work around lot of broken stuff which does not really add motivation for the providers to fix these issues :( But you can see that curl will fail to:
$ curl -v --compressed http://romboutskorea.co.kr/main/index.php?
...
< HTTP/1.1 200 OK
< ...
< Content-Encoding: gzip
< ...
< Transfer-Encoding: chunked
< Content-Type: text/html; charset=euc-kr
<
* Error while processing content unencoding: invalid code lengths set
* Failed writing data
* Curl_http_done: called premature == 1
* Closing connection 0
curl: (23) Error while processing content unencoding: invalid code lengths set
And so does Python:
$ python3 -c 'import requests; requests.get("http://romboutskorea.co.kr/main/index.php?")'
...
requests.exceptions.ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing data: incorrect header check'))
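The answer above only diagnoses the problem. If you still need to read such a broken response from Go, one possible workaround (a sketch, not something net/http does for you) is to disable the transport's transparent decompression and only gunzip when the body actually starts with the gzip magic bytes:

package main

import (
    "bufio"
    "compress/gzip"
    "fmt"
    "io"
    "log"
    "net/http"
)

func main() {
    // DisableCompression stops the transport from adding Accept-Encoding: gzip
    // and from transparently decompressing the response.
    client := &http.Client{Transport: &http.Transport{DisableCompression: true}}

    resp, err := client.Get("http://romboutskorea.co.kr/main/index.php?")
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()

    br := bufio.NewReader(resp.Body)
    var body io.Reader = br

    // Only trust Content-Encoding: gzip if the body really starts with the
    // gzip magic bytes 0x1f 0x8b; this broken server claims gzip but sends
    // plain text.
    if magic, err := br.Peek(2); err == nil && magic[0] == 0x1f && magic[1] == 0x8b {
        gz, err := gzip.NewReader(br)
        if err != nil {
            log.Fatal(err)
        }
        defer gz.Close()
        body = gz
    }

    data, err := io.ReadAll(body)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println("body length:", len(data))
}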
I see
Content-Type: text/html; charset=euc-kr
Content-Encoding: gzip
Check the Body content: as described here, it could be an HTTP response where the body is first compressed with gzip and then encoded with chunked transfer encoding.
A NewChunkedReader would be needed, as in this example.
I had a similar issue, but I was dealing with a "hand-crafted" PHP script response which did something like this:
header('Content-Encoding: gzip');
echo @gzcompress($return);
I was trying to read the response from GO with:
gzip.NewReader(resp.Body)
But I should be doing:
zlib.NewReader(resp.Body)
From gzcompress PHP docs:
https://www.php.net/manual/en/function.gzcompress.php
'This function compresses the given string using the ZLIB data format.'
'This is not the same as gzip compression, which includes some header data. See gzencode() for gzip compression.'
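For reference, a minimal Go sketch of that fix, assuming the response body really is zlib-format data as produced by gzcompress() (the helper name readZlibBody is just for illustration):

import (
    "compress/zlib"
    "io"
    "net/http"
)

// readZlibBody decompresses a response body produced with PHP's gzcompress(),
// which emits zlib-format data rather than gzip-format data.
func readZlibBody(resp *http.Response) (string, error) {
    zr, err := zlib.NewReader(resp.Body)
    if err != nil {
        return "", err
    }
    defer zr.Close()

    data, err := io.ReadAll(zr)
    if err != nil {
        return "", err
    }
    return string(data), nil
}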
I have some Delphi code that connects to a servlet and I'm trying to switch from TIdTCPClient to TIdHTTP.
I connect to the servlet this way
try
  lHTTP := TIdHTTP.Create( nil );
  responseStream := TMemoryStream.Create;
  lHTTP.Get(HttpMsg, responseStream);
  SetString( html, PAnsiChar(responseStream.Memory), responseStream.Size);
  AnotarMensaje( odDepurar, 'IMPFIS: Impresora fiscal reservada ' + html );
Where HttpMsg is localhost:6080/QRSRPServer/PedirImpresion?usuarioDMS=hector
All I'm getting is
GET localhost:6080/QRSRPServer/PedirImpresion?usuarioDMS=hector HTTP/1.1
Content-Type: text/html
Accept: text/html, */*
User-Agent: Mozilla/3.0 (compatible; Indy Library)
HTTP/1.1 400 Bad Request
The HTTP dialog that I had before was like this
GET /QRSRPServer/PedirImpresion?usuarioDMS=hector HTTP/1.1
Host: localhost:6080
HTTP/1.1 200 OK
So, I try to add the Host header, with this host: localhost:6080
try
  lHTTP := TIdHTTP.Create( nil );
  lHTTP.Host := Host;
  responseStream := TMemoryStream.Create;
  lHTTP.Get(HttpMsg, responseStream);
  SetString( html, PAnsiChar(responseStream.Memory), responseStream.Size);
  AnotarMensaje( odDepurar, 'IMPFIS: Impresora fiscal reservada ' + html );
And I get
Socket Error # 11004
Where HttpMsg is localhost:6080/QRSRPServer/PedirImpresion?usuarioDMS=hector
HttpMsg must begin with http:// or https://:
http://localhost:6080/QRSRPServer/PedirImpresion?usuarioDMS=hector
You should be getting an EIdUnknownProtocol exception raised when TIdHTTP parses the URL and sees the missing protocol scheme.
TIdHTTP should always be sending a Host header, especially for an HTTP 1.1 request, but you claim it is not. This is why you are getting a Bad Request error: HTTP 1.1 servers are required to reject an HTTP 1.1 request that omits that header.
You also claim that TIdHTTP is including the host and port values in the GET line. The ONLY time it ever does that is when connecting to a host through an HTTP proxy, but I don't see you configuring the TIdHTTP.ProxyParams property at all.
In short, TIdHTTP should not be behaving the way you claim.
The correct solution is to make sure you are passing a full URL to TIdHTTP.Get().
On a side note, your code requires html to be an AnsiString. You should change it to a standard string (which is AnsiString in D2007 and earlier) and let TIdHTTP return a string for you, then you don't need the TMemoryStream anymore:
html := lHTTP.Get(HttpMsg);
It was easier than I thought. I was assuming that having a "host" parameter that included the port would be enough, but looking at a Wireshark capture I saw it was sending everything over the standard HTTP port.
So this did the trick
try
  lHTTP := TIdHTTP.Create( nil );
  lHTTP.Host := GatewayIp;
  lHTTP.Port := GatewayPuerto;
  responseStream := TMemoryStream.Create;
  lHTTP.Request.CustomHeaders.Clear;
  lHTTP.Request.CustomHeaders.Add('Host: ' + Host );
  lHTTP.Get(HttpMsg, responseStream);
  SetString( html, PAnsiChar(responseStream.Memory), responseStream.Size);
  AnotarMensaje( odDepurar, 'IMPFIS: Impresora fiscal reservada ' + html );
I'm trying to do a POST request using an access_token, and it works fine using POSTMAN, but when I try to do the same request in Delphi, I can't find a way to add the "Authorization=Bearer eyxxxxxx..." part to the request header, as POSTMAN does.
POSTMAN Request (working well):
POST /somepath HTTP/1.1
Host: someurl.com.br
Authorization: Bearer eyJhbGciOiJSUzI1NiJ9.....
Content-Type: application/json
(body content omitted)
Indy request generated by Delphi, captured by HTTP Analyzer (always returning a 401 error, because of the absence of the "Authorization=Bearer" part):
POST /somepath HTTP/1.1
Host: someurl.com.br
Content-Type: application/json
(body content omitted)
I've tried to add the header using the code below, but the "Authorization=Bearer eyxxxxxx..." part isn't generated on the request, and the 401 error is returned.
FIdHTTP.Request.CustomHeaders.FoldLines := False;
FIdHTTP.Request.CustomHeaders.Add('Authorization=Bearer ' + txtToken.Text);
Just found the problem. I added the wrong separator between the "Authorization" and "Bearer" words.
Wrong:
FIdHTTP.Request.CustomHeaders.FoldLines := False;
FIdHTTP.Request.CustomHeaders.Add('Authorization=Bearer ' + txtToken.Text);
Correct:
FIdHTTP.Request.CustomHeaders.FoldLines := False;
FIdHTTP.Request.CustomHeaders.Add('Authorization:Bearer ' + txtToken.Text);
After replacing the '=' by ':', I received the expected response, like the one received by POSTMAN.
I'm trying to send a POST request for which I need to modify the header.
Here is my code:
import (
    "fmt"
    "net/http"
    "net/url"
    "strings"
)

const API_URL = "https://api.site.com/api/"

func SendOne(str string) {
    v := url.Values{}
    v.Add("source", "12345678")
    v.Add("text", str)

    client := &http.Client{}
    req, err := http.NewRequest("POST", API_URL, strings.NewReader(v.Encode()))
    if err != nil {
        fmt.Println(err)
    }
    req.Header.Add("Authorization", "123456")

    res, err := client.Do(req)
    if err != nil {
        fmt.Println(err)
    }
    defer res.Body.Close()
}
I have no idea why the code doesn't work. Any clue?
Thanks in advance.
Edit: I forgot to say I was using OAuth 2.0 for authorization.
Using tcpdump, we can see that the request headers and body for the code you pasted look like this:
POST / HTTP/1.1
Host: example.com
User-Agent: Go 1.1 package http
Content-Length: 45
Authorization: 123456
Accept-Encoding: gzip
source=12345678&text=http%3A%2F%2Fexample.com
You mention in the comment above that if you add a Content-Type header it works. Repeating the process and dumping the communication between the two peers, we get:
POST / HTTP/1.1
Host: example.com
User-Agent: Go 1.1 package http
Content-Length: 45
Authorization: 123456
Content-Type: application/x-www-form-urlencoded
Accept-Encoding: gzip
source=12345678&text=http%3A%2F%2Fexample.com
This is exactly the same as the prior payload, except that it now includes the provided Content-Type header. So, in terms of the behavior within the Go application itself, there is nothing special happening other than what you explicitly told it to do.
The reason it works when you add the Content-Type header, then, must be that the server you're talking to wants to know how the request body you're providing is encoded.
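In practice that means declaring the body encoding yourself. Here is a sketch of the question's SendOne with that single extra header (url.Values.Encode() produces form-encoded data, hence application/x-www-form-urlencoded):

func SendOne(str string) {
    v := url.Values{}
    v.Add("source", "12345678")
    v.Add("text", str)

    client := &http.Client{}
    req, err := http.NewRequest("POST", API_URL, strings.NewReader(v.Encode()))
    if err != nil {
        fmt.Println(err)
        return
    }
    req.Header.Add("Authorization", "123456")
    // Tell the server how the request body is encoded; without this it
    // has to guess, and this particular API rejects the request.
    req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

    res, err := client.Do(req)
    if err != nil {
        fmt.Println(err)
        return
    }
    defer res.Body.Close()
}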