POST command missing BODY parameters in SSL (https:), but not in http: - asp.net

I'm working on some code that I inherited from a non-responsive initial developer. My ASP.NET and web.config are at best "dated", so I'm turning to the community for some help. One of the first things I had to change was to force this website to operate over SSL (https:), as it deals with sensitive data. The program immediately stopped working, and I had to make some undesirable changes to code that "already worked". It still seems broken, and the changes won't make the client happy.
This is an ASP.NET project that seems hand-rolled.
The client sends a POST command with some body text that (I think) is JSON, setting additional parameters on the POST, such as: "indexID=8379fcd1-5083-4d1c-a6ee-5812f134a505".
As far as I can tell, this works as intended for non-SSL (i.e. http:) requests. However, when running over SSL (i.e. https:), it appears that the body (the JSON text) isn't getting decoded into the HttpContext.Current.Request parameters (which does seem to happen over http:).
However, the post_data that I read from the input stream does contain the JSON body text (as clear text?) with the parameters, and my 'fix' adds those to the incoming HttpContext.Current.Request parameters as a combined dictionary.
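For illustration, here is a minimal sketch of what that 'fix' amounts to (Python rather than the actual ASP.NET handler, which isn't shown; the values are trimmed from the capture below). Note that, per the Content-Type header in the capture, the body is application/x-www-form-urlencoded rather than JSON:

from urllib.parse import parse_qs, urlsplit

# Query-string parameters from the request URL.
url = "https://vmdev-xpp/BuilderQC/Services/Data.svc/QueryGridResults?typename=ImportReadyForDownload"
params = parse_qs(urlsplit(url).query)

# Form-urlencoded POST body read from the input stream (trimmed).
post_data = "_search=true&rows=40&page=1&indexID=8379fcd1-5083-4d1c-a6ee-5812f134a505&entity=false"

# The 'fix': merge the decoded body into one combined dictionary.
params.update(parse_qs(post_data))
print(params["indexID"])  # ['8379fcd1-5083-4d1c-a6ee-5812f134a505']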
Here is the raw command intercepted with Fiddler:
POST https://vmdev-xpp/BuilderQC/Services/Data.svc/QueryGridResults?typename=ImportReadyForDownload HTTP/1.1
Host: vmdev-xpp
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:92.0) Gecko/20100101 Firefox/92.0
Accept: application/json, text/javascript, */*; q=0.01
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Content-Type: application/x-www-form-urlencoded
X-Requested-With: XMLHttpRequest
Content-Length: 378
Origin: https://vmdev-xpp
Connection: keep-alive
Referer: https://vmdev-xpp/BuilderQC/BREDFileManagement.aspx
Cookie: ASP.NET_SessionId=ehi2ccsdmekvkgegmfh11n1v; .ASPXLMPTest=2226D5725FBC10FBCCD606108CE5A4E32990EEA8FF8A1864496F69874F116D7E1ABF48A8BFD05EE683FE3F456D4475E88A61B19B299CB557209129BD25E87AC38CECA5303C7E2035E64C1F5A4AD2605D8581181A9C7E48680371F83BC7A93D7A63D8748EA4761A608F424578C20127D01DE0E2FBFBD5F079575E86FD506925D541026B7C8713FDEFE108BCEADBFC1DA0
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
_search=true&nd=1629052554858&rows=40&page=1&sidx=&sord=asc&QC_ProjectID=14&FileReadyForDownload=1&filters=%7B%22fields%22%3A%5B%7B%22field%22%3A%22QC_ProjectID%22%2C+%22op%22%3A+%22cn%22%2C+%22value%22%3A%2214%22%7D%2C%7B%22field%22%3A%22FileReadyForDownload%22%2C+%22op%22%3A+%22cn%22%2C+%22value%22%3A%221%22%7D%5D%7D&indexID=8379fcd1-5083-4d1c-a6ee-5812f134a505&entity=false
Here is the post_data I obtained from the input stream (I think this being in clear text is suspect):
Post_data = _search=true&nd=1629052767566&rows=40&page=1&sidx=&sord=asc&QC_ProjectID=14&FileReadyForDownload=1&filters=%7B%22fields%22%3A%5B%7B%22field%22%3A%22FileName%22%2C+%22op%22%3A+%22cn%22%2C+%22value%22%3A%22th%22%7D%2C%7B%22field%22%3A%22QC_ProjectID%22%2C+%22op%22%3A+%22cn%22%2C+%22value%22%3A%2214%22%7D%2C%7B%22field%22%3A%22FileReadyForDownload%22%2C+%22op%22%3A+%22cn%22%2C+%22value%22%3A%221%22%7D%5D%7D&indexID=8379fcd1-5083-4d1c-a6ee-5812f134a505&entity=false&FileName=th
Here are the incoming HttpContext.Current.Request.Params.AllKeys; notice that "indexID" is missing, among other parameters.

Related

Micronaut's netty server http request missing headers

I'm trying to integrate a custom authentication service with Micronaut Security. To do this I've implemented my own AuthenticationProvider, and that works fine for basic auth; however, I also need to handle authentication tokens passed in the request.
To do that I'm implementing my own AuthenticationFetcher, and in its fetchAuthentication method I try to get my custom authentication header and then authenticate the request.
@Override
public Publisher<Authentication> fetchAuthentication(HttpRequest<?> request) {
    if (request.getHeaders().get(authConfiguration.getTokenHeader()) != null) {
        // ... authenticate the request based on the token header ...
The issue I'm having is that Netty's request.getHeaders() doesn't return all the headers that are being sent to the web service (I confirmed the full set in my browser's developer console):
GET /service/all HTTP/1.1
Accept: application/json, text/plain, */*
Cookie: m=2258:Z3Vlc3Q6Z3Vlc3Q%253D
Accept-Encoding: gzip, deflate
Host: localhost:4200
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1 Safari/605.1.15
Accept-Language: pl-pl
Referer: http://localhost:4200/campaigns
Connection: keep-alive
X-Token: my.token.here
And here are my app settings
micronaut:
  server:
    netty:
      maxHeaderSize: 1024
      worker:
        threads: 4
      parent:
        threads: 4
      childOptions:
        autoRead: true
  application:
    name: appName
Any feedback appreciated.
It turned out to be caused by the CORS config:
micronaut:
  server:
    cors:
      enabled: true
After adding this, my request header was removed in the filter chain.
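If anyone hits the same thing, a possible workaround is sketched below. This assumes Micronaut's per-configuration CORS options; the "web" block name is arbitrary, and X-Token stands in for your custom header:

micronaut:
  server:
    cors:
      enabled: true
      configurations:
        web:
          allowedHeaders:
            - Content-Type
            - X-Token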

How to copy HTTP headers from Charles to Postman

I have a problem recreating the headers; everything seems identical, but it just doesn't work. I need those headers to access the Instagram API.
I tried using Charles to intercept traffic from a mobile device, and that works as expected, but I'm struggling to recreate the same headers.
URL is https://i.instagram.com/api/v1/feed/user/7499201770/reel_media/
The headers are:
:method: GET
:scheme: https
:path: /api/v1/feed/user/7499201770/reel_media/
:authority: i.instagram.com
content-type: application/json
authority: i.instagram.com
accept: */*
path: /api/v1/feed/user/7499201770/reel_media/
accept-language: en-IN;q=1.0
accept-encoding: gzip;q=1.0, compress;q=0.5
content-length: 2
user-agent: Instagram 10.29.0 (iPhone7,2; iPhone OS 9_3_3; en_US; en-US; scale=2.00; 750x1334) AppleWebKit/420+
referer: https://www.instagram.com/
x-ig-capabilities: 3w==
cookie: ds_user_id=6742557571; sessionid=IGSCf716eb61bf2a6d41f...
I tried to use Postman to recreate this request, but every time I get the same error, "Login required". How should I paste those headers? I can't figure it out.
It was the user-agent (Instagram 10.29.0 (iPhone7,2; iPhone OS 9_3_3; en_US; en-US; scale=2.00; 750x1334) AppleWebKit/420+) that I didn't copy.
With the user-agent it works, so the HTTP headers will look like this, in case someone is writing an Instagram story saver:
["Content-Type": "application/json",
"Accept-encoding": "gzip, deflate",
"User-agent": "Instagram 10.29.0 (iPhone7,2; iPhone OS 9_3_3; en_US; en-US; scale=2.00; 750x1334) AppleWebKit/420+",
"Cookie": "ds_user_id=67425...; sessionid=IGSCf716eb61b....]

HTTP Chunked transfer encoding

That's from Wikipedia:
For version 1.1 of the HTTP protocol, the chunked transfer mechanism is considered to be always and anyways acceptable, even if not listed in the TE (transfer encoding) request header field
That's what I get from clients (Mozilla, Opera):
GET http://www.google.com/ HTTP/1.1
Host: www.google.com
User-Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:45.0) Gecko/20100101 Firefox/45.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
Apparently there is no Transfer-Encoding field there, nor do I see any chunks (I've checked with a hex editor; there are no additional symbols).
I open the connection as follows (Python):
socket.socket(socket.AF_INET, socket.SOCK_STREAM)
Is some lower-level handler joining the chunks into a message? If so, how can I know where the HTTP message ends, so that I can stop reading the request and start handling it?
You should read the specification.
But simply, in this case: since it's a GET and there's no content, there's not going to be a Content-Length header. So you stop reading when you get the empty line with just a CR/LF.
Otherwise, you read past that blank line, and then read Content-Length bytes.
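As a minimal sketch of that rule (assuming a plain blocking socket and no chunked bodies, which matches the request shown above):

import socket

def read_http_request(conn: socket.socket) -> bytes:
    # Read until the blank line (CRLF CRLF) that ends the headers.
    data = b""
    while b"\r\n\r\n" not in data:
        chunk = conn.recv(4096)
        if not chunk:
            break
        data += chunk
    head, _, body = data.partition(b"\r\n\r\n")

    # Find Content-Length, if any (a plain GET usually has none).
    length = 0
    for line in head.split(b"\r\n")[1:]:  # skip the request line
        name, _, value = line.partition(b":")
        if name.strip().lower() == b"content-length":
            length = int(value.strip())

    # Read the remaining body bytes; the message is then complete.
    while len(body) < length:
        body += conn.recv(4096)
    return head + b"\r\n\r\n" + body[:length]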

MSXML client side XSLT does not send accept-language header

When using client-side XSLT in IE9, I noticed that IE sends different headers for requests that fetch the XSL, and for subsequent requests triggered via the document() method, than for the request for the original XML file. In particular, the Accept-Language header is missing completely.
The bootstrap XML looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="transform.xsl"?>
<root/>
and the XSLT like this
...
<body>
<xsl:apply-templates select="document('section.xml')"/>
</body>
...
What I notice is that both the XSLT and the section.xml file are loaded with an HTTP request that has no Accept-Language header.
The request headers to fetch the XML file look like this:
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-US,de-DE;q=0.5
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)
Accept-Encoding: gzip, deflate
Connection: Keep-Alive
whereas the other resources are loaded with
Accept: */*
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0)
Connection: Keep-Alive
Is this a feature or a bug? Other browsers, such as Firefox or Chrome, send identical headers.
A working example can be found on my test server.
This effect causes problems in a real-life project, because the XML files are generated dynamically and contain end-user-facing content that is negotiated based on the Accept-Language header. This fails because no such header is sent by the transformer.
Any insight or suggestions for workarounds are welcome!
Thanks!
Carsten
I vote for "bug", since it seems more logical to repeat the Accept-Language header for the dependent requests (though I'm not sure that behavior is specified anywhere). Could you carry the language preference information as a query parameter for the request fetching the XSL?
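For example, a rough sketch of that workaround (the lang parameter name is hypothetical, and the server would have to honor it in place of the missing header):

<?xml-stylesheet type="text/xsl" href="transform.xsl?lang=en-US"?>

and, inside the stylesheet, the same parameter on the dependent request:

<xsl:apply-templates select="document('section.xml?lang=en-US')"/>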

Why would a browser make two separate requests for the same file?

I'm debugging a program I wrote and noticed something strange. I set up an HTTP server on port 12345 that serves a simple OGG video file, and attempted to access it from Firefox.
Upon sniffing the network requests, I found these two requests were made:
GET /video.ogv HTTP/1.1
Host: 127.0.0.1:12345
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
GET /video.ogv HTTP/1.1
Host: 127.0.0.1:12345
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Range: bytes=8122368-
The video is almost 8 MB in size, so the fact that the second request specifies a range starting at byte 8122368, which is 7932 KB, suggests it is requesting the very end of the file for some reason. Anyone have ideas?
In order to support seeking and playing back regions of the media that aren't yet downloaded, Gecko uses HTTP 1.1 byte-range requests to retrieve the media from the seek target position. Because Ogg files don't contain their duration, the initial download connection is terminated; Gecko then seeks to the end of the Ogg file and reads a bit of data to extract the duration of the media. Info from here and here.
Some media formats have metadata at the end of the file, and this data is usually required to allow proper seeking of the video.
It's actually requesting everything from byte offset 8122368 (about 7.74 MB in, if I did my calcs correctly) to the end of the file.
It might be something in how the buffering for that file type is done.
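For anyone reproducing this, a small sketch of the second request (Python for illustration; the URL and offset are taken from the capture above):

import urllib.request

# Ask for everything from byte 8122368 to the end of the file, as Firefox did.
req = urllib.request.Request(
    "http://127.0.0.1:12345/video.ogv",
    headers={"Range": "bytes=8122368-"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)                    # 206 Partial Content if ranges are supported
    print(resp.headers["Content-Range"])  # e.g. bytes 8122368-<end>/<total>
    tail = resp.read()                    # the tail Gecko reads for the duration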
