HTTP Response Behaviour - ASP.NET

I have two very similar pieces of ASP.NET code that send a file in an HTTP response to the client. They should cause the browser to prompt to save the file. The first one works, the second one doesn't. The HTTP responses as seen in Fiddler are below.
Working:
HTTP/1.1 200 OK
Cache-Control: private
Content-Length: 228108
Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 4.0.30319
content-disposition: attachment; filename=Report.xlsx
Date: Wed, 05 Jan 2011 12:17:48 GMT
<binary data>
Not working:
HTTP/1.1 200 OK
Server: ASP.NET Development Server/10.0.0.0
Date: Wed, 05 Jan 2011 12:19:21 GMT
X-AspNet-Version: 4.0.30319
Content-Length: 228080
content-disposition: attachment; filename=report 2.xlsx
Cache-Control: private
Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
Connection: Close
<binary data>
When the first one is seen in Fiddler the browser correctly prompts to save the file. When the second one is seen in Fiddler, nothing observable happens in the browser. Same behaviour in both Chrome and Firefox.
Any idea why this is happening?
EDIT: ASP.NET code that produces the second response
Response.Buffer = false;
Response.ContentType = #"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
Response.AppendHeader("content-length", genstream.Length.ToString());
Response.AppendHeader("Content-Disposition", string.Format("attachment; filename={0}.xlsx", filename));
byte[] buffer = new byte[1024];
genstream.Position = 0;
int n;
while ((n = genstream.Read(buffer, 0, 1024) ) > 0)
{
Response.OutputStream.Write(buffer, 0, n);
}

The space in the filename parameter value might cause this. Try the quoted-string syntax instead:
Content-Disposition: attachment; filename="report 2.xlsx"
See also RFC 2183.
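Applied to the ASP.NET snippet in the question, that would mean quoting the value when the header is appended; a minimal sketch reusing the question's filename variable:
// Quote the filename so a space (or comma) cannot break up the header value
Response.AppendHeader("Content-Disposition", string.Format("attachment; filename=\"{0}.xlsx\"", filename));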

I'm actually a little surprised the first one is working - I thought browsers were pretty picky about case in HTTP headers.
Try changing the "content-disposition" header to "Content-Disposition".

Problem is here:
Connection: Close
A lot of browsers - especially for downloads - use 100-continue handling to read the headers and check the length of the content. The second response will not allow the browser to do that.
(UPDATE)
This is generated because of this line:
Response.Buffer = false;
Remove it and it should work like a charm!

Apparently it wasn't a problem with the response but a problem with the request. Both HTTP responses in the OP are valid, but the link on the page that produced the second one was inside an ASP.NET AJAX panel (UpdatePanel). I've been staring at this problem for too long and know too little about the HTTP protocol to pin down the exact cause, but the differences in the request headers were these fields:
Working (outside the AJAX panel):
Cache-Control: max-age=0
Content-Type: application/x-www-form-urlencoded
Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
Not working (inside the AJAX panel):
X-Requested-With: XMLHttpRequest
X-MicrosoftAjax: Delta=true
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Cache-Control: no-cache
Accept: */*
The problem is now gone after removing the link from the AJAX panel. "Connection: Close" is still in the (now working) response, so that obviously had nothing to do with the problem.
Thanks for the help!
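For completeness: if the link has to stay inside the UpdatePanel, a common workaround (not mentioned in the original post, so treat it as an assumption) is to register the control for a full postback so the file response bypasses the partial-rendering (XMLHttpRequest) pipeline. DownloadLink below is a hypothetical control ID:
protected void Page_Load(object sender, EventArgs e)
{
    // Force a full postback for this control even though it sits inside an
    // UpdatePanel, so the attachment response is not swallowed by the AJAX update.
    ScriptManager scriptManager = ScriptManager.GetCurrent(this);
    if (scriptManager != null)
    {
        scriptManager.RegisterPostBackControl(DownloadLink);
    }
}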

Related

Swagger UI Download PDF

I am using Swagger UI 2.1.3 for API documentation, and in the backend I am using spring-webmvc.
I have one API which returns a PDF file. It works fine if I type the URL in the browser (it prompts a download and the downloaded file works fine).
But the same API doesn't work in Swagger UI: it gives me a download link after clicking "try out", and that link downloads a file, but the file shows blank PDF pages (a corrupted PDF file).
The response headers are the following:
HTTP/1.1 200 OK
Server: Apache-Coyote/1.1
Cache-Control: no-cache, no-store, max-age=0, must-revalidate
Pragma: public
Expires: 0
Access-Control-Allow-Origin: *
Content-Description: File Transfer
Content-Transfer-Encoding: binary
Transfer-Encoding: chunked
Content-Disposition: attachment; filename="example.pdf"
Access-Control-Expose-Headers: Content-Description,Content-Disposition,location
Content-Type: application/pdf
Content-Length: 268288
Date: Mon, 04 Jan 2016 12:18:16 GMT
Any solution around this?
Additional info:
This question seems similar:
AngularJS: Display blob (.pdf) in an angular app
There they say to set responseType to arraybuffer on the XHR, but I think Swagger will take care of that (maybe I need to set some configuration?)

POST https://www.linkedin.com/uas/oauth2/accessToken HTTP/1.1 results in Method Not Allowed on LinkedIn

I am using the LinkedIn OWIN middleware and started running into issues this morning. I have now narrowed it down to the error below:
POST https://www.linkedin.com/uas/oauth2/accessToken HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: www.linkedin.com
Cookie: bscookie="v=1&201504071234373bc02b47-9d08-477f-8375-b80b281ef416AQEptFjv8jXPI93YmF-H-3kvnwSLwBF8"; bcookie="v=2&46f6f299-6702-48bf-8634-7ba023bd5099"; lidc="b=LB23:g=218:u=215:i=1428412320:t=1428487523:s=AQEQQq6vlEKPT3LW8c0cPEzRTKp-ToxL"
Content-Length: 267
Expect: 100-continue
Connection: Keep-Alive
grant_type=authorization_code&code=AQQRSgEH8vczSFJKNxtMpunzjYN6YJxoF2hiX_d9RVkqBvMC7TzRpur0p9NJFdQOUNf8RmFyj_cCg3ENTucRw5e-gQfEZ5sPGoujiFRsQ8Tb0pLnaog&redirect_uri=http%3A%2F%2Flocalhost%3A1729%2Fsignin-linkedin&client_id=&client_secret=
This results in Method Not Allowed:
HTTP/1.1 405 Method Not Allowed
Date: Tue, 07 Apr 2015 13:13:16 GMT
Content-Type: text/html
Content-Language: en
Content-Length: 5487
X-Li-Fabric: PROD-ELA4
Strict-Transport-Security: max-age=0
Set-Cookie: lidc="b=LB23:g=218:u=215:i=1428412396:t=1428487523:s=AQExeP2uX-7KXQv79NIZmW0LB09uE4eJ"; Expires=Wed, 08 Apr 2015 10:05:23 GMT; domain=.linkedin.com; Path=/
Pragma: no-cache
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Cache-Control: no-cache, no-store
Connection: keep-alive
X-Li-Pop: PROD-IDB2
X-LI-UUID: 0FM/jIG90hPAzyhAqCsAAA==
Looking for anyone to confirm that there was a change on LinkedIn causing this error and that it's not application specific.
Note that I removed the client id/secret from the request above.
I also spent most of the morning off and on trying to get this to work. Frustratingly, it worked fine using the Advanced REST Client Chrome tool. A combination of this and Fiddler showed that the only difference was the Expect: 100-continue flag in the request header. The only way I was able to set it to false was in the web.config section:
<system.net>
<settings>
<servicePointManager expect100Continue="false" />
</settings>
</system.net>
Hope this helps.
I ran into this issue this morning too (I'm using DotNetOpenAuth). It looks like this is related to the use of the following request header: Expect: 100-continue
After removing this request header, the HTTP/1.1 405 Method Not Allowed response no longer occurs. Obviously this isn't much help if you don't have access to the source code!
I'm assuming this is due to a change in LinkedIn as I only started experiencing problems this morning. I'm guessing they'll need to look into a fix for this.
I started having this issue today. After some research about Expect: 100-continue I found that putting
System.Net.ServicePointManager.Expect100Continue = false;
in my Application_Start() function inside Global.asax takes the 100-continue out of the request, and my login with LinkedIn is now working again.
Not a permanent fix, as I would like to know why it broke in the first place.
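For reference, a minimal Global.asax.cs sketch of where that line lives (class name and usings are illustrative, not from the original answer):
using System;
using System.Net;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Stop outgoing HttpWebRequests from sending "Expect: 100-continue",
        // which the LinkedIn accessToken endpoint now answers with 405.
        ServicePointManager.Expect100Continue = false;
    }
}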
I had the same issue and also use DotNetOpenAuth.
How I fixed it:
I removed the "Expect: 100-continue" request header.
In my case redirect_uri was encoded, so I removed the encoding for redirect_uri (for the request to https://www.linkedin.com/uas/oauth2/accessToken).
For those using OWIN middleware and Owin.Security.Providers:
A pre-release NuGet package was created with a fix.
https://www.nuget.org/packages/Owin.Security.Providers/1.17.0-pre
This works for now, but until we know what LinkedIn changed, or they come out with a statement about it, people can use this as a hotfix.
A little more background on the fix can be found at:
https://github.com/RockstarLabs/OwinOAuthProviders/issues/87#issuecomment-90838017
The root cause is that LinkedIn changed something on their accessToken endpoint, causing most of the libraries that use LinkedIn SSO to need a hotfix, but we haven't heard anything from LinkedIn yet.
Found a solution for curl, pretty simple:
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Expect:') );

Debugging Multiple Disposition Headers

I am developing a web application using Java and keep getting this error in Chrome on some particular pages:
net::ERR_RESPONSE_HEADERS_MULTIPLE_CONTENT_DISPOSITION
So I checked Wireshark for the corresponding TCP stream and this was the header of the response:
HTTP/1.0 200 OK
Date: Mon, 10 Sep 2012 08:48:49 GMT
Server: Apache-Coyote/1.1
Content-Disposition: attachment; filename=KBM 80 U (50/60Hz,220/230V)_72703400230.pdf
Content-Type: application/pdf
Content-Length: 564449
X-Cache: MISS from my-company-proxy.local
X-Cache-Lookup: MISS from my-company-proxy.local:8080
Via: 1.0 host-of-application.com, 1.1 my-company-proxy.local:8080 (squid/2.7.STABLE5)
Connection: keep-alive
Proxy-Connection: keep-alive
%PDF-1.4
[PDF data ...]
I only see one Content-Disposition header in there. Why does Chrome tell me there were several?
Because the filename parameter is unquoted, and contains a comma character (which is not allowed in unquoted values, and in this case indicates that multiple header values have been folded into a single one).
See http://greenbytes.de/tech/webdav/rfc2616.html#rfc.section.4.2.p.5 and http://greenbytes.de/tech/webdav/rfc6266.html
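Quoting the parameter value removes the ambiguity, so a corrected header would look like:
Content-Disposition: attachment; filename="KBM 80 U (50/60Hz,220/230V)_72703400230.pdf"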

ASP.NET HTTP handler text file download issue. Works fine for code-behind

Recently we have been encountering a strange issue with a file download HTTP handler developed using C# 4.0.
The web application is developed using ASP.NET 4.0 and hosted on IIS 7.0 over SSL. It worked correctly, but recently, due to some changes elsewhere in the config or website, we are facing the issue listed below.
When we download the text file it emits junk data. The same file works fine if I use code-behind on an aspx page instead of the handler. Both have the same code. Some of the files work fine, e.g. image or PDF files, but with text files the behavior is very inconsistent; a blank text file works fine. I tried comparing the two responses (handler vs code-behind) and it seems the returned Content-Length is not the same.
context.Response.Clear();
context.Response.ClearHeaders();
context.Response.ClearContent();
context.Response.ContentType = !String.IsNullOrEmpty(mime) ? mime : "application/octet-stream";
context.Response.AppendHeader("Content-Disposition", String.Format("attachment; filename={0}", fileName));
//context.Response.AppendHeader("Content-Length", buffer.Length.ToString());
context.Response.OutputStream.Write(buffer, 0, buffer.Length);
context.Response.End();
CODE BEHIND
HTTP/1.1 200 OK
Server: ASP.NET Development Server/10.0.0.0
Date: Thu, 06 Oct 2011 02:52:26 GMT
X-AspNet-Version: 4.0.30319
Content-Disposition: attachment; filename=my junk.txt
Cache-Control: private
Content-Type: text/plain
Content-Length: 29
Connection: Close
This is for sample test only
HTTPHANDLER
HTTP/1.1 200 OK
Server: ASP.NET Development Server/10.0.0.0
Date: Thu, 06 Oct 2011 02:54:04 GMT
X-AspNet-Version: 4.0.30319
Content-Disposition: attachment; filename=my junk.txt
Cache-Control: private
Content-Type: text/plain
Content-Length: 146
Connection: Close
��������I�%&/m�{J�J��t��$ؐ#������iG#)���eVe]f#�흼��{���{���;�N'���?\fdl��J�ɞ!���?~|?"�yѤ��N�l��͛6������ ������I���
Try using AddHeader() instead of AppendHeader(), and invoke Flush() before the context.Response.End() statement.
Another thing you may want to do is surround the file name with quotes. Chrome/Opera do not handle this code correctly when the file name has a comma in it and think there are duplicate response headers.
context.Response.AddHeader("Content-Disposition", string.Format("attachment; filename=\"{0}\"", fileName));
See here, here, and here for more information.

HTTP protocol Content-Length

I am working on a simple download application. While making a request for the following file, both Firefox and my application don't get the Content-Length field, but if I make the request using wget the server does send the Content-Length field. I changed wget's user agent string to test and it still got the Content-Length field.
Any ideas why this is happening?
wget request
---request begin---
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.0
User-Agent: test
Accept: */*
Host: media.defcon.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.0 200 OK
Server: lighttpd
Date: Sun, 05 Apr 2009 04:40:08 GMT
Last-Modified: Tue, 23 May 2006 22:18:19 GMT
Content-Type: video/mp4
Content-Length: 104223909
Connection: keep-alive
firefox request
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.1
Host: media.defcon.org
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.4; en-US; rv:1.9.0.8) Gecko/2009032608 Firefox/3.0.8
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Keep-Alive: 300
Connection: keep-alive
Referer: http://www.defcon.org/html/links/defcon-media-archives.html
Pragma: no-cache
Cache-Control: no-cache
HTTP/1.x 200 OK
Server: lighttpd
Date: Sun, 05 Apr 2009 05:20:12 GMT
Last-Modified: Tue, 23 May 2006 22:18:19 GMT
Content-Type: video/mp4
Transfer-Encoding: chunked
Update:
Is there a header that I can send that will tell lighttpd not to use chunked encoding? My original problem is that I am using URLConnection to grab the file in my Java application, which automatically sends an HTTP 1.1 request.
I would like to know the size of the file so I can update my percentage.
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.1
Firefox is performing an HTTP 1.1 GET request. lighttpd understands that the client will support chunked transfer encoding and returns the content in chunks, with each chunk reporting its own length.
GET /dc-13/video/2005_Defcon_V2-P_Zimmerman-Unveiling_My_Next_Big_Project.mp4 HTTP/1.0
Wget, on the other hand, performs an HTTP 1.0 GET request. lighttpd, understanding that the client doesn't support HTTP 1.1 (and thus chunked transfer encoding), returns the content in one piece, with the length reported in the response header.
Looks like it's because of the chunked transfer encoding:
Transfer-Encoding: chunked
This will send the video down in chunks, each with its own size. This is defined in HTTP 1.1, which is what Firefox is using, while wget is using HTTP 1.0, which doesn't support chunked transfer encoding, so the server has to send the whole file at once.
I was having the same problem and found a solution that works regardless of the HTTP version:
First, send a HEAD request to the server, which responds with just the HTTP headers and no content. These headers include the wanted Content-Length: the size in bytes of the file to download.
Then proceed with the GET request to download the file (the headers of the GET response fail to include Content-Length).
An Objective-C language example:
NSString *zipURL = @"http://1.bp.blogspot.com/_6-cw84gcURw/TRNb3PDWneI/AAAAAAAAAYM/YFCZP1foTiM/s1600/paragliding1.jpg";
NSURL *url = [NSURL URLWithString:zipURL];
// Configure the HTTP request for HEAD header fetch
NSMutableURLRequest *urlRequest = [NSMutableURLRequest requestWithURL:url];
urlRequest.HTTPMethod = #"HEAD"; // Default is "GET"
// Define response class
__autoreleasing NSHTTPURLResponse *response;
// Send HEAD request to server
NSData *contentsData = [NSURLConnection sendSynchronousRequest:urlRequest returningResponse:&response error:nil];
// Header response field
NSDictionary *headerDeserialized = response.allHeaderFields;
// The contents length
int contents_length = [(NSString*)headerDeserialized[@"Content-Length"] intValue];
//printf("HEAD Response header: %s\n",headerDeserialized.description.UTF8String);
printf("HEAD:\ncontentsData.length: %d\n",contentsData.length);
printf("contents_length = %d\n\n",contents_length);
urlRequest.HTTPMethod = #"GET";
// Send "GET" to download file
contentsData = [NSURLConnection sendSynchronousRequest:urlRequest returningResponse:&response error:nil];
// Header response field
headerDeserialized = response.allHeaderFields;
// The contents length
contents_length = [(NSString*)headerDeserialized[@"Content-Length"] intValue];
printf("GET Response header: %s\n",headerDeserialized.description.UTF8String);
printf("GET:\ncontentsData.length: %d\n",contentsData.length);
printf("contents_length = %d\n",contents_length);
return;
And the output:
HEAD:
contentsData.length: 0
contents_length = 146216
GET:
contentsData.length: 146216
contents_length = 146216
(Note: this example URL does correctly provide the Content-Length header in the GET response, but it shows the idea in case it didn't.)
