I am running an ASP.NET application on IIS 6 and have enabled compression, which works fine for the .aspx web pages.
What isn't working is binary files transmitted as part of a postback response: e.g. a 'download to Excel' link button. The user clicks, the code generates a binary file, calls Response.Clear(), and then writes the file to Response.OutputStream.
The browser receives a response, but it is always zero bytes. I assume, therefore, that the browser is expecting a compressed response, and since the raw binary isn't a valid compressed stream, it fails. I'm puzzled as to why this should happen if I've cleared the response: surely the response headers (which specify compression) are cleared as well?
So two questions arise:
1) how do I compress the binary file to send a compressed response?
2) how can I check at runtime to see if IIS compression is enabled?
Cheers
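For question 1, one option worth exploring is compressing the payload yourself so the body matches what a compressed-response header promises. A minimal sketch in Python (illustrative only, not ASP.NET; the payload is a placeholder for a generated Excel file) of why a raw binary body under a gzip Content-Encoding header fails:

```python
import gzip

# Stand-in for a generated binary file (hypothetical data).
payload = bytes(range(256)) * 64

# A body sent under "Content-Encoding: gzip" must be a valid gzip stream.
compressed = gzip.compress(payload)
assert gzip.decompress(compressed) == payload  # round-trips cleanly

# Raw binary sent verbatim under the same header is rejected by the
# decompressor, which can surface as a zero-byte download in the browser.
try:
    gzip.decompress(payload)
    valid = True
except OSError:
    valid = False
print(valid)  # False: raw binary is not a valid gzip stream
```

The point is that the header and the body have to agree: either clear the compression headers along with the response, or emit a genuinely compressed body.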
I would disable compression and check whether the download works, just to confirm that the problem is indeed caused by IIS compression. I suggest this because I'm running an IIS site with compression enabled that serves PDF (binary) files without any problem.
Anyway, here's the code that works for me:
Response.Clear();
Response.ClearHeaders();
Response.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
Response.AddHeader("Content-Length", fileInfo.Length.ToString());
Response.AddHeader("Content-transfer-encoding", "8bit");
Response.AddHeader("Cache-Control", "private");
Response.AddHeader("Pragma", "cache");
Response.AddHeader("Expires", "0");
Response.ContentType = "application/pdf";
Response.WriteFile(filePath);
Response.Flush();
Response.Close();
HttpContext.Current.ApplicationInstance.CompleteRequest();
Related
When downloading a file, the Content-Disposition header is encoded twice, which leads to unreadable file names for non-ASCII characters.
This double encoding happens only when the request is processed by the IIS URL Rewrite module 2.0 and routed to a server farm. In that situation, when the code offers the file as a download, the Content-Disposition response header is double-encoded. The browser decodes it once, which leaves an encoded file name in which non-ASCII characters are unreadable.
When the same request is processed without the URL Rewrite module and ARR, the Content-Disposition header is encoded only once. In that case the browser decodes the filename and non-ASCII characters are readable.
I've been looking into this for days now. I've only found one other question like this, and it has no answer.
For reference, here is the code used for the file download (ASP.NET application).
In this code, report is an Entity Framework object, and its FileName is an unencoded string. We don't encode the filename ourselves, so I'm not even sure what is responsible for the first encoding.
System.Web.HttpResponse response = System.Web.HttpContext.Current.Response;
response.ClearContent();
response.Clear();
response.ContentType = "application/octet-stream";
response.AddHeader("Content-Disposition",
"attachment; filename=" + report.FileName);
response.TransmitFile(filePath);
response.Flush();
response.End();
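The symptom can be sketched outside of ASP.NET. This Python snippet (illustrative only; the filename is a hypothetical example) shows why one browser-side decode of a double-encoded value still leaves an unreadable name:

```python
from urllib.parse import quote, unquote

# The filename is percent-encoded once by the application stack, then a
# proxy layer (standing in here for URL Rewrite + ARR) encodes the
# already-encoded value a second time.
filename = "rapport-é.pdf"   # hypothetical non-ASCII filename

once = quote(filename)       # single encoding pass: rapport-%C3%A9.pdf
twice = quote(once)          # second pass re-encodes the percent signs

# The browser decodes exactly once. Decoding the double-encoded value once
# only gets back to the *encoded* name, so non-ASCII stays unreadable.
print(unquote(once))   # readable original name
print(unquote(twice))  # still percent-encoded
```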
Here are two images showing the issue:
Example download request header with IIS rewrite and ARR:
Example download request header without IIS rewrite and ARR:
Any ideas on a solution? (Not using rewrite/arr/server farm is not an option)
EDIT:
Image of content-disposition found in failed tracelog.
It does look like ASP.NET automatically URL-encodes the filename, which makes it even more confusing.
Content-disposition-failed-trace-log
Additionally, here is the inbound rewrite rule that redirects traffic to a web farm. (This web farm is hosted in the same IIS instance on the same server.)
Inbound-rewrite-rule-web-farm
I create a PDF file programmatically and download it. Without the load balancer it works fine, but when the request is sent via the load balancer, the response is blocked because the server sends more data than the Content-Length. When the 'drop extra data' checkbox is checked on the load balancer, the application starts working and the PDF download succeeds.
My question is: what exactly might be causing the server to send more than Content-Length bytes of data? My download code is as follows.
Response.Clear()
Response.ClearHeaders()
Response.ContentType = "application/pdf"
Response.AddHeader("Content-Length", CStr(oOutputStream.Length))
Response.AddHeader("Content-Disposition", "attachment; Filename=abc.pdf")
Dim statement(CInt(oOutputStream.Length)) As Byte
oOutputStream.Position = 0
oOutputStream.Read(statement, 0, CType(oOutputStream.Length, Integer))
Response.BinaryWrite(statement)
Response.Flush()
Response.End()
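One detail worth checking in the VB code above (an observation, not a confirmed diagnosis): `Dim statement(CInt(oOutputStream.Length)) As Byte` declares an array with upper bound Length, which in VB.NET means Length + 1 elements, so BinaryWrite would send one byte more than the advertised Content-Length. A sketch of that mismatch in Python, with a hypothetical 10-byte body standing in for oOutputStream:

```python
import io

# Hypothetical stand-in for oOutputStream: a 10-byte generated PDF body.
stream = io.BytesIO(b"%PDF-stub\n")
length = len(stream.getvalue())    # the value written into Content-Length

# Equivalent of VB's `Dim statement(length) As Byte`: bounds 0..length,
# i.e. length + 1 elements -- one more than Content-Length promises.
buffer = bytearray(length + 1)
stream.seek(0)
stream.readinto(buffer)            # fills only the first `length` bytes

# Writing the whole array sends len(buffer) bytes, exceeding Content-Length.
print(length, len(buffer))
```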
What is the default caching behavior of IE 8 in the absence of caching headers? If no headers have been set on the server side, how will it cache the response, or will it follow no-cache by default? The connection is secured, i.e. HTTPS.
I am facing a problem where I write a byte stream into the response, and the headers I set are:
response.setContentType("application/pdf;charset=utf-8;");
response.setHeader("Content-Length", server.getContentLength() + "");
response.setHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
I know that if a user tries to download a file over an HTTPS connection, any response headers that prevent caching will cause the file download to fail in Internet Explorer, but in my case no caching headers have been specified explicitly. Even so, IE 8 says:
Unable to download.
Internet Explorer was unable to open this site. The requested site is either unavailable or cannot be found. Please try again later.
I don't know the exact reason behind it, so I can only make guesses about the default caching mechanism of IE 8. It works fine in IE 9+ and in other browsers.
I found the exact reason for this: the caching headers were being set by our own code, which applies them to every response for a particular request. Now that I know these headers are set by us, in the code that provides the export functionality I simply overrode them so the download works in IE 8:
response.setHeader("Cache-Control", ""); // HTTP 1.1
response.setHeader("Pragma", ""); // HTTP 1.0
Our web application (ASP.NET Web Forms) has a page that will display a recently generated PDF file to users. Because the PDF file is sometimes quite large, we've implemented a "streaming" approach to send it down to the client browser in chunks.
Despite sending the data down in chunks, we know the full size of the file before sending it, so we set the Content-Length header appropriately. This had been working in our production environment for a while (and continues to work in our test environment with a virtually identical configuration) until today. The issue reported was that Chrome would attempt to open the PDF file but would hang with the "Loading" animation stuck.
Because everything was still working fine in our test environment I was able to use Firebug to take a look at the response headers that were coming back in both environments. In the test environment, I was seeing a proper 'Content-Length' header, while in production that had been replaced with a Transfer-Encoding: chunked header. Chrome doesn't like this, hence the hang-up.
I've read some articles and posts talking about how the Transfer-Encoding header can show up when no Content-Length header is provided, but we are specifying the Content-Length header and everything still appears to work while running the same code for the same PDF file on a test server.
Both test and production servers are running IIS 7.5 and both have Dynamic and Static Compression enabled.
Here is the code in question:
var fileInfo = new FileInfo(fileToSendDown);
Response.ClearHeaders();
Response.ContentType = "application/pdf";
Response.AddHeader("Content-Disposition", "filename=test.pdf");
Response.AddHeader("Content-Length", fileInfo.Length.ToString());
var buffer = new byte[1024];
using (var fs = File.Open(fileToSendDown, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    int read;
    while ((read = fs.Read(buffer, 0, 1024)) > 0)
    {
        if (!Response.IsClientConnected) break;
        Response.OutputStream.Write(buffer, 0, read);
        Response.Flush();
    }
}
I was fortunate to see the same behavior on my local workstation so using the debugger I have been able to see that the 'Transfer-Encoding: chunked' header is being set on the 2nd pass through the while loop during the call to 'Flush'. At that point, the response has both a Content-Length header and Transfer-Encoding header, but somehow by the time the response reaches the browser Firebug is only showing the Transfer-Encoding header.
UPDATE
I think I've tracked this down to using a combination of sending the data down in "chunks" AND attaching a 'Filter' to the HttpResponse object (we were using a filter to track the size of viewstate being sent down to each page). There's no sense in us using an HTTP filter when sending a PDF down to the browser, so clearing the filter here has resolved our issue. I decided to dig in a little deeper purely out of curiosity and have updated this question should anyone else ever stumble onto this problem in the future.
I've got a simple app up on AppHarbor that reproduces the issue: http://transferencodingtest.apphb.com/. If you check both the 'Use Filter?' and 'Send In Chunks?' boxes you should be able to see the 'transfer-encoding: chunked' header show up (using Chrome dev tools, Firebug, Fiddler, whatever). If either of the boxes are not checked, you'll get a proper content-length header. The underlying code is up on github so you can see what's going on behind the scenes:
https://github.com/appakz/TransferEncodingTest
Note that to repro locally you'd need to set up a local website in IIS 7.5 (IIS 7 may also work; I haven't tried). The ASP.NET development server that ships with Visual Studio DOES NOT repro the issue.
I've added some more details to a blog post here: 'Content-Length' Header Replaced With 'Transfer-Encoding: Chunked' in ASP .NET
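As background for why the two headers can't coexist: a chunked body carries its own per-chunk size framing, so the total length need not be known up front, and the server drops Content-Length when it decides to chunk. A minimal sketch of HTTP/1.1 chunked framing in Python (illustrative only):

```python
def chunk_encode(parts):
    """Frame an iterable of byte chunks as an HTTP/1.1 chunked body."""
    out = bytearray()
    for part in parts:
        # Each chunk: hex size line, CRLF, data, CRLF.
        out += b"%x\r\n" % len(part) + part + b"\r\n"
    out += b"0\r\n\r\n"  # terminating zero-length chunk
    return bytes(out)

body = chunk_encode([b"Hello, ", b"world!"])
print(body)  # b'7\r\nHello, \r\n6\r\nworld!\r\n0\r\n\r\n'
```

Because each chunk is self-describing, the response can start streaming before the total size is known, which is what happens when a flush or response filter forces the server into chunked mode.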
From an article on MSDN it seems that you can disable chunked encoding:
appcmd set config /section:asp /enableChunkedEncoding:False
But it's mentioned under ASP settings, so it may not apply to a response generated from an ASP.NET handler.
Once Response.Flush() has been called, the response body is in the process of being sent to the client, so no additional headers can be added to the response. I find it very unlikely that a second call to Response.Flush() is adding the Transfer-Encoding header at that point.
You say you have compression enabled. That almost always requires a chunked response, so it would make sense that if the server knows the Content-Length prior to compression, it might replace that header with a Transfer-Encoding header and chunk the response. However, even with compression enabled on the server, the client has to explicitly state support for compression in its Accept-Encoding request header, or else the server cannot compress the response. Did you check for that in your tests?
On a final note, since you are calling Response.Flush() manually, try setting Response.Buffer = True and Response.BufferOutput = False. Apparently they have conflicting effects on how Response.Flush() operates. See the comment at the bottom of this page and this page.
I had a similar problem when writing a large CSV (the file didn't exist; I wrote it line by line by iterating through an in-memory collection and generating each line), calling Response.Write on the response stream with BufferOutput set to false. The solution was to change
Response.ContentType = "text/csv" to Response.ContentType = "application/octet-stream"
When the content type wasn't set to application/octet-stream, a number of other response headers were added, such as Content-Encoding: gzip.
In my server-side code I'm creating the following response:
var response = HttpContext.Current.Response;
response.Clear();
response.AddHeader("Content-Type", content.Type);
response.AddHeader("Content-Length", content.Length.ToString());
response.AddHeader("Content-Disposition",
string.Format("attachment; filename={0}; size={1}", Server.UrlEncode(content.FileName), content.Length.ToString()));
response.Flush();
response.BinaryWrite(content.Image);
response.Flush();
Is it possible something in this response code is causing the 403, and if so, what? If I just run a blank .aspx page I don't get a 403; it only occurs when the request goes through the above code.
The issue is unlikely to be a MIME type issue (so probably not related to the content type being text/plain), though that's an issue in itself.
Chances are that permissions are not set up correctly, or that .NET is not registered on the server properly. Try running aspnet_regiis on the server again.