I've got a chunk of ASP.Net 2.0 code in the Page Load handler that looks basically like this:
Response.Clear();
Response.ContentType="application/pdf";
Response.TransmitFile("foo.pdf");
Response.End();
It works fine with all browsers when running through either IIS or Cassini. But when I try to run it through Apache using mod_aspdotnet.so (which I really need to support, and which generally has no weirdness), I get a variety of bad behavior. With Chrome, Firefox, and IE, I get an "OK 200" page saying, "The server encountered an internal error or misconfiguration and was unable to complete your request." With Safari, it winds up reloading the page.
I've tried it with other file types, different ContentType values, WriteFile instead of TransmitFile, using AddHeader to supply Content-Length and Content-Disposition, and toggling BufferOutput. In short, I'm running out of ideas on how to even go about figuring out what's wrong. Any ideas appreciated.
kd
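Update: for reference, here is a sketch of one of the header variants I tried (the exact values are illustrative, and none of this fixed the Apache case):
Response.Clear();
Response.ContentType = "application/pdf";
Response.AddHeader("Content-Disposition", "attachment; filename=\"foo.pdf\"");
Response.AddHeader("Content-Length", new System.IO.FileInfo(Server.MapPath("foo.pdf")).Length.ToString());
Response.WriteFile("foo.pdf");
Response.End();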
I finally got this to work. I don't expect many (any?) other people to be in this boat, but if you are, here's what works:
Response.Clear();
Response.ContentType="application/pdf";
FileStream f = new FileStream(targetFile, FileMode.Open); // requires System.IO
byte[] b = new byte[(int)f.Length];
f.Read(b, 0, (int)f.Length);
f.Close();
Response.BinaryWrite(b);
Response.Flush();
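For what it's worth, on .NET 2.0 and later the same read-everything-then-BinaryWrite approach can be written more compactly with File.ReadAllBytes (a sketch; I have only verified the FileStream version above against mod_aspdotnet):
Response.Clear();
Response.ContentType = "application/pdf";
Response.BinaryWrite(System.IO.File.ReadAllBytes(targetFile));
Response.Flush();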
Using .NET 4.0 on IIS 7.5 on Windows 2008 R2.
I would like to output binary content representing various types of documents (images, PDF, Office files, etc.). Assuming the entire content is already in a MemoryStream, I would like to output it with:
Response.Clear();
Response.AddHeader("Content-Disposition", string.Format("attachment; filename={0}", fileNameSaveAs));
Response.AddHeader("Content-Length", memoryStr.Length.ToString());
Response.ContentType = "application/octet-stream";
Response.OutputStream.Write(memoryStr.ToArray(), 0, (int) memoryStr.Length);
Response.Flush();
The code above is not reliable. Files are often corrupted: clients using various browsers sometimes have an aborted download and sometimes download a file that is unreadable. The likelihood of corruption increases with the file size. Using Fiddler, we found out that the response header reported a content length different from the original file size. So for a quick test we commented out the line Response.AddHeader("Content-Length" ...) and the corruption issue disappeared.
Q1: Is this issue caused by the Dynamic Compression (enabled on IIS7 by default)?
Q2: If answer to Q1 is yes, then is there any elegant solution to inform the client about the Content-Length?
Q3: Removing the "Content-Length" header seems to affect the ability of the client to save file as. Example: "Content-Disposition", is initalized with fileNameSaveAs = "One Two Three.pdf". Using Firefox, when receiving the file, the download dialog defaulted to "One" as filename. Is it a normal consequence?
Thanks in advance for any help.
I ran more tests and clarified a few things, but the result is still not technically satisfactory.
A1. IIS 7.5 Dynamic Compression is not the cause. Download corruption still occurred whether Dynamic Compression, Static Compression, or both were disabled. As soon as the Response.AddHeader("Content-Length" ...) line is commented out, all download issues disappear.
A2. No idea! I really would like to know.
A3. I initially chalked this up to a Firefox bug, but it appears to be the unquoted filename: fileNameSaveAs contains spaces, and without quotes around the filename value Firefox stops at the first space. This has nothing to do with the "Content-Length" header.
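For Q2, one idea worth trying (a sketch only, assuming the IIS7+ integrated pipeline, where request headers are writable; I have not verified it in production): strip the client's Accept-Encoding header at the start of the request so IIS never compresses this particular response, which keeps the declared Content-Length in agreement with the bytes actually sent.
// Keep IIS from gzipping this one response so that the
// Content-Length we declare matches the bytes on the wire.
// Assumes the IIS7+ integrated pipeline; classic mode throws here.
Request.Headers.Remove("Accept-Encoding");
Response.Clear();
Response.AddHeader("Content-Disposition",
    string.Format("attachment; filename=\"{0}\"", fileNameSaveAs));
Response.AddHeader("Content-Length", memoryStr.Length.ToString());
Response.ContentType = "application/octet-stream";
Response.OutputStream.Write(memoryStr.ToArray(), 0, (int)memoryStr.Length);
Response.Flush();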
I'm running ASP.NET on an IIS6 server. Right now the server is set up to compress dynamically generated content, mainly to reduce the page size of ASPX files that are being retrieved.
One of the ASPX files has the following bit of code, used to fetch a file from the database and send it to the user:
Response.Clear();
Response.Buffer = true;
Response.ContentType = Document.MimeType;
Response.AddHeader("content-disposition", "attachment;filename=\"" + Document.Filename + Document.Extension + "\"");
Response.AddHeader("content-length", Document.FileSizeBytes.ToString());
byte[] docBinary = Document.GetBinary();
Response.BinaryWrite(docBinary);
The download itself works perfectly. However, the person downloading the file doesn't get a progress bar, which is incredibly annoying.
From the research I've been doing, it seems that IIS switches the transfer-encoding to chunked when compressing dynamic content, and drops the content-length header, since sending both would violate the HTTP/1.1 standard.
What's the best way to get around this without turning dynamic compression off at the server level? Is there a way through ASP.NET to programmatically turn off compression for this response? Is there a better way to go about doing things?
You can turn on/off compression at the site or folder level by modifying the metabase. For more information see:
Enabling HTTP Compression (IIS 6.0)
Scroll down to: "To enable HTTP Compression for Individual Sites and Site Elements"
To do this you need elevated rights (Administrator at least).
You might have to place the download page in its own folder and turn off compression at that level so as not to affect other parts of the site.
I have to admit I've not tried this but it's what I'd attempt first.
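From memory, the per-folder commands look something like the following (the site ID and folder name are placeholders; verify the exact property names against the document linked above):
cscript.exe adsutil.vbs SET W3SVC/1/ROOT/Downloads/DoDynamicCompression False
cscript.exe adsutil.vbs SET W3SVC/1/ROOT/Downloads/DoStaticCompression False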
I get the following error when downloading an Excel file on an SSL site:
Internet Explorer cannot download xxxx.aspx from mysite.com.
Internet Explorer was not able to open this Internet site. The requested site is either unavailable or cannot be found. Please try again later.
After googling, I suspect that it's a problem with the response headers.
I tried the solution on this page and set the headers:
http://trac.edgewall.org/ticket/1020
HttpContext.Current.Response.AddHeader("Pragma", "no-cache");
HttpContext.Current.Response.CacheControl = "private";
But it doesn't work.
Any suggestions?
Take a look at this article. It's from the horse's mouth so to speak :) We actually just faced this same issue when we switched to a full SSL session.
I also faced the same issue.
When I googled it, I found that the "no-cache" settings in the response header, i.e. the following code, were the reason for the issue:
Response.AppendHeader("Pragma", "no-cache")
Response.AppendHeader("Cache-Control", "no-cache")
Response.AppendHeader("max-age", "0")
Some blogs say that to fix this issue you should modify the Windows registry on the web server and on all client machines (:O); it is not feasible to change registry settings on every client machine.
The root cause is the no-cache settings in the response header, so I just added
Response.ClearHeaders()
before writing the content to be downloaded to the response. The code is shown below:
Response.ClearHeaders()
Response.ContentType = corspBean.ContentType
Response.AppendHeader("content-disposition", "attachment; filename=""" + fileName + """")
Response.BinaryWrite(fileBytes)
Response.End()
It has fixed the issue.
Enjoy!!!
I had a similar problem with PDF files I wanted to stream. Even with Response.ClearHeaders() I saw Pragma and Cache-Control headers added at runtime. The solution was to clear the headers in IIS (Right-click -> Properties on the page loading the PDF, then "Http headers" tab).
I ran into that same problem using LocalReport when exporting to Excel.
Just before the
Response.BinaryWrite(bytes);
we added
Response.ClearHeaders();
Response.ClearContent();
Response.Buffer = true;
and it worked.
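Pieced together, our Excel export path looked roughly like this (a sketch; report setup and data sources omitted, and localReport stands for the Microsoft.Reporting.WebForms.LocalReport instance):
string mimeType, encoding, extension;
string[] streams;
Microsoft.Reporting.WebForms.Warning[] warnings;
// Render the report to a byte array in Excel format.
byte[] bytes = localReport.Render("Excel", null,
    out mimeType, out encoding, out extension, out streams, out warnings);
Response.ClearHeaders();
Response.ClearContent();
Response.Buffer = true;
Response.ContentType = mimeType;
Response.AddHeader("content-disposition", "attachment; filename=report." + extension);
Response.BinaryWrite(bytes);
Response.End();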
I am streaming a PDF to the browser in ASP.NET 2.0. This works in all browsers over HTTP and in all browsers except IE over HTTPS. As far as I know, this used to work (over the past 5 years or so) in all versions of IE, but our clients have only recently started to report issues. I suspect the 'Do not save encrypted pages to disk' security option used to be disabled by default and at some point became enabled by default (Internet Options -> Advanced -> Security). Turning this option off helps, as a work-around, but is not viable as a long-term solution.
The error message I am receiving is:
Internet Explorer cannot download OutputReport.aspx from www.sitename.com.
Internet Explorer was not able to open this Internet site. The requested site is either unavailable or cannot be found. Please try again later.
The tool used to create the PDF is ActiveReports from DataDynamics. Once the PDF is created, here is the code to send it down:
Response.ClearContent()
Response.ClearHeaders()
Response.AddHeader("cache-control", "max-age=1")
Response.ContentType = "application/pdf"
Response.AddHeader("content-disposition", "attachment; filename=statement.pdf")
Response.AddHeader("content-length", mem_stream.Length.ToString)
Response.BinaryWrite(mem_stream.ToArray())
Response.Flush()
Response.End()
Note: If I don't explicitly specify cache-control then .NET sends no-cache on my behalf, so I have tried setting cache-control to: private or public or maxage=#, but none of those seem to work.
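For completeness, the same values can also be emitted through the framework's cache-policy API instead of raw AddHeader calls (a C# sketch; the resulting Cache-Control header should be equivalent, so I would not expect different IE behavior):
// Produces "Cache-Control: private, max-age=1" via the cache policy API.
Response.Cache.SetCacheability(HttpCacheability.Private);
Response.Cache.SetMaxAge(TimeSpan.FromSeconds(1));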
Here is the twist: when I run Fiddler to inspect the response headers, everything works fine. The headers that I receive are:
HTTP/1.1 200 OK
Cache-Control: max-age=1
Date: Wed, 29 Jul 2009 17:57:58 GMT
Content-Type: application/pdf
Server: Microsoft-IIS/6.0
MicrosoftOfficeWebServer: 5.0_Pub
X-Powered-By: ASP.NET
X-AspNet-Version: 2.0.50727
content-disposition: attachment; filename=statement.pdf
Content-Encoding: gzip
Vary: Accept-Encoding
Transfer-Encoding: chunked
As soon as I turn Fiddler off and try again, it fails again. One other thing that I noticed is that when Fiddler is running I get a "There is a problem with this website's security certificate" warning message, and I have to click "Continue to this website (not recommended)" to get through. When Fiddler is off, I do not encounter this security warning and it fails right away.
I am curious what is happening between Fiddler and the browser so that it works when Fiddler is running but breaks when it's not, but more importantly, does anyone have any ideas how I could change my code so streaming PDFs to IE will work without making changes to the client machine?
Update: The Fiddler issues are resolved, thank you very much EricLaw, so now it behaves consistently (broken, with or without Fiddler running).
Based on Google searching, there seem to be plenty of reports of this same issue all over the web, each with its own specific combination of response headers that seems to fix the problem for its individual case. I've tried many of these suggestions, including adding an ETag, a LastModified date, removing the Vary header (using Fiddler), and dozens of combinations of the Cache-Control and/or Pragma headers. I tried "Content-Transfer-Encoding: binary" as well as "application/force-download" for the ContentType. Nothing has helped so far. There are a few Microsoft KB articles, all of which indicate that Cache-Control: no-cache is the culprit. Any other ideas?
Update: By the way, for completeness, this same issue occurs with Excel and Word outputs as well.
Update: No progress has been made. I emailed the .SAZ file from Fiddler to EricLaw and he was able to reproduce the problem when debugging IE, but there are no solutions yet. Bounty is going to expire...
Your Cache-Control header is incorrect. It should be Cache-Control: max-age=1 with the dash in the middle. Try fixing that first to see if it makes a difference.
Typically, I would say that the most likely culprit is your Vary header, as such headers often cause problems with caching in IE: http://blogs.msdn.com/ieinternals/archive/2009/06/17/9769915.aspx. You might want to try adding an ETag to the response headers.
Fiddler should have no impact on cacheability (unless you've written rules), and it sounds like you're saying that it does, which suggests that perhaps there's a timing problem of some sort.
>Do not save encrypted pages to disk security option used to be disabled by default
This option is still disabled by default (in IE6, 7, and 8), although IT Administrators can turn it on via Group Policy, and some major companies do so.
Incidentally, the reason you see the certificate error while running Fiddler is that you haven't elected to trust the Fiddler root certificate; see http://www.fiddler2.com/fiddler/help/httpsdecryption.asp for more on this topic.
After two weeks on a wild goose chase, I have not been able to find any combination of code changes that will allow this method of streaming PDF, Excel or Word documents when the 'Do not save encrypted pages to disk' option is turned on.
Microsoft has said this behavior is by design in a number of KB articles and private emails. It appears that when the 'Do not save encrypted pages to disk' option is turned on that IE is behaving correctly and doing what it is told to do. This post is the best resource I have found so far that explains why this setting would be enabled and the Pros and Cons of enabling it:
"The 'Do not save encrypted pages to disk' comes into play when dealing with SSL (HTTPS) connections. Just like a web server can send done information about how to cache a file one can basically set Internet Explorer up to not save files to the cache during an SSL (HTTPS) connection regardless if the web server advises you can.
What is the upside for turning this feature on, security is the number one reason why the feature is turned on. Pages are not stored in the Temporary Internet Files cache.
What is the downside? Slow performance, since nothing is saved to the cache even that 1 byte gif image used a dozen times on the page must be fetched from the webserver each time. To make matters worse some user actions may fail such as downloaded files will be deleted and an error presented or opening PDF documents will fail to name a few scenarios."
The best solution we can find at this point is to communicate to our clients and users that alternatives exist to using this setting:
"Use 'Empty Temporary Internet Files folder when browser is closed'. Each time the browser closes all files will be purged from the cache assuming there is not a lock on a file from another instance of the browser or some external application.
A lot of consideration needs to be given before utilizing 'Do not save encrypted pages to disk'. Sounds like a great security feature and it is but the results of using this feature may cause your help desk calls to go up for download failures or slow performance."
I found that this seemed to work for me:
Dim browser As System.Web.HttpBrowserCapabilities = Request.Browser
If (browser.Browser = "IE") Then
    Response.AppendHeader("cache-control", "private") ' IE only
Else
    Response.AppendHeader("cache-control", "no-cache") ' all others (FF/Chrome tested)
End If
SOLVED: This is an IE problem, not one in the application...
Fix it with this: http://support.microsoft.com/kb/323308
It worked perfectly for me, after trying for a long time.
ATT: Mr.Dark
We faced a similar problem a long time back (this is Java EE). In the web application config we added:
<mime-mapping>
<extension>PDF</extension>
<mime-type>application/octet-stream</mime-type>
</mime-mapping>
This will make any PDF coming from your web application be downloaded instead of the browser trying to render it.
EDIT: it looks like you are streaming it. In that case you will set the mime-type to application/octet-stream in your code, not in the config. So here, instead of
Response.ContentType = "application/pdf"
you will use
Response.ContentType = "application/octet-stream"
What version of IE? I recall that Microsoft released a Hotfix for IE6 for this issue. Hope that is of some use?
I read about your Cache-Control goose chase, but I'll share what met my needs, in case it helps:
try disabling gzip compression.
Adding it here hoping that someone might find this useful instead of going through the links.
Here is my code
byte[] bytes = // get byte array from DB
Response.Clear();
Response.ClearContent();
Response.ClearHeaders();
Response.Buffer = true;
// Prevent this page from being cached.
// NOTE: we cannot use the CacheControl property, or set the PRAGMA header value due to a flaw re: PDF/SSL/IE
Response.Expires = -1;
Response.ContentType = "application/pdf";
// Specify the number of bytes to be sent
Response.AppendHeader("content-length", bytes.Length.ToString());
Response.BinaryWrite(bytes);
// Wrap Up
Response.Flush();
Response.Close();
Response.End();
Like the OP I was scratching my head for days trying to get this to work, but I did it in the end so I thought I'd share my 'combination' of headers:
if (System.Web.HttpContext.Current.Request.Browser.Browser == "InternetExplorer"
    && System.Web.HttpContext.Current.Request.Browser.Version == "8.0")
{
    System.Web.HttpContext.Current.Response.Clear();
    System.Web.HttpContext.Current.Response.ClearContent();
    System.Web.HttpContext.Current.Response.ClearHeaders();
    System.Web.HttpContext.Current.Response.ContentType = "application/octet-stream";
    System.Web.HttpContext.Current.Response.AppendHeader("Pragma", "public");
    System.Web.HttpContext.Current.Response.AppendHeader("Cache-Control", "private, max-age=60");
    System.Web.HttpContext.Current.Response.AppendHeader("Content-Transfer-Encoding", "binary");
    System.Web.HttpContext.Current.Response.AddHeader("content-disposition", "attachment; filename=" + document.Filename);
    System.Web.HttpContext.Current.Response.AddHeader("content-length", document.Data.LongLength.ToString());
    System.Web.HttpContext.Current.Response.BinaryWrite(document.Data);
}
Hope that saves someone somewhere some pain!
I was running into a similar issue with attempting to stream a PDF over SSL and put this inside an iframe or object. I was finding that my aspx page would keep redirecting to the non-secure version of the URL, and the browser would block it.
I found that switching from an ASPX page to an ASHX handler fixed my redirection problem.
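In case it is useful, here is a minimal sketch of such a handler (the file path and names are placeholders, not my actual code):
using System.Web;

public class PdfHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Placeholder source; load from your database or report engine instead.
        byte[] bytes = System.IO.File.ReadAllBytes(
            context.Server.MapPath("~/App_Data/statement.pdf"));

        context.Response.ContentType = "application/pdf";
        context.Response.AppendHeader("Content-Disposition",
            "inline; filename=\"statement.pdf\"");
        context.Response.BinaryWrite(bytes);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}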