I am developing a web app with an ASP.NET server side, and I use an iframe for data push.
An ASP.NET handler flushes some JavaScript to the iframe every once in a while:
context.Response.Write("<script language='javascript'>top.update('lala');</script>");
context.Response.Flush();
My problem is that sometimes I don't receive the full text. For example, I will receive only this: update('lala');
One workaround I have is a thread that flushes '..........' every 500 ms. (Then I receive script>......, which completes my JavaScript.)
However, I am sure there must be a way to have Response.Flush() send the whole chunk of data. Does someone have an idea on how to use Response.Flush() properly?
Thank you!
After tons of Google searches, I apparently found the answer. The IIS server was compressing the output with GZIP, which makes it seem to ignore all Response.Flush() calls.
This is turned on by default in IIS7 (including on Windows 7). If you disable it, flushing works fine.
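For reference, disabling it in web.config on IIS7/7.5 looks roughly like this (the urlCompression element under system.webServer is the setting I mean; you can also toggle it from the IIS Manager UI, and I have only tested my own setup):

<configuration>
  <system.webServer>
    <!-- stop IIS compressing dynamic responses so Response.Flush() actually pushes bytes to the client -->
    <urlCompression doDynamicCompression="false" />
  </system.webServer>
</configuration>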
Using .NET 4.0 on IIS 7.5 on Windows 2008 R2.
I would like to output binary content representing various types of documents (images, PDF, Office files, etc.). Assuming the entire content is already in a MemoryStream, I output it like this:
Response.Clear();
Response.AddHeader("Content-Disposition", string.Format("attachment; filename={0}", fileNameSaveAs));
Response.AddHeader("Content-Length", memoryStr.Length.ToString());
Response.ContentType = "application/octet-stream";
Response.OutputStream.Write(memoryStr.ToArray(), 0, (int) memoryStr.Length);
Response.Flush();
The code above is not reliable; files are often corrupted. Clients using various browsers sometimes have an aborted download and sometimes download a file which is unreadable. The likelihood of corruption increases with the file size. Using Fiddler, we found out that the response header reported a content length different from the original file size. So for a quick test we commented out the line Response.AddHeader("Content-Length", ...) and the corruption issue disappeared.
Q1: Is this issue caused by the Dynamic Compression (enabled on IIS7 by default)?
Q2: If answer to Q1 is yes, then is there any elegant solution to inform the client about the Content-Length?
Q3: Removing the "Content-Length" header seems to affect the client's ability to save the file. Example: "Content-Disposition" is initialized with fileNameSaveAs = "One Two Three.pdf". In Firefox, the download dialog defaulted to "One" as the filename. Is that a normal consequence?
Thanks in advance for any help.
I made more tests and clarified a few things, but the result is still not technically satisfactory.
A1. IIS 7.5 Dynamic Compression is not the cause. Download corruption still occurred whether Dynamic Compression, Static Compression, or both were disabled. As soon as the line Response.AddHeader("Content-Length", ...) is commented out in the code, all download issues disappear.
A2. No idea! I really would like to know.
A3. This has nothing to do with the "Content-Length" header. It looks like a consequence of the filename in Content-Disposition containing spaces and not being quoted; wrapping it in quotes (filename="One Two Three.pdf") stops Firefox truncating the name at the first space.
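For completeness, the variant we are running with for now is simply the code from the question minus the explicit Content-Length line (per A1); consider it a sketch of the workaround rather than an answer to Q2:

Response.Clear();
Response.AddHeader("Content-Disposition", string.Format("attachment; filename={0}", fileNameSaveAs));
// No manual Content-Length header: per A1, removing it is what made the corrupted/aborted downloads stop.
Response.ContentType = "application/octet-stream";
Response.OutputStream.Write(memoryStr.ToArray(), 0, (int) memoryStr.Length);
Response.Flush();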
I have an application that allows the user to download a CSV. This works great when not using SSL. However, when using a secure connection I get the "Internet Explorer was not able to open this Internet site" error. I know that the problem has to do with my HTTP caching policy. The following MS Support document explains the issue: http://support.microsoft.com/kb/316431. However, I cannot seem to get it to work. Any ideas?
HttpCachePolicy cachePolicy = Response.Cache;
cachePolicy.SetCacheability(HttpCacheability.Private);
cachePolicy.SetNoStore();
cachePolicy.SetMaxAge(new TimeSpan(0L));
cachePolicy.SetRevalidation(HttpCacheRevalidation.AllCaches);
I have tried a combination of different HttpCacheability types.
I found that clearing the headers before explicitly setting the Cacheability resolved this issue.
Response.ClearHeaders();
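Putting the two together, here is a sketch of the ordering that worked for me (whether SetNoStore() should stay is a judgment call: KB 316431 blames the no-cache/no-store directives over SSL, so that is the first line I would drop if IE still refuses the download):

Response.ClearHeaders();                      // clear whatever cache headers are already queued
HttpCachePolicy cachePolicy = Response.Cache; // then set the policy explicitly
cachePolicy.SetCacheability(HttpCacheability.Private);
cachePolicy.SetNoStore();                     // candidate to remove if downloads still fail (see note above)
cachePolicy.SetMaxAge(new TimeSpan(0L));
cachePolicy.SetRevalidation(HttpCacheRevalidation.AllCaches);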
I am streaming a PDF to the browser in ASP.NET 2.0. This works in all browsers over HTTP and in all browsers except IE over HTTPS. As far as I know, this used to work (over the past 5 years or so) in all versions of IE, but our clients have only recently started to report issues. I suspect the 'Do not save encrypted pages to disk' security option used to be disabled by default and at some point became enabled by default (Internet Options -> Advanced -> Security). Turning this option off helps, as a workaround, but it is not viable as a long-term solution.
The error message I am receiving is:
Internet Explorer cannot download OutputReport.aspx from www.sitename.com.
Internet Explorer was not able to open this Internet site. The requested site is either unavailable or cannot be found. Please try again later.
The tool used to create the PDF is ActiveReports from DataDynamics. Once the PDF is created, here is the code to send it down:
Response.ClearContent()
Response.ClearHeaders()
Response.AddHeader("cache-control", "max-age=1")
Response.ContentType = "application/pdf"
Response.AddHeader("content-disposition", "attachment; filename=statement.pdf")
Response.AddHeader("content-length", mem_stream.Length.ToString)
Response.BinaryWrite(mem_stream.ToArray())
Response.Flush()
Response.End()
Note: If I don't explicitly specify cache-control then .NET sends no-cache on my behalf, so I have tried setting cache-control to: private or public or maxage=#, but none of those seem to work.
Here is the twist: when I run Fiddler to inspect the response headers, everything works fine. The headers that I receive are:
HTTP/1.1 200 OK
Cache-Control: max-age=1
Date: Wed, 29 Jul 2009 17:57:58 GMT
Content-Type: application/pdf
Server: Microsoft-IIS/6.0
MicrosoftOfficeWebServer: 5.0_Pub
X-Powered-By: ASP.NET
X-AspNet-Version: 2.0.50727
content-disposition: attachment; filename=statement.pdf
Content-Encoding: gzip
Vary: Accept-Encoding
Transfer-Encoding: chunked
As soon as I turn Fiddler off and try again, it fails again. One other thing that I noticed is that when Fiddler is running I get a "There is a problem with this website's security certificate" warning message, and I have to click "Continue to this website (not recommended)" to get through. When Fiddler is off, I do not encounter this security warning and it fails right away.
I am curious what is happening between Fiddler and the browser so that it works when Fiddler is running but breaks when it's not, but more importantly, does anyone have any ideas how I could change my code so streaming PDFs to IE will work without making changes to the client machine?
Update: The Fiddler issues are resolved, thank you very much EricLaw, so now it behaves consistently (broken, with or without Fiddler running).
Based on Google searching, there seem to be plenty of reports of this same issue all over the web, each with its own specific combination of response headers that seems to fix the problem for that individual case. I've tried many of these suggestions, including adding an ETag, a Last-Modified date, removing the Vary header (using Fiddler), and dozens of combinations of the Cache-Control and/or Pragma headers. I tried "Content-Transfer-Encoding: binary" as well as "application/force-download" for the ContentType. Nothing has helped so far. There are a few Microsoft KB articles, all of which indicate that Cache-Control: no-cache is the culprit. Any other ideas?
Update: By the way, for completeness, this same issue occurs with Excel and Word outputs as well.
Update: No progress has been made. I emailed the .SAZ file from Fiddler to EricLaw and he was able to reproduce the problem when debugging IE, but there are no solutions yet. Bounty is going to expire...
Your Cache-Control header is incorrect. It should be Cache-Control: max-age=1 with the dash in the middle. Try fixing that first to see if it makes a difference.
Typically, I would say that the most likely culprit is your Vary header, as such headers often cause problems with caching in IE: http://blogs.msdn.com/ieinternals/archive/2009/06/17/9769915.aspx. You might want to try adding an ETag to the response headers.
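If you want to experiment with that, a minimal sketch using the HttpCachePolicy API would be something like this (the tag value and timestamp are placeholders, not a combination I have verified against IE over SSL):

Response.Cache.SetCacheability(HttpCacheability.Private);
Response.Cache.SetETag("\"statement-v1\"");      // placeholder ETag value
Response.Cache.SetLastModified(DateTime.UtcNow); // placeholder Last-Modified timestamp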
Fiddler should have no impact on cacheability (unless you've written rules), and it sounds like you're saying that it does, which suggests that perhaps there's a timing problem of some sort.
>Do not save encrypted pages to disk security option used to be disabled by default
This option is still disabled by default (in IE6, 7, and 8), although IT Administrators can turn it on via Group Policy, and some major companies do so.
Incidentally, the reason you see the certificate error while running Fiddler is that you haven't elected to trust the Fiddler root certificate; see http://www.fiddler2.com/fiddler/help/httpsdecryption.asp for more on this topic.
After two weeks on a wild goose chase, I have not been able to find any combination of code changes that will allow this method of streaming PDF, Excel or Word documents when the 'Do not save encrypted pages to disk' option is turned on.
Microsoft has said this behavior is by design in a number of KB articles and private emails. It appears that when the 'Do not save encrypted pages to disk' option is turned on, IE is behaving correctly and doing what it is told to do. This post is the best resource I have found so far that explains why this setting would be enabled, along with the pros and cons of enabling it:
"The 'Do not save encrypted pages to disk' comes into play when dealing with SSL (HTTPS) connections. Just like a web server can send done information about how to cache a file one can basically set Internet Explorer up to not save files to the cache during an SSL (HTTPS) connection regardless if the web server advises you can.
What is the upside for turning this feature on, security is the number one reason why the feature is turned on. Pages are not stored in the Temporary Internet Files cache.
What is the downside? Slow performance, since nothing is saved to the cache even that 1 byte gif image used a dozen times on the page must be fetched from the webserver each time. To make matters worse some user actions may fail such as downloaded files will be deleted and an error presented or opening PDF documents will fail to name a few scenarios."
The best solution we can find at this point is to communicate to our clients and users that alternatives exist to using this setting:
"Use 'Empty Temporary Internet Files folder when browser is closed'. Each time the browser closes all files will be purged from the cache assuming there is not a lock on a file from another instance of the browser or some external application.
A lot of consideration needs to be given before utilizing 'Do not save encrypted pages to disk'. Sounds like a great security feature and it is but the results of using this feature may cause your help desk calls to go up for download failures or slow performance."
I found that this seemed to work for me:
Dim browser As System.Web.HttpBrowserCapabilities = Request.Browser
If (browser.Browser = "IE") Then
    Response.AppendHeader("cache-control", "private") ' IE only
Else
    Response.AppendHeader("cache-control", "no-cache") ' all others (FF/Chrome tested)
End If
I had a similar problem with PDF files I wanted to stream. Even with Response.ClearHeaders() I saw Pragma and Cache-Control headers added at runtime. The solution was to clear the headers in IIS (right-click -> Properties on the page loading the PDF, then the "HTTP Headers" tab).
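On IIS7 and later, the equivalent of that tab is the customHeaders section in web.config; a sketch of clearing the statically configured headers would look like this (note this only affects the static headers set through the IIS UI, not headers your code or modules add at runtime):

<system.webServer>
  <httpProtocol>
    <customHeaders>
      <clear />
    </customHeaders>
  </httpProtocol>
</system.webServer>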
SOLVED: This is an IE problem, not an application problem.
Fix it with this: http://support.microsoft.com/kb/323308
It worked perfectly for me, after trying for a long time.
ATT: Mr.Dark
We faced a similar problem a long time back. Here is what we did (this is Java EE): in the web application config we add
<mime-mapping>
  <extension>PDF</extension>
  <mime-type>application/octet-stream</mime-type>
</mime-mapping>
This will make any PDF coming from your web application be downloaded instead of the browser trying to render it.
EDIT: It looks like you are streaming it. In that case you set the MIME type to application/octet-stream in your code rather than in the config. So here, instead of
Response.ContentType = "application/pdf"
you will use
Response.ContentType = "application/octet-stream"
What version of IE? I recall that Microsoft released a Hotfix for IE6 for this issue. Hope that is of some use?
I read about your Cache-Control goose chase, but I'll share mine, which met my needs, in case it helps.
Try disabling gzip compression.
Adding it here, hoping that someone might find this useful instead of going through the links.
Here is my code
byte[] bytes = GetDocumentBytes(); // get byte array from DB (GetDocumentBytes is a placeholder for your own data-access call)
Response.Clear();
Response.ClearContent();
Response.ClearHeaders();
Response.Buffer = true;
// Prevent this page from being cached.
// NOTE: we cannot use the CacheControl property, or set the PRAGMA header value due to a flaw re: PDF/SSL/IE
Response.Expires = -1;
Response.ContentType = "application/pdf";
// Specify the number of bytes to be sent
Response.AppendHeader("content-length", bytes.Length.ToString());
Response.BinaryWrite(bytes);
// Wrap Up
Response.Flush();
Response.Close();
Response.End();
Like the OP I was scratching my head for days trying to get this to work, but I did it in the end so I thought I'd share my 'combination' of headers:
if (System.Web.HttpContext.Current.Request.Browser.Browser == "InternetExplorer"
    && System.Web.HttpContext.Current.Request.Browser.Version == "8.0")
{
    System.Web.HttpContext.Current.Response.Clear();
    System.Web.HttpContext.Current.Response.ClearContent();
    System.Web.HttpContext.Current.Response.ClearHeaders();
    System.Web.HttpContext.Current.Response.ContentType = "application/octet-stream";
    System.Web.HttpContext.Current.Response.AppendHeader("Pragma", "public");
    System.Web.HttpContext.Current.Response.AppendHeader("Cache-Control", "private, max-age=60");
    System.Web.HttpContext.Current.Response.AppendHeader("Content-Transfer-Encoding", "binary");
    System.Web.HttpContext.Current.Response.AddHeader("content-disposition", "attachment; filename=" + document.Filename);
    System.Web.HttpContext.Current.Response.AddHeader("content-length", document.Data.LongLength.ToString());
    System.Web.HttpContext.Current.Response.BinaryWrite(document.Data);
}
Hope that saves someone somewhere some pain!
I was running into a similar issue when attempting to stream a PDF over SSL and put it inside an iframe or object. I found that my .aspx page kept redirecting to the non-secure version of the URL, and the browser would block it.
I found that switching from an ASPX page to an ASHX handler fixed my redirection problem.
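In case it helps, here is a rough sketch of the kind of handler I mean; the class name, file path, and headers are placeholders to adapt, not the exact code I shipped:

<%@ WebHandler Language="C#" Class="StreamPdf" %>

using System.Web;

public class StreamPdf : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Placeholder source: load the PDF bytes from wherever they live (file, DB, report engine).
        byte[] pdfBytes = System.IO.File.ReadAllBytes(context.Server.MapPath("~/App_Data/statement.pdf"));

        context.Response.ContentType = "application/pdf";
        context.Response.AddHeader("content-disposition", "inline; filename=statement.pdf");
        context.Response.BinaryWrite(pdfBytes);
    }

    // The handler keeps no per-request state, so one instance can be reused.
    public bool IsReusable
    {
        get { return true; }
    }
}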
This problem started on a different board, but Dave Ward, who was very prompt and helpful there, is also here, so I'd like to pick up here for what is hopefully the last remaining piece of the puzzle.
Basically, I was looking for a way to push constant updates to a web page from a long-running process. I thought AJAX was the way to go, but Dave has a nice article about using JavaScript. I integrated it into my application and it worked great on my client machine, but NOT on my server at WebHost4Life. I have another server at Brinkster and decided to try it there, and it DOES work. All the code is the same on my client machine, WebHost4Life, and Brinkster, so there's obviously something going on with WebHost4Life.
I'm planning to write an email to them or request technical support, but I'd like to be proactive and try to figure out what could be going on at their end to cause this difference. I did everything I could in my code to turn off buffering, like Page.Response.BufferOutput = False. What server settings could they have implemented to cause this difference? Is there any way I could circumvent it on my own without their help? If not, what would they need to do?
For reference, a working, simpler version of my application is located at http://www.jasoncomedy.com/javascriptfun/javascriptfun.aspx and the same version that isn't working is located at http://www.tabroom.org/Ajaxfun/Default.aspx. You'll notice that in the working version you get updates with each step, but in the one that doesn't work, it sits there for a long time until everything is done and then sends all the updates to the client at once ... and that makes me sad.
Hey, Jason. Sorry you're still having trouble with this.
What I would do is set up a simple page like:
protected void Page_Load(object sender, EventArgs e)
{
    for (int i = 0; i < 10; i++)
    {
        Response.Write(i + "<br />");
        Response.Flush();
        Thread.Sleep(1000); // requires using System.Threading;
    }
}
As we discussed before, make sure the .aspx file is empty of any markup other than the @Page declaration. Extra markup can sometimes trigger page buffering when it wouldn't normally have happened.
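In other words, the whole .aspx file can be as small as this one line (the file and class names here are just examples):

<%@ Page Language="C#" AutoEventWireup="true" CodeFile="FlushTest.aspx.cs" Inherits="FlushTest" %>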
Then, point the tech support guys to that file and describe the desired behavior (10 updates, 1 per second). I've found that giving them a simple test case goes a long way toward getting these things resolved.
Definitely let us know what it ends up being. I'm guessing some sort of inline caching or reverse proxy, but I'm curious.
I don't know that you can force buffering - but a reverse proxy server between you and the server would affect buffering (since the buffer then affects the proxy's connection - not your browser's).
I've done some fruitless research on this one, but I'll share my line of thinking in the dim hope that it helps.
IIS is one of the things sitting between client and server in this case, so it might be useful to know what version of IIS is involved in each case -- and to investigate if there's some way that IIS can perform its own buffering on an open connection.
Though it's not quite on the money, this article about IIS6 vs. IIS5 is the kind of thing I'm thinking of.
You should make sure that neither IIS nor any other filter is trying to compress your response. It is very possible that your production server has IIS compression enabled for dynamic pages such as those with the .aspx suffix, and your development server does not.
If this is the case, IIS may be waiting for the entire response (or a sizeable chunk) before it attempts to compress and send any result back to the client.
I suggest using Fiddler to monitor the response from your production server and figure out if responses are being gzip'd.
If response compression does turn out to be the problem, you can instruct IIS to ignore compression for specific responses via the Content-Encoding: identity header.
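Setting that from ASP.NET is just a header append; whether IIS actually honors it as a "don't compress" hint is the part I can't vouch for, so treat it as an experiment:

Response.AppendHeader("Content-Encoding", "identity"); // identity = no content coding applied to this response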
The issue is that IIS will further buffer output (beyond ASP.NET's buffering) if you have dynamic gzip compression turned on (it is by default these days).
Therefore, to stop IIS buffering your response, there's a little hack you can use: fool IIS into thinking that the client can't handle compression by overwriting the Request.Headers["Accept-Encoding"] header (yes, Request.Headers, trust me):
Response.BufferOutput = false;
Request.Headers["Accept-Encoding"] = ""; // suppresses gzip compression on output
As it sends the response, the IIS compression filter checks the request headers for Accept-Encoding: gzip ... and if it's not there, it doesn't compress (and therefore doesn't further buffer the output).