Why is Response.BufferOutput = False not working? - asp.net

This problem started on a different board, but Dave Ward, who was very prompt and helpful there, is also here, so I'd like to pick up here for hopefully the last remaining piece of the puzzle.
Basically, I was looking for a way to do constant updates to a web page from a long process. I thought AJAX was the way to go, but Dave has a nice article about using JavaScript. I integrated it into my application and it worked great on my client, but NOT on my server, WebHost4Life. I have another server at Brinkster and decided to try it there, and it DOES work. All the code is the same on my client, WebHost4Life, and Brinkster, so there's obviously something going on with WebHost4Life.
I'm planning to write an email to them or request technical support, but I'd like to be proactive and try to figure out what could be going on at their end to cause this difference. I did everything I could in my code to turn off buffering, like Page.Response.BufferOutput = False. What server settings could they have implemented to cause this difference? Is there any way I could circumvent it on my own without their help? If not, what would they need to do?
For reference, a working, simpler version of my application is located at http://www.jasoncomedy.com/javascriptfun/javascriptfun.aspx and the same version that isn't working is located at http://www.tabroom.org/Ajaxfun/Default.aspx. You'll notice in the working version, you get updates with each step, but in the one that doesn't work, it sits there for a long time until everything is done and then sends all the updates to the client at once ... and that makes me sad.

Hey, Jason. Sorry you're still having trouble with this.
What I would do is set up a simple page like:
protected void Page_Load(object sender, EventArgs e)
{
    for (int i = 0; i < 10; i++)
    {
        Response.Write(i + "<br />");
        Response.Flush();
        Thread.Sleep(1000);
    }
}
As we discussed before, make sure the .aspx file is empty of any markup other than the @ Page declaration. Stray markup can sometimes trigger page buffering when it wouldn't normally have happened.
Then, point the tech support guys to that file and describe the desired behavior (10 updates, 1 per second). I've found that giving them a simple test case goes a long way toward getting these things resolved.
Definitely let us know what it ends up being. I'm guessing some sort of inline caching or reverse proxy, but I'm curious.

I don't know that you can force unbuffered output from your end - but a reverse proxy server sitting between the client and the server would affect buffering (since the server's output then feeds the proxy's connection, not your browser's).

I've done some fruitless research on this one, but I'll share my line of thinking in the dim hope that it helps.
IIS is one of the things sitting between client and server in this case, so it might be useful to know which version of IIS is involved in each case -- and to investigate whether there's some way IIS can perform its own buffering on an open connection.
Though it's not quite on the money, this article about IIS 6 vs. IIS 5 is the kind of thing I'm thinking of.

You should make sure that neither IIS nor any other filter is trying to compress your response. It is very possible that your production server has IIS compression enabled for dynamic pages such as those with the .aspx suffix, and your development server does not.
If this is the case, IIS may be waiting for the entire response (or a sizeable chunk) before it attempts to compress and send any result back to the client.
I suggest using Fiddler to monitor the response from your production server and figure out whether responses are being gzipped.
If response compression does turn out to be the problem, you can instruct IIS to skip compression for specific responses via the Content-Encoding: identity header.
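If you want to try that from code, here is a minimal sketch (the header name and value come from the suggestion above; whether IIS honors it depends on the host's compression configuration, so treat this as an experiment rather than a guaranteed fix):
protected void Page_Load(object sender, EventArgs e)
{
    // Declare that this response uses no content coding; the goal is to keep
    // the compression filter (and its buffering) out of the picture.
    Response.BufferOutput = false;
    Response.AppendHeader("Content-Encoding", "identity");

    for (int i = 0; i < 10; i++)
    {
        Response.Write(i + "<br />");
        Response.Flush();
        System.Threading.Thread.Sleep(1000);
    }
}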

The issue is that IIS will further buffer output (beyond ASP.NET's buffering) if you have dynamic gzip compression turned on (it is by default these days).
Therefore, to stop IIS buffering your response, there's a little hack you can do to fool IIS into thinking the client can't handle compression: overwrite the Request.Headers["Accept-Encoding"] header (yes, Request.Headers, trust me):
Response.BufferOutput = false;
Request.Headers["Accept-Encoding"] = ""; // suppresses gzip compression on output
As it's sending the response, the IIS compression filter checks the request headers for Accept-Encoding: gzip ... and if it's not there, it doesn't compress (and therefore doesn't further buffer) the output.

Related

ASP - Response.Flush flushes partial data

I am developing a web app with an ASP server side, and I use an iframe for data push.
An ASP handler flushes some JavaScript to the iframe every once in a while:
context.Response.Write("<script language='javascript'>top.update('lala');</script>");
context.Response.Flush();
My problem is that sometimes, when I receive the data, I don't get the full text. For example, I will receive this: update('lala');
One workaround I have is a thread that flushes '..........' every 500 ms. (Then I will receive script>...... which completes my JavaScript.)
However, I am sure there must be a way to have Response.Flush() send the whole chunk of data. Does someone have an idea of how to use Response.Flush() properly?
Thank you!
Apparently, after tons of Google searches, I found the answer. The IIS server was compressing the output with GZIP, and it then seems to ignore all Response.Flush calls.
This is turned on by default in IIS 7 and on Windows 7. If you disable it, it works fine.
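For reference, dynamic compression can also be switched off per application rather than server-wide; a sketch of the relevant web.config fragment (IIS 7+ integrated pipeline assumed; on some shared hosts this section is locked and has to be changed by the host) would look like this:
<configuration>
  <system.webServer>
    <!-- Turn off gzip/deflate for dynamic responses (.aspx pages, handlers)
         so that Response.Flush() reaches the client immediately. -->
    <urlCompression doDynamicCompression="false" />
  </system.webServer>
</configuration>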

How to deliver big files in ASP.NET Response?

I am not looking for an alternative to streaming file contents from the database; rather, I am looking for the root of the problem. This was running fine up to IIS 6, where we ran our app in classic mode. Now we have upgraded to IIS 7 and run the app pool in integrated pipeline mode, and this problem started.
I have a handler where I have to deliver big files in response to client requests, and I face the following problems:
Files are of average size 4 to 100 MB, so let's consider the case of an 80 MB file download.
Buffering On, Slow Start
Response.BufferOutput = true;
This results in a very slow start to the download: the progress bar does not appear for several seconds, typically 3 to 20. The reason is that IIS reads the entire file first, determines the Content-Length, and only then begins the file transfer. When the file is played in a video player it runs very, very slowly; the iPad, however, only downloads a fraction of the file first, so it works fast.
Buffering Off, No Content-Length, Fast Start, No Progress
Response.BufferOutput = false;
This results in an immediate start; however, the end client (a typical browser like Chrome) does not know the Content-Length, because IIS does not know it either, so it does not display progress and instead just says "X KB downloaded".
Buffering Off, Manual Content-Length, Fast Start, Progress and Protocol Violation
Response.BufferOutput = false;
Response.AddHeader("Content-Length", file.Length.ToString());
This results in a correct, immediate file download in Chrome etc.; however, in some cases the IIS handler produces a "Remote Client Closed Connection" error (this is very frequent), and other WebClient consumers get a protocol violation. This happens on 5 to 10% of requests, not every request.
I guess what is happening is that IIS does not send anything like a 100 Continue when we don't buffer, and the client may disconnect because it isn't expecting any output. Reading the file from its source may take a while, and although I have increased the timeout on the client side, IIS seems to time out and I have no control over that.
Is there any way I can force the Response to send 100 Continue and not let anyone close the connection?
UPDATE
I found the following headers in Firefox/Chrome; nothing here seems unusual for a protocol violation or bad header.
Access-Control-Allow-Headers:*
Access-Control-Allow-Methods:POST, GET, OPTIONS
Access-Control-Allow-Origin:*
Access-Control-Max-Age:1728000
Cache-Control:private
Content-Disposition:attachment; filename="24.jpg"
Content-Length:22355
Content-Type:image/pjpeg
Date:Wed, 07 Mar 2012 13:40:26 GMT
Server:Microsoft-IIS/7.5
X-AspNet-Version:4.0.30319
X-Powered-By:ASP.NET
UPDATE 2
Turning off recycling still did not help much, but I have increased MaxWorkerProcesses to 8 and I now get fewer errors than before.
But on average, out of 200 requests in one second, 2 to 10 requests fail, and this happens almost every other second.
UPDATE 3
With about 5% of requests still failing with "The server committed a protocol violation. Section=ResponseStatusLine", I have another program that downloads content from the web server using WebClient, and it gives this error 4 to 5 times a second; on average, 5% of requests fail. Is there any way to trace a WebClient failure?
Problems Redefined
Zero Byte File Received
IIS closes the connection for some reason; on the client side, in WebClient, I receive 0 bytes for a file that is not zero bytes. We do an SHA1 hash check, which told us that no error is recorded on the IIS web server.
This was my mistake, and it's resolved: we are using Entity Framework, and it was reading dirty (uncommitted) rows because the read was not in a transaction scope; putting it in a transaction scope resolved this issue.
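For what it's worth, the fix was roughly this shape (an illustrative sketch, not the actual code; MyFilesContext, the query, and WriteFileToResponse are hypothetical placeholders):
using System.Transactions;

// Wrap the Entity Framework read in an explicit transaction with ReadCommitted
// isolation so uncommitted rows are never returned.
var options = new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted };
using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
using (var db = new MyFilesContext()) // hypothetical DbContext
{
    var file = db.Files.Single(f => f.Id == requestedId); // hypothetical entity/query
    WriteFileToResponse(file);                            // hypothetical helper
    scope.Complete();
}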
Protocol Violation Exception Raised
WebClient throws a WebException saying "The server committed a protocol violation. Section=ResponseStatusLine".
I know I can enable unsafe header parsing, but that is not the point; it is my HTTP handler that is sending proper headers, and I don't know why IIS would send anything extra (I checked in Firefox and Chrome, nothing unusual). This happens only 2% of the time.
UPDATE 4
I found sc-win32 error 64 in the IIS logs, and I read somewhere that MinBytesPerSecond in WebLimits must be changed from 240 to 0, but everything is still the same. However, I have noticed that whenever IIS logs the sc-win32 64 error, it records the HTTP status as 200 even though there was some error. I can't turn on Failed Request Tracing for status 200 because it would produce massive log files.
Both of the above problems were solved by changing MinBytesPerSecond as well as disabling sessions; I have added a detailed answer summarizing every point.
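For anyone looking for where that setting lives: MinBytesPerSecond is a server-wide IIS setting; an illustrative applicationHost.config fragment (assuming IIS 7.5) would look like this:
<!-- applicationHost.config, server level; zero disables the minimum-throughput
     check that otherwise lets IIS drop slow clients. -->
<system.applicationHost>
  <webLimits minBytesPerSecond="0" />
</system.applicationHost>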
When you have set the Content-Length with BufferOutput set to false, the possible reason for the failures is that IIS tries to gzip the file you send, and because you have set the Content-Length, IIS cannot change it to the compressed length, so the errors start (*).
So keep BufferOutput false and, second, disable gzip in IIS for the files you send - or disable IIS gzip for all files and handle the compression programmatically yourself, keeping the files you send out of gzip.
Some similar questions for the same reason:
ASP.NET site sometimes freezing up and/or showing odd text at top of the page while loading, on load balanced servers
HTTP Compression: Some external scripts/CSS not decompressing properly some of the time
(*) Why not change it again? Because from the moment you set a header you cannot take it back, unless you have enabled that option in IIS and the headers have not already been sent to the browser.
Follow up
If it is not gzip, the next thing that comes to mind is that the file is sent, but for some reason the connection gets delayed, hits a timeout, and is closed. So you get the "Remote Host Closed The Connection" error.
This can be solved, depending on the cause:
The client really closed the connection.
The timeout comes from the page itself, if you use a handler (in that case the message would probably be "Page Timed Out").
The timeout comes from idle waiting: the page takes longer than the execution timeout, gets a timeout, and closes the connection (again, the message would probably be "Page Timed Out").
The pool recycles at the moment you send the file. Disable all pool recycling! This is the most likely cause I can think of right now.
If it is coming from IIS, go to the web site properties and make sure you set the largest possible "Connection Timeout" and enable "HTTP Keep-Alives".
You can increase the page timeout in web.config (or programmatically for one specific page only):
<httpRuntime executionTimeout="43200" />
Also have a look at:
http://weblogs.asp.net/aghausman/archive/2009/02/20/prevent-request-timeout-in-asp-net.aspx
Session lock
One more thing you need to check: do not use session state on the handler you use to send the file, because session state locks the request until it finishes, and if a user takes a long time to download a file, a second request may time out waiting for the lock. A minimal sketch of a session-free handler follows the links below.
Some related questions:
call aspx page to return an image randomly slow
Replacing ASP.Net's session entirely
Response.WriteFile function fails and gives 504 gateway time-out
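The sketch mentioned above: a download handler that stays out of the session lock simply by not opting into session state (the class name and file path are purely illustrative):
using System.Web;

// Because this handler implements neither IRequiresSessionState nor
// IReadOnlySessionState, ASP.NET never acquires the session lock for it,
// so a long download cannot block the user's other requests.
public class FileDownloadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/octet-stream";
        context.Response.TransmitFile(context.Server.MapPath("~/files/big.bin")); // placeholder path
    }
}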
To summarize, the correct way to deliver big files in IIS turned out to be the following:
Set MinBytesPerSecond to zero in WebLimits. (This certainly helps performance, because IIS otherwise chooses to close KeepAlive connections of clients with slower transfer rates.)
Allocate more worker processes to the application pool; I have set it to 8. This should only be done if your server is mainly distributing large files; it will make other sites on the box slower, but it ensures better deliveries. We set it to 8 because this server has only one website and it just delivers huge files.
Turn off app pool recycling.
Turn off sessions.
Leave buffering on.
Before each of the following steps, check whether Response.IsClientConnected is true; otherwise give up and don't send anything:
Set Content-Length before sending the file.
Flush the response.
Write to the output stream, and flush at regular intervals (a sketch of these steps follows the list).
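A rough sketch of those response-side steps (names, content type, and buffer size are illustrative, not the exact production code):
using System.Globalization;
using System.IO;
using System.Web;

static void SendFile(HttpContext context, Stream source, long length, string fileName)
{
    HttpResponse response = context.Response;

    if (!response.IsClientConnected) return;   // give up quietly if the client is already gone

    // 1. Set Content-Length (and the usual download headers) before sending the file.
    response.ContentType = "application/octet-stream";
    response.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
    response.AddHeader("Content-Length", length.ToString(CultureInfo.InvariantCulture));

    // 2. Flush so the headers go out immediately.
    if (!response.IsClientConnected) return;
    response.Flush();

    // 3. Write to the output stream in blocks, flushing at regular intervals.
    byte[] buffer = new byte[64 * 1024];
    int read;
    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
    {
        if (!response.IsClientConnected) return;
        response.OutputStream.Write(buffer, 0, read);
        response.Flush();
    }
}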
What I would do is use the not-so-well-known ASP.NET Response.TransmitFile method, as it's very fast (and possibly uses the IIS kernel cache) and takes care of all the header stuff. It is based on the unmanaged Windows TransmitFile API.
But to be able to use this API, you need a physical file to transfer. So here is pseudo C# code that explains how to do it with a fictional myCacheFilePath physical file path. It also supports client caching. Of course, if you already have a file at hand, you don't need to create the cache:
if (!File.Exists(myCacheFilePath))
{
    LoadMyCache(...); // saves the file to disk. Don't do this if your source is already a physical file (not stored in a db, for example).
}

// We suppose the user-agent (browser) cache is enabled:
// check the If-Modified-Since header if present.
DateTime ifModifiedSince = DateTime.MaxValue;
string ifm = context.Request.Headers["If-Modified-Since"];
if (!string.IsNullOrEmpty(ifm))
{
    try
    {
        ifModifiedSince = DateTime.Parse(ifm, DateTimeFormatInfo.InvariantInfo);
    }
    catch
    {
        // do nothing
    }

    // The file has not changed: just send that information, but truncate milliseconds first.
    if (ifModifiedSince == TruncateMilliseconds(File.GetLastWriteTime(myCacheFilePath)))
    {
        ResponseWriteNotModified(...); // HTTP 304
        return;
    }
}

Response.ContentType = contentType; // set your file content type here
Response.AddHeader("Last-Modified", File.GetLastWriteTimeUtc(myCacheFilePath).ToString("r", DateTimeFormatInfo.InvariantInfo)); // tell the client to cache that file

// TransmitFile uses lower-level Windows APIs directly and is not memory/CPU intensive
// for sending a single file. It also caches files in the kernel.
Response.TransmitFile(myCacheFilePath);
This piece of code works for me.
It starts streaming the data to the client immediately.
It shows progress during the download.
It doesn't violate HTTP: the Content-Length header is specified and chunked transfer encoding is not used.
protected void PrepareResponseStream(string clientFileName, HttpContext context, long sourceStreamLength)
{
    context.Response.ClearHeaders();
    context.Response.Clear();
    context.Response.ContentType = "application/pdf";
    context.Response.AddHeader("Content-Disposition", string.Format("filename=\"{0}\"", clientFileName));
    // Set cacheability to private to allow IE to download it via HTTPS; otherwise it might refuse it.
    // See the reason for HttpCacheability.Private at http://support.microsoft.com/kb/812935
    context.Response.Cache.SetCacheability(HttpCacheability.Private);
    context.Response.Buffer = false;
    context.Response.BufferOutput = false;
    context.Response.AddHeader("Content-Length", sourceStreamLength.ToString(System.Globalization.CultureInfo.InvariantCulture));
}

protected void WriteDataToOutputStream(Stream sourceStream, long sourceStreamLength, string clientFileName, HttpContext context)
{
    PrepareResponseStream(clientFileName, context, sourceStreamLength);
    const int BlockSize = 4 * 1024 * 1024;
    byte[] buffer = new byte[BlockSize];
    int bytesRead;
    Stream outStream = context.Response.OutputStream;
    while ((bytesRead = sourceStream.Read(buffer, 0, BlockSize)) > 0)
    {
        outStream.Write(buffer, 0, bytesRead);
    }
    outStream.Flush();
}

Does Page Unload get called after the response has left IIS?

I'm doing some diagnostic logging in the Page_Unload event of an ASP.NET application. This logging can take a fair bit of time (about 100 ms). Will the response stream get held up by the code in the Page_Unload handler? I could do the work asynchronously using the thread pool, but I'd rather not if it won't affect the client's response time.
More information:
@thorkia is correct in that the documentation says Page_Unload is called after the response is sent to the client, but in my testing (as advised by @steve) it does block. I've tried Cassini, IIS Express, and full IIS 7.5 (on a test server), with both release and debug builds, with and without a debugger attached. And, grasping at straws, I tried putting Async="true" in the Page directive. I tried with Fiddler (streaming enabled) and without Fiddler. I've tried IE9 and Firefox. If the documentation is "correct", then I wonder whether it does send the response but perhaps doesn't "finish it off" (whatever that means; I'll need to check the HTTP spec), so the page doesn't render in the browser? But my understanding was that a client browser starts to render the page as it receives the bytes, so that doesn't make sense to me either. I've also tried looking at the code in ILSpy, but I think that might take me a lot of time.
Now I'm intrigued; am I doing something wrong, or is the documentation misleading?
Why not try it?
protected void Page_UnLoad(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine("In Page_UnLoad");
    System.Threading.Thread.Sleep(10000);
    System.Diagnostics.Debug.WriteLine("Leaving Page_UnLoad");
}
According to MSDN (http://msdn.microsoft.com/en-us/library/ms178472.aspx), the Page Unload stage is only called after the data has been sent to the client.
Taking a long time over your logging and cleanup will not affect the client's response time for that request, but it could affect future requests if lots of pages are waiting to be unloaded.
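If the blocking does turn out to matter, one option the question already hints at is pushing the logging onto the thread pool; a rough sketch (BuildDiagnosticEntry and WriteDiagnosticLog are hypothetical helpers, and the log call is assumed to be thread-safe):
protected void Page_Unload(object sender, EventArgs e)
{
    // Capture what you need from the page/request first; HttpContext should not
    // be used from the worker thread after the request has ended.
    string entry = BuildDiagnosticEntry();

    System.Threading.ThreadPool.QueueUserWorkItem(_ => WriteDiagnosticLog(entry));
}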

Setting optimum http caching headers and server params in ASP.Net MVC and IIS 7.5

I have an ASP.Net site (happens to be MVC, but that's not relevant here) with a few pages I'd like cached really well.
Specifically I'd like to achieve:
output cached on the server for 2 hours.
if the file content on the server changes, that output cache should be flushed for that page
cached in the browser for 10 minutes (i.e. don't even ask the server if it's that fresh)
when the browser does make an actual subsequent request, I'd like it to use etags, so that the server can return a 304 if not modified.
(note - time values above are indicative examples only)
1) and 2) I can achieve with Response.Cache.SetCacheability(HttpCacheability.Server).
I know 3) can be achieved by using max-age and Cache-Control: private.
I can emit ETags with Response.Cache.SetETagFromFileDependencies();
but I can't seem to get all of these things to work together. Here's what I have:
Response.Cache.SetCacheability(HttpCacheability.ServerAndPrivate);
Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
Response.Cache.SetETagFromFileDependencies();
Response.Cache.SetValidUntilExpires(true);
Response.Cache.SetMaxAge(TimeSpan.FromSeconds(60 * 10));
Is the scenario I want possible? In particular:
can browsers do both 3) and 4) like that? When Firefox issues a new request after the page expires in its local cache, it does indeed send the ETag the server responded with before, but I get a 200 response.
setting the variables as above, where would I set the duration of the output caching?
Thanks for any tips!
I'm not sure if you've solved this problem yet (several months later...), but it should be possible.
SetMaxAge sets the amount of "guaranteed" fresh time. If you additionally send an ETag, you'll have satisfied 3) and 4). Requirements 1) and 2) can be solved orthogonally with whatever server-side caching mechanism you use; I've never used the ASP.NET server-side cache quite like this, but it's almost certainly possible.
I'd remove extraneous headers from your responses, such as the revalidation directive from SetRevalidation - why would that be necessary?
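A sketch of how that might look all together (purely illustrative; the dependency file path is a placeholder and the durations simply echo the question):
// Server output cache + private browser cache, invalidated when the source file changes.
Response.AddFileDependency(Server.MapPath("~/content/source.html")); // placeholder dependency
Response.Cache.SetCacheability(HttpCacheability.ServerAndPrivate);
Response.Cache.SetMaxAge(TimeSpan.FromMinutes(10));                  // requirement 3: 10 minutes in the browser
Response.Cache.SetETagFromFileDependencies();                        // requirement 4: ETag for 304 revalidation
Response.Cache.SetLastModifiedFromFileDependencies();
Response.Cache.SetExpires(DateTime.Now.AddHours(2));                 // roughly requirement 1: 2 hours on the server
Response.Cache.SetValidUntilExpires(true);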

ASP.NET application exhibits strange behaviour through firewall

This problem has been solved thanks to your suggestions. See the bottom for details. Thanks very much for your help!
Our ASP.NET website is accessed from several specific and highly secure international locations. It has been operating fine, but we have added another client location which is exhibiting very strange behaviour.
In particular, when the user enters search criteria and clicks the search button the result list returns empty. It doesn't even show the '0 results returned' text, so it is as if the Repeater control did not bind at all. Similar behaviour appears in some, but not all, other parts of the site. The user is able to log in to the site fine and their profile information is displayed.
I have logged in to the site locally using exactly the same credentials as them and the site works well from here. We have gone through the steps carefully so I am confident it is not a user issue.
I bind the search results in the Page_Load of the search results page the first time it is loaded (the criteria are in the query string), i.e.:
if (!IsPostBack) {
BindResults();
}
I can replicate exactly the same behaviour locally by commenting out the BindResults() method call.
Does anybody know how the value of IsPostBack is calculated? Is it possible that their highly secure firewall setup could cause IsPostBack to always return true, even when it is a redirect from another page? That could be a red herring, as the problem might be elsewhere, but it does exactly replicate the result.
I have no access to the site, so troubleshooting is restricted to giving them instructions and asking for them to tell me the result.
Thanks for your time!
Appended info: The client is behind a Microsoft ISA 2006 firewall running default rules. The site has been added to Internet Explorer's trusted sites list and tried in Firefox and Google Chrome, all with the same result.
SOLUTION: The winner for me was the suggestion to use Fiddler. What an excellent tool that no web developer should be without. Using it, I was able to strip various headers from the request until I reproduced the problem. There were actually two factors that caused this bug, as is so often the case with such confusing issues.
Factor one – Where possible, the web application uses GZIP compression, as supported by all major browsers. The firewall was stripping off the request header that indicates GZIP decompression support (Accept-Encoding: gzip, deflate).
Factor two – A bug in my code meant that some processing was bypassed when the content was sent uncompressed. This problem was not noticed before because the application is used by a limited audience, all of whom supported GZIP decompression.
If they're at all tech-savvy, I would have them download Fiddler or something similar, capture the entire HTTP session, and then send you the saved session. Maybe something in there will stick out.
Meanwhile, see if you can get an install of ISA Server (an evaluation install if you have to, or one from MSDN if you have or know anyone with a subscription) and see if you can replicate it locally.
Is it possible the client has disabled JavaScript and it's not picking up the __EVENTTARGET form value?
It might be some sort of proxy which turns a given POST request into a GET request...
I am not sure how IsPostBack is calculated, but my guess would be that it checks the HTTP request to see whether it's a POST or a GET...
Oh yeah, and it's definitely NOT "__EVENTTARGET", by the way...
I know this because Ra-Ajax does NOT pass any of those parameters to the server, and Ra-Ajax requests are still processed as IsPostBack requests...
Location, location, location. Check the user's culture. Normally that causes issues.
Could you create a test POST page that passes the same things your search page does, and in its Page_Load write back everything in the post to make sure the values are getting passed, particularly __VIEWSTATE?
foreach (string key in Request.Form)
{
    Response.Write("<br>" + key + "=" + Request.Form[key]);
}
Then ask one of the users to forward back what they see on that test page.
EDIT: There is documentation that some firewalls can corrupt the VIEWSTATE, along with some methods to get around it: View State Overview
Check the IIS logs to see if the request even makes it to your server. The ISA setup might be caching the initial request and serving that up in the succeeding requests.
