XML Load TimeOut - asp.net

I am using the code below to load an XML document:
XmlDocument xdoc = new XmlDocument();
xdoc.Load("http://mydomain.com/video/list");
In normal situations it works fine, but sometimes I face a response timeout issue.
Sometimes the URL from which I want to load my XML does not respond, and while waiting for it my application also times out.
Please tell me what I should do in such a situation, so that I can either run my other code if the URL does not respond within 5 seconds, or use some other solution that lets my code handle the case where the URL is not returning the XML file.
Thanks

You could try using an HttpWebRequest, which gives you the possibility to set the Timeout for the request. If the remote resource doesn't respond before this timeout value is reached, an exception is thrown, which you can catch and use to inform the user (or fall back to other code).
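A minimal sketch of that approach (requires the System.Net, System.IO and System.Xml namespaces; the 5-second timeout matches the question, and the fallback is just a placeholder):
XmlDocument xdoc = new XmlDocument();
try
{
    // Build the request manually so the timeout can be controlled.
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://mydomain.com/video/list");
    request.Timeout = 5000; // give up after 5 seconds

    using (WebResponse response = request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    {
        xdoc.Load(stream);
    }
}
catch (WebException)
{
    // The URL did not respond in time (or returned an error):
    // run your fallback code here instead of letting the page time out.
}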

Related

Restlet Server not returning proper responses

I have a ServerResource object that is running within a component. Its purpose is to act in many ways like a basic HTTP server. It uses a Representation to acquire a file and return the file's contents to a browser.
The active function for this application is provided below:
public Representation showPage()
{
    Representation rep = null;
    if (fileName != null)
    {
        File path = new File("pages/" + fileName);
        rep = new FileRepresentation(path, MediaType.ALL);
    }
    return rep;
}
Note that "fileName" is the name of an HTML file (or index.html) which was previously passed in as an attribute. The files that this application serves are all in a subdirectory called "pages" as shown in the code. The idea is that a browser sends an HTTP request for an HTML file, and the server returns that file's contents in the same way that Apache would.
Note also that the restlet application is deployed as a JSE application. I am using Restlet 2.1.
An interesting problem occurs when accessing the application. Sometimes, when the request comes from a Firefox browser, the server does not send a response at all. The log output shows the request coming in, but the server simply does not respond, not even with a 404. The browser waits for a response for a time, then times out.
When using Internet Explorer, sometimes the browser times out due to not receiving a response from the server, but sometimes the server also returns a 304 response. My research into this response indicates that it should not be returned at all, especially if the HTML files have no-caching tags included.
Is there something in the code that is causing these non-responses? Is there something missing that is causing the ServerResource object to handle responses so unreliably? Or have I found a bug in Restlet's response mechanisms?
Someone please advise...

How to deliver big files in ASP.NET Response?

I am not looking for an alternative to streaming the file contents from the database; I am looking for the root of the problem. This was running fine up to IIS 6, where we ran our app in classic mode; we then upgraded to IIS 7, running the app pool in integrated pipeline mode, and this problem started.
I have a handler where I have to deliver big files in response to client requests, and I face the following problems.
Files average 4 to 100 MB in size, so let's consider the case of an 80 MB file download.
Buffering On, Slow Start
Response.BufferOutput = true;
This results in a very slow start of the file download: the progress bar does not even appear for a few seconds, typically 3 to 20. The reason is that IIS reads the entire file first, determines the Content-Length, and only then begins the transfer. When the file is played in a video player it runs very, very slowly; an iPad, however, only downloads a fraction of the file first, so it works fast.
Buffering Off, No Content-Length, Fast Start, No Progress
Response.BufferOutput = false;
This results in an immediate start; however, the end client (a typical browser like Chrome) does not know the Content-Length, since IIS does not know it either, so it displays no progress and instead just says "X KB downloaded".
Buffering Off, Manual Content-Length, Fast Start, Progress and Protocol Violation
Response.BufferOutput = false;
Response.AddHeader("Content-Length", file.Length.ToString());
This results in a correct, immediate file download in Chrome etc.; however, in some cases the IIS handler logs a "Remote Client Closed Connection" error (this is very frequent), and WebClient consumers get a protocol violation. This happens on 5 to 10% of all requests, not every request.
My guess at what is happening: IIS does not send anything like 100 Continue when we don't do buffering, and the client might disconnect, not expecting any output. Reading files from the source may take longer, and although I have increased the timeout on the client side, it seems IIS times out and I have no control over that.
Is there any way I can force the Response to send 100 Continue and not let anyone close the connection?
UPDATE
I found the following headers in Firefox/Chrome; nothing here looks unusual enough to explain a protocol violation or bad header.
Access-Control-Allow-Headers:*
Access-Control-Allow-Methods:POST, GET, OPTIONS
Access-Control-Allow-Origin:*
Access-Control-Max-Age:1728000
Cache-Control:private
Content-Disposition:attachment; filename="24.jpg"
Content-Length:22355
Content-Type:image/pjpeg
Date:Wed, 07 Mar 2012 13:40:26 GMT
Server:Microsoft-IIS/7.5
X-AspNet-Version:4.0.30319
X-Powered-By:ASP.NET
UPDATE 2
Turning off recycling still did not offer much, but I have increased MaxWorkerProcess to 8 and I now get fewer errors than before.
But on average, out of 200 requests in one second, 2 to 10 requests fail, and this happens almost every other second.
UPDATE 3
With 5% of requests still failing with "The server committed a protocol violation. Section=ResponseStatusLine": I have another program that downloads content from the web server using WebClient, and it gets this error 4-5 times a second; on average 5% of its requests fail. Is there any way to trace WebClient failures?
Problems Redefined
Zero Byte File Received
IIS closes the connection for some reason; on the client side, WebClient receives 0 bytes for a file that is not zero bytes. We do an SHA1 hash check, which told us that no error was recorded on the IIS web server.
This was my mistake, and it is resolved: we are using Entity Framework, and it was reading dirty (uncommitted) rows because the read was not in a transaction scope; putting it in a transaction scope resolved the issue.
Protocol Violation Exception Raised
WebClient throws a WebException saying "The server committed a protocol violation. Section=ResponseStatusLine".
I know I can enable unsafe header parsing, but that is not the point; my HTTP handler is sending proper headers, and I don't know why IIS sends anything extra (checked in Firefox and Chrome, nothing unusual). This happens only 2% of the time.
UPDATE 4
Found the sc-win32 64 error, and I read somewhere that MinBytesPerSecond in WebLimits must be changed from 240 to 0; still everything is the same. However, I have noticed that whenever IIS logs the sc-win32 64 error, it records HTTP status 200 even though there was an error. I can't turn on Failed Request Tracing for status 200 because it would produce massive log files.
Both of the above problems were solved by changing MinBytesPerSecond as well as disabling sessions; I have added a detailed answer summarizing every point.
When you have set the Content-Length with BufferOutput set to false, the likely reason for the failures is that IIS tries to gzip the file you send; once the Content-Length is set, IIS cannot change it to the compressed size, and the errors start (*).
So keep BufferOutput set to false, and additionally disable gzip in IIS for the files you send - or disable IIS gzip for all files and handle the gzip part programmatically, keeping the files you send out of gzip.
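For example, on IIS 7+ dynamic compression can be switched off in web.config (a sketch; whether your handler's output counts as dynamic or static content depends on your setup):
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="false" />
</system.webServer>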
Some similar questions for the same reason:
ASP.NET site sometimes freezing up and/or showing odd text at top of the page while loading, on load balanced servers
HTTP Compression: Some external scripts/CSS not decompressing properly some of the time
(*) Why not change it back again? Because from the moment you set a header you cannot take it back, unless you have enabled that option in IIS and the headers have not already been sent to the browser.
Follow up
If it is not gzip, the next thing that comes to mind is that the file is sent, but the connection gets delayed for some reason, hits a timeout, and is closed, so you get the "Remote Host Closed The Connection" error.
This can be solved depending on the cause:
The client really closed the connection.
The timeout came from the page itself; if you use a handler, the message would probably be "Page Timed Out".
The timeout came from idle waiting: the page takes longer than the allowed execution time, gets a timeout, and the connection is closed. Again, in this case the message would be "Page Timed Out".
The pool recycled at the moment you sent the file. Disable all pool recycles! Of the causes I can think of right now, this is the most likely.
If it is coming from IIS: in the web site properties, make sure you set the largest possible "Connection Timeout" and enable "HTTP Keep-Alives".
Change the page timeout in web.config (you can also change it programmatically for one specific page):
<httpRuntime executionTimeout="43200" />
Also have a look at:
http://weblogs.asp.net/aghausman/archive/2009/02/20/prevent-request-timeout-in-asp-net.aspx
Session lock
One more thing you need to examine is to not use session state in the handler you use to send the file, because session locks the page action until it finishes, and if a user takes a long time to download a file, a second request may get a timeout.
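For a generic handler, that simply means not opting in to session state: ASP.NET only acquires the session lock for handlers that implement IRequiresSessionState. A sketch (the handler name is made up):
using System.Web;

// Deliberately does NOT implement IRequiresSessionState, so ASP.NET never
// takes the session lock and two downloads by the same user can run in parallel.
public class FileDownloadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // ... stream the file to context.Response here ...
    }
}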
Some related questions:
call aspx page to return an image randomly slow
Replacing ASP.Net's session entirely
Response.WriteFile function fails and gives 504 gateway time-out
The correct way to deliver big files in IIS turned out to be the following:
Set MinBytesPerSecond to zero in WebLimits (this will certainly help improve performance, as IIS chooses to close clients holding Keep-Alive connections with smaller-size transfers).
Allocate more worker processes to the application pool; I have set it to 8. This should be done only if your server is mainly distributing large files: it will cause other sites to perform slower, but it ensures better deliveries. We set it to 8 because this server has only one website and it just delivers huge files.
Turn off App Pool Recycling
Turn off Sessions
Leave Buffering On
Before each of the following steps, check whether Response.IsClientConnected is true; otherwise give up and don't send anything (see the sketch after this list):
Set Content-Length before sending the file
Flush the Response
Write to Output Stream, and Flush in regular intervals
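A rough sketch of a handler following those steps (the file path, content type and block size are illustrative, not from the original answer):
using System.IO;
using System.Web;

public class BigFileHandler : IHttpHandler // no IRequiresSessionState, so sessions stay off
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Hypothetical file; buffering is left on, per the list above.
        FileInfo file = new FileInfo(context.Server.MapPath("~/files/big.bin"));

        if (!context.Response.IsClientConnected) return; // give up, send nothing

        // Set Content-Length before sending the file, then flush the response.
        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Length", file.Length.ToString());
        context.Response.Flush();

        // Write to the output stream, flushing at regular intervals.
        byte[] buffer = new byte[64 * 1024];
        using (FileStream source = file.OpenRead())
        {
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                if (!context.Response.IsClientConnected) return; // client gone, stop
                context.Response.OutputStream.Write(buffer, 0, read);
                context.Response.Flush();
            }
        }
    }
}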
What I would do is use the not so well-known ASP.NET Response.TransmitFile method, as it's very fast (and possibly uses IIS kernel cache) and takes care of all header stuff. It is based on the Windows unmanaged TransmitFile API.
But to be able to use this API, you need a physical file to transfer. So here is pseudo C# code that explains how to do it with a fictional myCacheFilePath physical file path. It also supports client caching. Of course, if you already have a file at hand, you don't need to create that cache:
if (!File.Exists(myCacheFilePath))
{
LoadMyCache(...); // saves the file to disk. don't do this if your source is already a physical file (not stored in a db for example).
}
// we suppose user-agent (browser) cache is enabled
// check appropriate If-Modified-Since header
DateTime ifModifiedSince = DateTime.MaxValue;
string ifm = context.Request.Headers["If-Modified-Since"];
if (!string.IsNullOrEmpty(ifm))
{
try
{
ifModifiedSince = DateTime.Parse(ifm, DateTimeFormatInfo.InvariantInfo);
}
catch
{
// do nothing
}
// file has not changed, just send this information but truncate milliseconds
if (ifModifiedSince == TruncateMilliseconds(File.GetLastWriteTime(myCacheFilePath)))
{
ResponseWriteNotModified(...); // HTTP 304
return;
}
}
Response.ContentType = contentType; // set your file content type here
Response.AddHeader("Last-Modified", File.GetLastWriteTimeUtc(myCacheFilePath).ToString("r", DateTimeFormatInfo.InvariantInfo)); // tell the client to cache that file
// this API uses windows lower levels directly and is not memory/cpu intensive on Windows platform to send one file. It also caches files in the kernel.
Response.TransmitFile(myCacheFilePath);
This piece of code works for me.
It starts the data stream to client immediately.
It shows progress during download.
It doesn't violate HTTP: the Content-Length header is specified and chunked transfer encoding is not used.
protected void PrepareResponseStream(string clientFileName, HttpContext context, long sourceStreamLength)
{
context.Response.ClearHeaders();
context.Response.Clear();
context.Response.ContentType = "application/pdf";
context.Response.AddHeader("Content-Disposition", string.Format("filename=\"{0}\"", clientFileName));
//set cachebility to private to allow IE to download it via HTTPS. Otherwise it might refuse it
//see reason for HttpCacheability.Private at http://support.microsoft.com/kb/812935
context.Response.Cache.SetCacheability(HttpCacheability.Private);
context.Response.Buffer = false;
context.Response.BufferOutput = false;
context.Response.AddHeader("Content-Length", sourceStreamLength.ToString (System.Globalization.CultureInfo.InvariantCulture));
}
protected void WriteDataToOutputStream(Stream sourceStream, long sourceStreamLength, string clientFileName, HttpContext context)
{
PrepareResponseStream(clientFileName, context, sourceStreamLength);
const int BlockSize = 4 * 1024 * 1024;
byte[] buffer = new byte[BlockSize];
int bytesRead;
Stream outStream = context.Response.OutputStream;
while ((bytesRead = sourceStream.Read(buffer, 0, BlockSize)) > 0)
{
outStream.Write(buffer, 0, bytesRead);
}
outStream.Flush();
}

Request handled okay when issued by browser, but not when issued by ASP.NET application on the same machine

I'm debugging two ASP.NET applications running on the same machine under different instances of Cassini and with "custom errors" off. One app is running on port 100 and wants to perform a GET request against the other app running on port 90. So it runs this code:
WebRequest request = WebRequest.Create(
"http://localhost:90/Controller/Action?Param1=foo&Param2=bar");
request.Timeout = 10000;
request.GetResponse();
and the last line throws a WebException with an HTTP 400 code and a null InnerException. If I copy the very same URL to the clipboard and paste it into IE running on the same machine, the request is routed to the app on port 90, its /Controller/Action is invoked, and even the parameters are passed okay.
What could be the problem origin here and how do I solve it?
I think you should try without the params in the URL.
WebRequest request = WebRequest.Create("http://localhost:90/Controller/Action");
request.Timeout = 10000;
request.GetResponse();
If that works, you may need to add some user-agent headers to allow the use of params.
Also you should probably look at WebClient.
MSDN
Personally, I would also look at using IIS Express or IIS to develop this kind of solution.
Just an outsider's observation here: consider making this call to the second web method via an AJAX call from the browser and aggregating the results client-side using JavaScript (jQuery).
I would try using the overload of WebRequest.Create that takes a Uri object; that way you can rule out a fat-fingered URL.
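For example (same URL as in the question):
Uri target = new Uri("http://localhost:90/Controller/Action?Param1=foo&Param2=bar"); // throws UriFormatException here if the URL is malformed
WebRequest request = WebRequest.Create(target);
request.Timeout = 10000;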
Two hours of debugging, and it turned out that the service on port 90 would redirect the request back to the service on port 100 without providing a required parameter in the URL, so the handler in the service on port 100 would throw an exception and return the HTTP 400 that was then reported by GetResponse(). The solution was to change the logic so that there is no redirect for this specific request, because the redirect made no sense for it.
And the jury finds both Cassini and ASP.NET to be not guilty.

First request fails with HTTP 400 (Bad Request) after reading HttpRequest.InputStream

I develop an asmx web service (i.e. ASP.NET 2.0).
There's a piece of code that may read the contents of the HTTP request (via HttpContext.Current.Request.InputStream) while processing it. I realise that InputStream may only be read once for a request, and I make sure I never try to read it more than once.
The problem seems to be that if InputStream happens to be read during the early stages of the application's lifecycle (e.g. after pskill w3wp, during Application_Start), the HTTP request fails with an HTTP 400 - Bad Request error, with no explanation given, no exception thrown and no entry in the httperr log. If it is read later (e.g. within the web method itself), requests run fine whether InputStream is read or not. Application_Start runs fine if InputStream isn't read.
Is this some sort of ASP.NET bug? IIS bug? Or am I doing something wrong by daring to read InputStream? And if so, is there another way to get a look at the "raw" contents of the request without disturbing the inner workings of IIS/ASP.NET?
In short, adding this code within Application_Start is enough to reproduce this error:
using (StreamReader reader = new StreamReader(HttpContext.Current.Request.InputStream))
reader.ReadToEnd();
I couldn't find a way to read the request contents during Application_Start without disturbing the inner workings of ASP.NET/IIS. Instead, I ended up making sure this doesn't happen until Application_Start is over (and also doesn't happen from the moment Application_End starts, which likewise turned out to be problematic and created access violations).
You should not use the using block, because it has the effect of closing the reader and, consequently, the InputStream of the HttpRequest.
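In other words, something like this (a sketch of that suggestion; disposing a StreamReader also closes the underlying stream, so the reader is deliberately left undisposed):
// No using block: disposing the StreamReader would also close
// HttpContext.Current.Request.InputStream.
StreamReader reader = new StreamReader(HttpContext.Current.Request.InputStream);
string body = reader.ReadToEnd();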
I would suggest not attempting to read the Request.InputStream during Application_Start - it's used for initialising the application. Accessing the Request object from within Application_Start results in an exception "Request is not available in this context."
The fact that you are wanting to read the input stream suggests you should be using Application_BeginRequest instead - this has access to request and response.
In Summary:
Application_Start
Fires once when the application starts. While usually triggered by the first request, it occurs before the first request is set up. Don't put request-specific code in here, as it doesn't have access to Request and Response.
Application_BeginRequest
Fires for every request before any page handlers are invoked - you can read the input, write to response, end the request, etc...
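For example, in Global.asax (a minimal sketch; rewinding the stream afterwards is an assumption, so that later readers still see the body):
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Unlike Application_Start, the Request object is available here.
    HttpRequest request = HttpContext.Current.Request;
    var reader = new System.IO.StreamReader(request.InputStream);
    string body = reader.ReadToEnd();   // look at the "raw" request contents
    request.InputStream.Position = 0;   // rewind so the handler can still read it
}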
See these SO articles for more info:
Request is not available in this context
Replacement for HttpContext.Current.Request.ServerVariables["SERVER_NAME"] in Integrated Mode
When does HttpRequest get created?

Calling across two Visual Studio localhosts with WebClient

I have two ASP.NET projects in my solution, and run on different localhost ports when I start debugging. I have a generic handler in site A, which is called by site B:
String url = "http://localhost:1234/UrlOnSiteA.ashx";
WebClient client = new WebClient();
client.Credentials = CredentialCache.DefaultNetworkCredentials;
client.OpenRead(url);
The OpenRead call throws an exception with a 500 error, and I don't know why. The error message is:
System.Net.WebException: The remote server returned an error: (500) Internal Server Error.
Other info:
A breakpoint on the very first line of the handler code isn't hit.
The handler runs properly (and hits the breakpoint) when its URL is used in a browser.
Visual Studio 2008 Professional, running .NET 2.0 sites.
I suspect it's a configuration issue. Any ideas?
I'd diagnose this first by figuring out what's causing the exception. Look at the Response property of the WebException, and read the HTML returned. Any clues? (You may need to disable custom errors in your web.config to see the actual error response.)
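A sketch of pulling that response body out of the exception (ex.Response can be null, hence the check):
try
{
    client.OpenRead(url);
}
catch (WebException ex)
{
    if (ex.Response != null)
    {
        using (var reader = new System.IO.StreamReader(ex.Response.GetResponseStream()))
        {
            string errorHtml = reader.ReadToEnd(); // the handler site's error page, with the real exception details
        }
    }
}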
Another approach to get the same response info would be to use Fiddler, and set the proxy of your WebClient call to the Fiddler proxy address. Then you can use Fiddler to see the response HTML.
A slightly different approach would be to change the Exceptions settings in Visual Studio to break into the debugger whenever a WebException is thrown. You can do this from the Debug...Exceptions... dialog box.
