When working with HTTP modules, has anyone noticed that the final two events in the pipeline -- PreSendRequestHeaders and PreSendRequestContent -- don't always run?
I've verified that code bound to EndRequest runs, but the same code does not run when it's bound to either PreSendRequestHeaders or PreSendRequestContent.
Is there a reason why? I thought perhaps it was a caching issue (with a 304 Not Modified, you don't actually send content...), but I've cleared caches and determined that the server is returning 200 OK, which would indicate that it sent content.
This is a problem because the response's StatusCode defaults to 200, and my understanding is that it doesn't get updated to something like a 404 or 206 until those two final events. If I check the StatusCode during EndRequest, it always reads 200.
Isn't this related to the IIS 7 integrated pipeline?
To be verified, but I think those events are only triggered when IIS 7 is running in integrated pipeline mode.
I'm injecting a cookie header on the PreSendRequestHeaders event and have yet to run into an issue of it not firing...
Maybe it has to do with HttpResponse.BufferOutput. If buffering is turned off, it seems like all of the headers and some of the content would have already been sent by the time these events fire.
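For reference, a minimal sketch of a module wiring up the three events in question might look like this (the LoggingModule name and the logging helper are illustrative, not from the original post; the module still has to be registered in web.config):

using System;
using System.Web;

public class LoggingModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        // Fires reliably at the end of the pipeline.
        context.EndRequest += (s, e) => Log("EndRequest", context);

        // These two may not fire, or may fire after headers have already been
        // flushed, outside the IIS 7 integrated pipeline or with buffering off.
        context.PreSendRequestHeaders += (s, e) => Log("PreSendRequestHeaders", context);
        context.PreSendRequestContent += (s, e) => Log("PreSendRequestContent", context);
    }

    public void Dispose() { }

    private static void Log(string stage, HttpApplication app)
    {
        System.Diagnostics.Debug.WriteLine(stage + ": status " + app.Response.StatusCode);
    }
}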
Related
I have inherited this code, which runs a 1-second jQuery AJAX polling loop on the client side. It used to rely heavily on cookies, and I am trying to change it to plain stateless HTTP at least, but now I have the following problem:
Every POST from the client is processed, and so are the first few GETs, but after a short while the server-side HttpHandler is not even called on GET requests, and the client-side success callbacks always get passed the same, non-updated, data.
Edit: since people tend to assume otherwise: I have stepped through the code with a debugger, so when I say "the handler is not called on GET requests" and "the client code success callbacks always get passed the same data" I mean that quite literally.
I figure this might be a problem of the Web Server caching responses to HTTP requests, but it's kind of a wild guess.
So I have a bunch of questions which might help me solve such problems in the future:
Is this a reasonable theory?
I would like to somehow get an overview of all the HTTP requests the server registers and how it chooses to process them.
Also, where and how would I go about configuring the server beyond the web.config, if for example I wanted to configure its caching behaviour?
It's the client-side cache that is causing this.
Set cache to false on your AJAX request:
$.ajax({
    url: "http://your.url.here",
    cache: false
})
.done(function(data) {
    // ...
});
More details here.
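If you prefer to fix this on the server side instead (or as well), the handler can mark its responses as non-cacheable. A rough sketch, assuming the handler implements IHttpHandler (the PollingHandler name and the JSON payload are placeholders):

using System.Web;

public class PollingHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Tell the browser and any intermediaries not to reuse this response.
        context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
        context.Response.Cache.SetNoStore();

        context.Response.ContentType = "application/json";
        context.Response.Write("{\"status\":\"ok\"}");  // placeholder payload
    }
}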
I'm trying to implement cache control in my application. I've set up a Tomcat filter for all the fonts that sets max-age=120.
When I request a font for the first time with the cache cleared, the call/response is the following:
As you can see, I get the max-age in the response. Now I would expect that if I hit refresh, the browser won't send the HTTP request again; instead, this is what happens:
As you can see the second request has a
cache-control: max-age=0
value, and the response is returned from the server cache. What I'm trying to achieve is to prevent the browser from making the call at all.
Am I doing something wrong?
Thanks
Hitting refresh has semantics that are dependent upon the browser you're using, but often it will make a conditional request to make sure the user is seeing a fresh response (because they wanted to refresh).
If you want to check cache operation, try navigating to the page, rather than hitting refresh.
OTOH if you don't want refresh to behave like this -- and you really mean it -- Mozilla is prototyping Cache-Control: immutable to do this (but it's early days, and Firefox-only for the moment).
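For what it's worth, once immutable is supported the response header would look something like this (combined with the question's existing max-age as an assumed example, not taken from the actual filter config):

Cache-Control: max-age=120, immutable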
I am working on a relatively complex ASP.NET Web Forms application, which loads user controls dynamically within update panels. I've run into a very peculiar problem with Internet Explorer where, after leaving the page idle for exactly one minute, you receive a Sys.WebForms.PageRequestManagerParserErrorException JavaScript exception when the next request is made. This doesn't happen in Firefox or Chrome. When the server receives the bad request, the body is actually empty but the headers are still there. The response that is sent back is a fresh response you would get from a GET request, which is not what the update panel script is expecting. Any requests done within a minute are okay. Also, any requests made following the bad request are okay as well.
I do not have any response writes or redirects being executed. I've also tried setting ValidateRequest and EnableEventValidation in the page directive. I've looked into various timeout properties.
The problem resided with how IE handles the NTLM authentication protocol. An optimization in IE that is not present in Chrome and Firefox strips the request body, which creates an unexpected response for my update panels. To solve this issue you must either allow anonymous requests in IIS when using NTLM, or ensure Kerberos is used instead. The KB article explains the issue and how to deal with it: KB251404.
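If it helps, enabling anonymous requests alongside Windows authentication in IIS 7+ is a configuration change along these lines (a sketch only; depending on how the server locks these sections, the change may have to be made in applicationHost.config or through the IIS manager rather than in the site's web.config):

<system.webServer>
  <security>
    <authentication>
      <anonymousAuthentication enabled="true" />
      <windowsAuthentication enabled="true" />
    </authentication>
  </security>
</system.webServer>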
I'm doing some diagnostic logging in the Page_Unload event in an ASP.NET application; this logging can take a fair bit of time (about 100ms). Will the response stream get held up by the code in the Page_Unload handler? I could do my work asynchronously using the thread pool, but I'd rather not if the synchronous version won't affect the client's response time.
More information:
@thorkia is correct in that the documentation says Page_Unload is called after the response is sent to the client, but in my testing (as advised by @steve) it does block. I've tried Cassini, IIS Express, and full IIS 7.5 (on a test server), with both release and debug builds, with and without a debugger attached. And, grasping at straws, I tried putting Async=true in the page directive. I've tried with Fiddler (streaming enabled) and without Fiddler, and with both IE9 and Firefox. If the documentation is "correct", then I wonder if it does send the response but perhaps doesn't "finish it off" (whatever that means; I'll need to check the HTTP spec), and so the page doesn't render in the browser? But my understanding was that a client browser starts to render the page as it receives the bytes, so this doesn't make sense to me either. I've also tried looking at the code in ILSpy, but I think that might take me a lot of time.
Now I'm intrigued; am I doing something wrong, or is the documentation misleading?
Why not try it?
protected void Page_UnLoad(object sender, EventArgs e)
{
    System.Diagnostics.Debug.WriteLine("In Page_UnLoad");
    System.Threading.Thread.Sleep(10000);
    System.Diagnostics.Debug.WriteLine("Leaving Page_UnLoad");
}
According to MSDN (http://msdn.microsoft.com/en-us/library/ms178472.aspx) the Page Unload stage is only called after the data has been sent to the client.
Taking a long time to do your logging and clean-up will not affect the client's response time for that request, but it could affect future requests if lots of pages are waiting to be unloaded.
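If the blocking does turn out to matter, the thread-pool route the asker mentions is a small change. A rough sketch (WriteDiagnostics is a hypothetical stand-in for the actual logging code):

protected void Page_Unload(object sender, EventArgs e)
{
    // Capture what you need from the request up front; the request context
    // may not be safely available on the worker thread later.
    string url = Request.RawUrl;

    System.Threading.ThreadPool.QueueUserWorkItem(_ => WriteDiagnostics(url));
}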
When analyzing traffic with a packet sniffer, we are seeing an HTTP response from a WebLogic server prior to the completion of the HTTP POST to that server.
In this case, the JSP page on the server is basically a static page, with no logic to do anything with the contents of the POST at this time.
But why would the server send the response prior to completion of the POST?
I found WebLogic documentation about how to configure the server to guard against a denial-of-service attack using HTTP POST. Maybe that is what is happening?
No one I know has seen this behaviour before. Maybe some WebLogic-savvy person will know what is going on.
Thanks
I don't think that WebLogic is analyzing the JSP to determine whether it is static or not.
My guess is that either
someone else was accessing the server at the same time, or
you saw the response to a previous request.
[EDIT] To determine what is going on, I suggest setting a breakpoint in the JSP. If you still get an answer without hitting the breakpoint, something further up the stack must be intercepting the request (for example, a cache).