What is the difference between kernel-mode caching and user-mode caching, and how can I track both?
Kernel-mode caching essentially handles caching requests at the OS level, so content stored there can be served without ever going down the rest of the usual pipeline (i.e. the request never has to reach the ASP.NET- or IIS-level caches to check for the content):
So the request hits the initial cache (http.sys), finds what it needs and sends it back, all without ever having to proceed further down the pipeline.
As a result, it's usually quite fast. A limitation, however, is that it does not support many user-level features such as authentication and authorization, so it may not fit all scenarios.
User-mode caching, on the other hand, fills in the gaps where kernel mode cannot be used, primarily around authorized/authenticated content (which requires a check to see whether the user can actually access the content), although there are many other scenarios that can cause the http.sys cache to be bypassed.
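As a rough illustration (assuming an ASP.NET MVC app, which isn't specified here; the controller name is hypothetical), output-cached content with no per-user variation and no authentication is the kind of response that can end up in the kernel-mode cache:

using System.Web.Mvc;
using System.Web.UI;

public class ReportsController : Controller
{
    // Publicly cacheable, not varied per user, and not behind authorization --
    // the combination that typically makes a response eligible for http.sys caching.
    [OutputCache(Duration = 600, VaryByParam = "none", Location = OutputCacheLocation.Any)]
    public ActionResult Index()
    {
        return View();
    }
}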
With regards to actually checking whether content is or is not being cached (and possibly why), you can use FREB (Failed Request Event Buffering, better known today as Failed Request Tracing). The following command can be used to find out which content is cached in kernel mode:
netsh http show cachestate
We have a .NET WPF container app in which we host several web apps using the CEFSharp.WinForms control. At times, we see that for some users, some JavaScript resource requests fail with the ERR_CONTENT_DECODING_FAILED error message. The issue gets resolved if we reload the app after either clearing the CEF cache or disabling the cache from the network tab in the developer tools window. Please note that this issue isn't confined to a specific subset of resource files: instead, we have seen it happen sporadically for a variety of JavaScript resource files (some hosted on Apache, others on IIS servers).
While the usual cause of an ERR_CONTENT_DECODING_FAILED error is a server-side content-encoding issue, in this specific case we believe it could be related to CEF browser caching. Please see the analysis section below for the reasons we believe so.
Application Setup
When we initialize CEF settings, we set the MultiThreadedMessageLoop setting to true and set the CachePath property to a location under %localappdata% on Windows 10 machines. When the container app starts, it creates three CEF web browser controls and launches web apps in them. All three apps load concurrently. After that, more CEF web browsers are created as the user visits more apps. The user also reloads some of these apps over time. All the web apps are internal apps sharing the same domain but physically hosted on different web servers. The JavaScript resource files in question usually have a caching policy that allows them to be cached for a week.
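For reference, the initialization is roughly equivalent to the sketch below (CefSharp 79 API; the cache folder names are illustrative, not our real paths):

using System;
using System.IO;
using CefSharp;

public static class CefBootstrap
{
    public static void Init()
    {
        var settings = new CefSettings
        {
            MultiThreadedMessageLoop = true,
            // Cache location under %localappdata%; folder names are hypothetical
            CachePath = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
                "ContainerApp", "CefCache")
        };
        Cef.Initialize(settings);
    }
}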
CEFSharp version - 79.1.360.0
CEF version - r79.1.36+g90301bd+chromium-79.0.3945.130
Chromium version - 79.0.3945.130
Our Analysis so far
We checked the web-server logs for the failing JavaScript resources. We observed that in most cases, the server requests for those resource files (by the impacted user) were made a few days ago. The users are usually able to use the application well for some days before they sporadically start getting this error.
We checked the network logs (*.HAR file). We see that for the failing JavaScript resource, _transferSize is 0 (which seems to indicate that the response was served from the cache).
When the error occurs, it gets resolved when we reload the app after either clearing the cache or disabling the cache from the network tab.
We tried artificially simulating this error. We used Fiddler's AutoResponder feature to deliberately send a bad server response (the content was 'gzip' encoded, however the Content-Encoding header indicated 'br'). We could simulate the ERR_CONTENT_DECODING_FAILED error; however, we could see in the network logs that _transferSize was a non-zero value. We also observed that Chrome did not cache the bad response. This test indicates that when the original JavaScript response was cached by the browser, it must have been a correctly encoded response, or else the browser would not have cached it.
All of the above points lead us to believe that the JavaScript resource files were downloaded (with correct encoding) and cached in the CEF cache. The user was also able to use the apps for some time. After that, however, in certain scenarios some of these files potentially got corrupted in the CEF cache, leading to the content decoding error.
We tried using the CEF response filter mechanism as explained here to capture the bad response when the content decoding error occurs. Unfortunately, we observed that the dataIn stream which gets passed to the Filter function is null when the response fails with this error.
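For completeness, the filter looked roughly like the sketch below (simplified from our diagnostic code; a real filter must also handle dataOut capacity and FilterStatus.NeedMoreData). The null check is where we hit the problem:

using System.IO;
using CefSharp;

public class CaptureResponseFilter : IResponseFilter
{
    private readonly MemoryStream captured = new MemoryStream();

    public bool InitFilter()
    {
        return true;
    }

    public FilterStatus Filter(Stream dataIn, out long dataInRead,
                               Stream dataOut, out long dataOutWritten)
    {
        if (dataIn == null) // observed: null when ERR_CONTENT_DECODING_FAILED occurs
        {
            dataInRead = 0;
            dataOutWritten = 0;
            return FilterStatus.Done;
        }

        var buffer = new byte[dataIn.Length];
        int read = dataIn.Read(buffer, 0, buffer.Length);
        captured.Write(buffer, 0, read);  // keep a copy for diagnostics
        dataOut.Write(buffer, 0, read);   // pass the response through unchanged
        dataInRead = read;
        dataOutWritten = read;
        return FilterStatus.Done;
    }

    public void Dispose()
    {
        captured.Dispose();
    }
}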
Summary and Questions
This is a sporadic issue which our users are facing. We haven't found a way to deterministically recreate this problem. However, based on our analysis so far, we believe some JavaScript files may be getting corrupted in the CEF cache over time. We are not sure whether the fact that we host several CEF web browsers and load them concurrently could be playing some role in causing this issue.
Has anyone else observed/reported a similar issue? Do you have any idea if we are missing or overlooking something here or going in the wrong direction? Any pointers will be greatly appreciated.
Using PerfMon, I can see that my ASP.NET Applications (Total)\Sessions Active is growing indefinitely to the tens of thousands, and I suspect this is causing a recent performance degradation we are observing.
The growth appears to be around a few dozen per minute.
We are using .NET 4.5 and IIS 7.5.
How can I get a sample of some details regarding these sessions using administrative tools? What could cause this? What next steps can I take to diagnose this odd behavior?
Place a hook on the Session_OnStart event (more on those events at MSDN: https://msdn.microsoft.com/en-us/library/ms178583.aspx).
From there you should examine and escalate depending on the situation.
First, simply place a breakpoint inside the event handler and do some normal browsing in your development environment. You can use incognito windows in Chrome to get anonymous visits for the sake of creating sessions. If you need to do this in production, then you should set up some sort of logging database (or leverage your existing one) to record the session requests (you can serialize them temporarily if you need all of the data).
Look at what the request path is for these sessions, and at some of the contextual data in general. If erroneous sessions are being created, you will see the handler fire for requests that should not start a session, and that should be immediately obvious. From there you can determine how to handle the extra paths or requests that are coming in.
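For example, a minimal logging hook (the Trace destination is just a placeholder for whatever logging store you use) might look like this:

// In Global.asax.cs
protected void Session_Start(object sender, EventArgs e)
{
    var request = HttpContext.Current.Request;
    System.Diagnostics.Trace.TraceInformation(
        "Session {0} started by {1} {2} (UA: {3})",
        Session.SessionID,
        request.HttpMethod,
        request.RawUrl,
        request.UserAgent);
}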
Since you tagged this asp.net, it is hard to tell exactly which version or framework you are using. However, in general I have noticed that many browsers will accidentally cause an extra session when requesting resources, especially the favicon.
It is highly recommended that you do not create sessions for favicons. In ASP.NET MVC you can do that by ignoring the route; the same approach also works for other extraneous resource requests. That is done in the global.asax.cs file like this:
routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
routes.IgnoreRoute("favicon.ico");
See if there is a place in your application where you can do this, in case the extra sessions are being created as a result of these types of requests.
With caching headers I can either make the client not check online for updates for a certain period of time, and/or check ETags on every request. What I do not know is whether I can do both: use the offline version first, but meanwhile, in the background, check for an update. If there is a new version, it would be used the next time the page is opened.
For a page that is completely static except for when the user changes it by themselves, this would be much more efficient than having to block checking the etag every time.
One workaround I thought of is using JavaScript: set headers to cache the page indefinitely and have some JavaScript make a request with an If-Modified-Since header or something similar, which could then dynamically change the page. The big issue with this is that it cannot invalidate the existing cache, so it would have to keep dynamically updating the page theoretically forever. I'd also prefer to keep it pure HTTP (or HTML, if there is some tag that can do this), but I cannot find any relevant hits online.
A related question mentions "the two rules of caching": never cache HTML and cache everything else forever. Just to be clear, I mean to cache the HTML. The whole purpose of the thing I am building is for it to be very fast on very slow connections (high latency, low throughput, like EDGE). Every roundtrip saved is a second or two shaved off of loading time.
Update: reading more caching resources, it seems the Vary: Cookie header might do the trick in my case. I would like to know if there is a more general solution, though; I haven't really dug into the Vary header yet, so I don't know whether that works.
Solution 1 (HTTP)
There is a Cache-Control extension, stale-while-revalidate, which describes exactly what you want.
When present in an HTTP response, the stale-while-revalidate Cache-Control extension indicates that caches MAY serve the response in which it appears after it becomes stale, up to the indicated number of seconds.

If a cached response is served stale due to the presence of this extension, the cache SHOULD attempt to revalidate it while still serving stale responses (i.e., without blocking).
cache-control: max-age=60,stale-while-revalidate=86400
When the browser first requests the page, the result is cached for 60s. During that 60s window, requests are answered from the cache without contacting the origin server. For the next 86400s, content is served from the cache while a fresh copy is simultaneously fetched from the origin server. Only once both periods (60s + 86400s) have expired will the cache stop serving the cached content and instead wait for fresh data from the origin server.
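If the origin is an ASP.NET app, for example (an assumption; any server that can set response headers will do), emitting the example header above is one line:

// Inside a page or handler; the values match the example header above
Response.AppendHeader("Cache-Control", "max-age=60, stale-while-revalidate=86400");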
This solution has only one drawback: I was not able to find any browser or intermediate cache which currently supports this Cache-Control extension.
Solution 2 (JavaScript)
Another solution is to use Service Workers, which can construct custom responses to requests. In combination with the Cache API, that is enough to provide the requested feature.
The problem is that this solution only works in browsers (not in intermediate caches or other HTTP clients), and not all browsers support Service Workers and the Cache API.
I have a web application that uses ASP.NET with "InProc" session handling. Normally, everything works fine, but a few hundred requests each day take significantly longer to run than normal. In the IIS logs, I can see that these pages (which usually require 2-5 seconds to run) are running for 20+ seconds.
I enabled Failed Request Tracing in Verbose mode, and found that the delay is happening in the AspNetSessionData section. In the example shown below, there was a 39-second gap between AspNetSessionDataBegin and AspNetSessionDataEnd.
I'm not sure what to do next. I can't find any reason for this delay, and I can't find any more logging features that could be enabled to tell me what's happening here. Does anyone know why this is happening, or have any suggestions for additional steps I can take to find the problem?
My app usually stores 1-5MB in session for each user, mostly cached data for searches. The server has plenty of available memory, and only runs about 50 users.
It could be caused by lock contention for the session state. Take a look at the last paragraph of MSDN's ASP.NET Session State Overview. See also K. Scott Allen's helpful post on this subject.
If a page is annotated with EnableSessionState="True" (or inherits the web.config default), then all requests for that page will acquire a write lock on the session state. All other requests that use session state -- even if they do not acquire a write lock -- are blocked until that request finishes.
If a page is annotated with EnableSessionState="ReadOnly", then the page will not acquire a write lock and so will not block other requests. (Though it may be blocked by another request holding the write lock.)
To eliminate this lock contention, you may want to implement your own [finer grained] locking around the HttpContext.Cache object or static WeakReferences. The latter is probably more efficient. (See pp. 118-122 of Ultra-Fast ASP.NET by Richard Kiessig.)
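A rough sketch of the cache-based approach (the class name and the 10-minute expiry are illustrative assumptions, not from the book):

using System;
using System.Collections.Concurrent;
using System.Web;
using System.Web.Caching;

public static class PerKeyCache
{
    // One gate object per cache key, so requests only contend when they
    // actually touch the same data (unlike the per-session write lock).
    private static readonly ConcurrentDictionary<string, object> Gates =
        new ConcurrentDictionary<string, object>();

    public static object GetOrAdd(string key, Func<object> factory)
    {
        var cached = HttpRuntime.Cache[key];
        if (cached != null)
            return cached;

        lock (Gates.GetOrAdd(key, _ => new object()))
        {
            cached = HttpRuntime.Cache[key];  // double-check inside the lock
            if (cached == null)
            {
                cached = factory();
                HttpRuntime.Cache.Insert(key, cached, null,
                    DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
            }
            return cached;
        }
    }
}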
There is a chance you are running up against the maximum amount of memory the application pool is allowed to consume, which causes a restart of the application pool (and that would account for the delay you are seeing when accessing the session). The amount of memory on the server doesn't determine how much memory ASP.NET can use; that is controlled by the memoryLimit property in machine.config, and in IIS 6.0 and later in IIS itself using the "Maximum memory used" property.

Beyond that, have you considered alternatives to each user holding 5 MB of session memory? That will not scale well at all and can cause a lot of issues under load. Might caching be a more effective solution? Do the searches take so long that you need to do this, or could the SQL/database setup be optimized to speed up your queries?
I would like to know how the browser executes/processes requests. I ask because knowing how it works will help me understand how to do better web programming that meets performance goals using browser features.
How do browsers download CSS, JS, and image files?
Do they download one resource at a time, or several in parallel?
How many parallel requests (connections) can a browser make?
What happens if a request is still executing on the server and the user clicks the stop button? Does execution run to completion and the response come back, or is the request aborted halfway through on the server side?
How is JS execution handled by the browser?
Please add helpful links/information if possible.
Thanks all!
Please consider splitting this up into multiple questions. Here is some relevant information:
A web browser, or any web client, that wants to retrieve an HTTP resource will construct a GET request. This contains information to route the request to the proper server, and information to tell the server which resource is being requested. A resource can be an HTML page, an image, a JavaScript file, or anything else.
When the browser receives an HTML page, the page may have links to other resources (for instance, image tags). These instruct the browser to make further requests.
Multiple resources may be downloaded in parallel. This can happen if your browser is attempting to load multiple pages at once (like in different tabs), or if the browser has received an HTML page that points it to several resources (as in the last point). From a single hostname, the HTTP 1.1 spec says that at most two resources should be downloaded in parallel (though this is just a guideline and cannot stop a browser from attempting to do otherwise).
JavaScript is interpreted by the browser, just like other scripting languages are interpreted by their respective engines.
In the usual way (e.g., http GET operation, etc.).
It's implementation-dependent; different browsers do it differently.
It's implementation-dependent; typically, though, no more than two at a time between the same two endpoints (e.g., one browser talking to one server). It may be more when retrieving from multiple servers. Other resources get queued and wait for a slot to open up. This limit is typically enforced by browsers, but may also be enforced by servers (so a browser with this limit lifted may still find that later requests sit waiting for a while as the server queues them).
It depends a lot on when they do that, what kind of server it is, etc.
In strict document order. The browser may download multiple script files simultaneously, but it will execute them in document order. This is very important. Further processing of the page may (and probably will) be held up waiting for a script to be downloaded and run. (IE supports the defer attribute on script tags, which lets you tell it that it can continue processing the page before it executes the script.)