ASP.NET: are aspx/ascx files accessed from disk on every request?

I googled forever, and I couldn't find an answer to this; the answer is either obvious (and I need more training) or it's buried deep in documentation (or not documented). Somebody must know this.
I've been arguing with somebody who insisted on caching some static files on an ASP.NET site, where I thought it's not necessary, for the simple fact that all other files that produce dynamic HTML are not cached (by default; let's ignore output caching for now; let's also ignore the caching mechanism that person had in mind [in-memory or out on the network]). In other words, why cache some XML file (regardless of how frequently it's accessed) when all aspx files are read from disk on every request that maps to them? If I'm right, by caching such static files very little would be gained (fewer disk-read operations), but more memory would be spent (if cached in memory) or more network operations would be incurred (if cached on an external machine). Does somebody know what in fact happens when an aspx file is [normally] requested? Thank you.

If I'm not mistaken, ASPX files are compiled at run-time, on first access. The page is compiled into a class deriving from Page, and that compiled assembly stays loaded in memory; subsequent requests to the same resource (ASPX page) are serviced by instantiating the compiled type rather than re-reading the .aspx from disk. So in essence, they are cached with respect to disk access.
Obviously the dynamic content is generated for every request, unless otherwise cached using output caching mechanisms.
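If you want to convince yourself of this, here's a minimal sketch (assuming the default dynamic-compilation model and a hypothetical page at ~/Default.aspx) that uses BuildManager to show the compiled type being served from the compilation cache on repeated lookups:

using System;
using System.Web.Compilation;

public static class CompilationCheck
{
    // Run this from inside the web application (e.g. a diagnostic page).
    public static bool PageTypeIsCached()
    {
        // First call triggers compilation of the .aspx (if not compiled yet).
        Type first = BuildManager.GetCompiledType("~/Default.aspx");
        // Second call returns the already-compiled type; the .aspx on disk
        // is not parsed or compiled again.
        Type second = BuildManager.GetCompiledType("~/Default.aspx");
        return ReferenceEquals(first, second); // true: same cached type
    }
}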
Regarding memory consumption vs disk access time, I have to say that from the performance standpoint it makes sense to store objects in memory rather than reading them from disk every time if they are used often. Disk access is several orders of magnitude slower than access to RAM. Although inappropriate caching strategies could push frequently used objects out of memory to make room for seldom-used objects, which would hurt performance for obvious reasons. That being said, caching is really important for a high-performance website or web application.
As an update, consider this:
Typical DRAM access times are between 50 and 200 nanoseconds.
Average disk access times are in the range of 10 to 20 milliseconds.
That means that without caching, a hit against disk will be roughly five orders of magnitude slower than accessing RAM (10 ms = 10,000,000 ns, so 10,000,000 / 100 ≈ 100,000 times slower). Of course, the operating system, the hard drive, and possibly other components in between may do some caching of their own, so the slow-down may only occur on the first hit if you only have a couple of such files you're reading from.
Finally, the only way to be certain is to do some benchmarking. Stress-test both implementations and choose the version that works best in your case!
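If you want a quick way to run such a comparison, here's a minimal sketch (the file path and iteration count are placeholders); note that, as mentioned above, the OS file cache will blunt the difference on repeated reads of the same file:

using System;
using System.Diagnostics;
using System.IO;

class DiskVsMemoryBenchmark
{
    static void Main()
    {
        const string path = @"C:\temp\settings.xml"; // placeholder file
        const int iterations = 10000;

        // Variant 1: go through the file system (or at least the
        // OS file cache) on every iteration.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            byte[] data = File.ReadAllBytes(path);
        }
        sw.Stop();
        Console.WriteLine("File.ReadAllBytes: {0} ms", sw.ElapsedMilliseconds);

        // Variant 2: read once, then serve copies from an in-memory cache.
        byte[] cached = File.ReadAllBytes(path);
        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            byte[] data = new byte[cached.Length];
            Buffer.BlockCopy(cached, 0, data, 0, cached.Length);
        }
        sw.Stop();
        Console.WriteLine("In-memory copy:    {0} ms", sw.ElapsedMilliseconds);
    }
}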

IIS does a large amount of caching, so directly, no. But IIS checks for ANY changes in the web directory and reloads any changed files as they change. Sometimes IIS gets borked and you have to restart it to detect changes, but usually it works pretty well.
P.S. The caching mechanisms may flush data frequently based on server usage, but the caching works for all files in the web directory. Any detected change to source code causes IIS to flush the web application and re-compile/re-load as well.
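On that note, if the original poster does end up caching that static XML file, the ASP.NET cache can piggyback on the same change-notification idea: a CacheDependency evicts the cached copy whenever the file changes on disk. A minimal sketch (the virtual path and key scheme are placeholders):

using System.IO;
using System.Web;
using System.Web.Caching;

public static class StaticFileCache
{
    public static byte[] GetFile(HttpContext context, string virtualPath)
    {
        string key = "file:" + virtualPath;
        byte[] data = context.Cache[key] as byte[];
        if (data == null)
        {
            string physicalPath = context.Server.MapPath(virtualPath);
            data = File.ReadAllBytes(physicalPath);
            // The CacheDependency evicts the entry automatically if the
            // file changes on disk, mirroring the change-notification
            // behaviour described above.
            context.Cache.Insert(key, data, new CacheDependency(physicalPath));
        }
        return data;
    }
}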

I believe that the answer to your question depends on both the version of IIS you're using, and configuration settings.
But I believe that it's possible to configure some combinations of IIS/.NET to avoid checking the files - there's an option to pre-compile sites (e.g. with the aspnet_compiler tool), so no source code actually needs to be deployed to the web server.

Related

Caching image handler output strategies

Firstly I appreciate that this question could be seen as subjective but I strongly believe that there should be and probably is a definitive answer to my question.
At work we are currently implementing a strategy for dynamically resizing and serving images using a generic handler and the question of caching has become something of a contentious issue.
In my original implementation the resized image is cached in memory with a cache dependency based on the original image.
e.g.

using (MemoryStream ms = new MemoryStream())
{
    imageEditor.Image.Save(ms, imageFormat);
    byte[] buffer = ms.ToArray();

    // Add the resized image to the cache, with a dependency on the
    // original file so the entry is evicted if the original changes.
    context.Cache.Insert(key,
        buffer,
        new System.Web.Caching.CacheDependency(path));

    imageEditor.Dispose();

    // Set the context headers and serve.
    SetHeaders(ms.GetHashCode(), context, responseType);
    context.Response.BinaryWrite(buffer);
}
This has its downsides, though.
Every time the Application Pool Worker Process is recycled (every 1740 minutes, i.e. 29 hours, by default) we'll lose anything that is in the cache.
If there are a lot of images we could be in danger of overloading the system memory and causing an out of memory exception. (Does IIS prevent this by recycling the App pool if the usage hits a certain level?)
One of my colleagues has suggested that we implement a file caching system instead, one that saves the resized file to disk and serves that file on subsequent requests, which should (I don't know the intricacies of operating system IO caching or memory management) reduce memory usage. Whilst this would allow us to persist the resized image across recycles, I see a few major problems with that approach:
1. We cannot track the original file any more, so if someone uploads a new image of the same name our resized images will be incorrect.
2. The fileserver becomes polluted over time with thousands of images.
3. Reading a file is slow compared to reading from memory, especially if you are reading multiple files.
What would be the best overall approach? Is there a standard defined somewhere by Microsoft? The sites we build are generally very busy so we'd really like to get this right and to the best possible standard.
We have a similar system on my website. Regarding your objections to the file caching system:
1,2) On my site I have a single class through which all file saving/loading passes. You could implement something similar that clears all the cached, resized images whenever a user uploads a new image; if you name the files in a predictable manner this isn't hard to do (see the sketch after point 3). If storage space is a concern for you, you could also implement something to remove all cached images whose last access date is too old.
3) This depends on how your sites work. My site has a vast number of images, so it isn't feasible to store them all in memory. If your site has fewer images, the in-memory solution might be the better one.
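Here's a minimal sketch of points 1 and 2, assuming a hypothetical cache folder and a predictable "<name>_<width>x<height>.jpg" naming scheme (both are placeholders, not anything from the question):

using System.IO;

public static class ResizedImageCache
{
    // Placeholder cache folder; adjust to your deployment.
    private const string CacheFolder = @"C:\ImageCache";

    // Derive a predictable cache path from the original name and size.
    public static string GetCachePath(string originalName, int width, int height)
    {
        string name = Path.GetFileNameWithoutExtension(originalName);
        return Path.Combine(CacheFolder,
            string.Format("{0}_{1}x{2}.jpg", name, width, height));
    }

    // Call this from the single save/load class whenever a new original
    // is uploaded: it deletes every cached size variant of that image.
    public static void InvalidateFor(string originalName)
    {
        string name = Path.GetFileNameWithoutExtension(originalName);
        foreach (string file in Directory.GetFiles(CacheFolder, name + "_*"))
        {
            File.Delete(file);
        }
    }
}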
This is not a complete answer to your question, but you shouldn't be able to cause an out-of-memory exception by putting too much stuff into the cache. If system memory starts running low, the application cache will automatically start removing the unimportant and seldom-used items in order to avoid causing any memory issues.
You may also use ASP.NET Generated Image; there's an interesting article by Scott Hanselman - ASP.NET Futures - Generating Dynamic Images with HttpHandlers gets Easier.

Checklist for ASP.NET / Database performance

Recently our customers started to complain about poor performance on one of our servers.
This server hosts multiple large CMS implementations and a lot of small websites using Sitefinity.
Our hosting team is now trying to find the bottlenecks in our environments, since there are some major issues with load times. I've been given the task of specifying one big list of things to look out for, divided into different parts (IIS, ASP.NET, web-specific).
I think it'd be good to find out how many instances of the Sitecore CMS we can run on one server according to the Sitecore documentation, etc. We want to be able to monitor and find out where our bottleneck is at this point. Some of our websites load terribly slowly, other websites load very fast. Most of our Sitecore implementations that run on this server have poor back-end performance, and have terrible load times after a compilation.
Our Sitecore solutions run on a Windows Server 2008 64-bit server with Microsoft SQL Server 2008 for the databases.
I understand that it might be handy to specify more detailed information about our setup, but I'm hoping we can get some useful basic information regarding how to monitor and find bottlenecks, etc.
What tools / hints / tips & tricks do you have?
Do NOT use too many different ASP.NET pools (created as "dedicated pool" in Plesk). Place more sites in the same pool.
Add more memory, or stop unused programs/services on the server.
Check whether you have memory limits on the application pool that cause the pool to auto-restart continually.
On the database, set Recovery Mode to Simple.
Shrink the database files, and reindex the database, from inside the management tools.
After all that, defragment your disks.
Check memory usage with Process Explorer.
To check what starts with your server, use Autoruns, but be careful not to stop any critical service or the computer may never start again. Do not stop services from Autoruns; use the Service Manager to change their startup type to Manual instead. Also, many SQL Server services do not need to run if you never use them.
Some other tips:
Move the temporary files (and maybe the ASP.NET build directory) to a different disk.
Delete all files from the temporary dir (cd %temp%).
Make sure the free physical memory is not near zero, using Process Explorer. If it is, then your server needs memory, or you need to stop unused programs from running.
To place many sites under the same pool, you need to change the permissions of the sites under the new shared pool. It's not difficult; it just takes some time and organization to know which site runs under which pool. Now let's say you have 10 sites: it's better to use 2 different pools and spread the sites across these pools based on the load of each site.
There is no immediate answer to Sitecore performance tuning. But here are some vital tips:
1) CACHING
Caching is everything. The default Sitecore cache parameters are rarely correct for any application. If you have lots of memory, you should increase the cache sizes:
http://learnsitecore.cmsuniverse.net/en/Developers/Articles/2009/07/CachingOverview.aspx
http://sitecorebasics.wordpress.com/2011/03/05/sitecore-caching/
http://blog.wojciech.org/?p=9
Unfortunately this is something the developer should be aware of when deploying an installation, not something the system admin should care about...
2) DATABASE
The database is the last bottleneck to check. I rarely touch the database. However, the DB performance can be increased with the proper settings:
Database properties that improves performance:
http://www.theclientview.net/?p=162
This article on index fragmentation is very helpful:
http://www.theclientview.net/?p=40
I can't speak for Sitefinity, but here are some tips for Sitecore.
Use Sitecore's caching whenever possible, especially on XSLTs (as they tend to be simpler than layouts & sublayouts, so Sitecore caching doesn't break them the way it breaks ASP.NET postbacks). This of course will only help if renderings & sublayouts etc. are accessed a lot. Use /sitecore/admin/stats.aspx?site=website to check what isn't cached.
Use Sitecore's profiler: open up an item in the profiler and see which sublayouts etc. are taking time.
Only use XSLTs for the simplest content; if it gets any more complicated than that, I'd go for sublayouts (ASP.NET controls). This is a bit biased as I'm not fond of XSLT, but experience indicates that .ascx files are faster.
Use IIS's content expiration on the static files (probably all of /sitecore, plus your images, JavaScript & CSS files); this is for IIS 6: msdn link
Check database access times with Sitecore's DatabaseTest.aspx (the one for Sitecore 6 is a lot better than the simple one that works on Sitecore 5 & 6): Sitecore SDN link
And that's what I can think of off the top of my head.
Sitecore has a major flaw: it uses GUIDs for primary keys (amongst other poorly chosen data types). This fragments the tables from the first insert, and if you have a heavily utilised Sitecore database the fragmentation can be greater than 90% within an hour. This is not a well-designed database, and I recommend looking at other products until they fix this; it is causing us a major performance headache (time and money).
We are at a standstill: we cannot add any more RAM and cannot rebuild the indexes more often.
Also, set your IIS to recycle the app pool ONLY once a day at a specific time. I usually set mine for 3am. This way the application never goes to sleep, recycles, etc. at unpredictable times. It's best to reduce spin-up times.
Additionally, configure IIS to 'always running' instead of 'on startup'. This way, when the application restarts, it recompiles immediately and, again, is ready to roar.
Sitefinity is really a fantastic piece of software (hopefully my tips above get the thumbs up, and not my endorsement of the product). haha

Which Has Better Performance - Configuration In AppSettings or Database?

I'm looking for feedback as to where I should store application configuration data and default text values that will have the best performance overall. For example, I've been sticking stuff like default URL values and the default text for error messages or application instructions inside the web.config, but now I'm wondering if that will scale... and if it's even the right thing to do in the first place.
As mentioned before, this really shouldn't matter: the settings, be they in the web.config or in a database, should be read once and then cached.
I can almost guarantee that there will be other parts of your code much slower than this.
As a side note, and not performance related, but if you need to worry about site uptime, you can edit configuration in a database on the fly, but changing the web.config will cause an appdomain restart and subsequent loss of sessions.
If it's a single-server setup (as opposed to a web farm), store it in the web.config file. There is a new release of the Web Farm Framework, and you can check details for that type of scenario here:
http://weblogs.asp.net/scottgu/archive/2010/09/08/introducing-the-microsoft-web-farm-framework.aspx
If the database is on the same machine the performance difference may be negligible. If you need to cross a wire to get to some database, it's most likely slower than a directly accessible and not-too-large web.config.
I really would prefer keeping things simple here: just use the web.config. It probably is already getting cached in memory by some system component. If you feel it's too slow, measure it, and only then go for a more complicated solution.
Having said that, you could simply read in all configuration at application start-up and hold it in memory. This way any performance difference is reduced to just the application's start-up time. You also get the added benefit of being able to validate the configuration files at start-up.
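A minimal sketch of that read-once-at-start-up approach, assuming plain appSettings entries (the "DefaultUrl" key and the validation rule are placeholders):

using System;
using System.Collections.Generic;
using System.Configuration;

public static class AppConfig
{
    // Read once at start-up (the static initializer runs on first touch).
    private static readonly Dictionary<string, string> Settings = Load();

    private static Dictionary<string, string> Load()
    {
        var settings = new Dictionary<string, string>();
        foreach (string key in ConfigurationManager.AppSettings.AllKeys)
        {
            settings[key] = ConfigurationManager.AppSettings[key];
        }
        // Validate up front so a bad deployment fails fast.
        if (!settings.ContainsKey("DefaultUrl")) // placeholder key
            throw new ConfigurationErrorsException("DefaultUrl is missing.");
        return settings;
    }

    public static string Get(string key)
    {
        return Settings[key]; // served from memory; no file access per request
    }
}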
Not sure about your defaults, but just ask yourself whether they are environment-specific, and whether they are really something you want to change without recompilation. Being configuration files, what is the use case? Is some operations guy or a developer going to set these defaults? Are you planning to document them? Are you going to make your application robust against whatever is configured?
Are these defaults going to be used on multiple environments/installations (for localization, for instance)? Then perhaps it's better to use a different and separate file which you can redeploy separately when needed.
How often are the settings going to change?
If they never change, you can probably put them in web.config, and configure IIS so that it doesn't monitor the file for changes. That will impose a small startup penalty, but after that there should be no performance penalty.
But honestly, there are probably dozens of other places to improve before you start worrying about this - remember, "Premature Optimization is the root of all evil" :)

Can you solve my odd Sharepoint CSS cache / customising problem?

I have a weird situation with my sharepoint css.
It is deployed as part of a .wsp solution and up until now everything has been fine.
The farm it deploys to has a couple of web front ends, a single apps server, and a SQL box.
The symptom is that if I deploy the solution, then use a web browser to view the page, it has no styles, and if I access the .css directly I see only the first 100 or so bytes of the .css.
However, if I go into SharePoint Designer and look at the file, it looks fine, and if I check it out and publish it (customising the file but not actually changing anything in it) then the website works fine and the CSS downloads completely.
There is some fairly complex caching on the servers: disk-based and object caches. As far as I can tell I have cleared these (and an iisreset should clear them anyway... shouldn't it?)
I have used this tool to clear the blobcache from the whole farm http://blobcachefarmflush.codeplex.com/
The problem you're describing is one I've encountered before. Let me share what I know, what I suspect, and how I'd go about troubleshooting your scenario.
First off, it sounds like you suspect caching as a potential problem source. In the case of the MOSS publishing feature set, you really have three different cache mechanisms in operation: the object cache, the BLOB cache, and the page output cache. The only mechanism that should be in-play, assuming it's turned on with default settings, is the BLOB cache. Neither the object cache nor the page output cache should be touching stand-alone stylesheets like you have.
You've tried flushing the cache using the farm-level BLOB cache flush feature, and that will instruct MOSS to dump all BLOB cache data. You can verify this by reviewing the file system to ensure that only the three .bin folders remain following a flush.
To your specific question regarding an IISRESET: no, an IISRESET actually won't clear the BLOB cache. The contents of the BLOB cache persist beyond the life of the application pool servicing the web application. You either need to use a feature to clear out the cache (as you have been), or perform a manual file delete. I don't recommend the latter unless you absolutely have no other course of action. If you do elect to go the manual route to try it, ensure that you shut down the W3SVC service before deleting files out of the file system. If you don't, the actual file deletion process can get into a race condition with cache repopulation and lead to corruption. After you've deleted the files with a stopped W3SVC, you can start the W3SVC back up again.
For more information on the internals of the BLOB cache and how it operates, I'll point you to a blog article of mine: http://sharepointinterface.com/2009/06/18/we-drift-deeper-into-the-sound-as-the-flush-comes/
To see if the BLOB cache is a factor in the behavior you're seeing, you can modify the web.config for your web application(s) and adjust the file pattern to remove CSS from the list of file types in the <BlobCache> element and then restart IIS (or at least recycle the app pool).
Another possibility, based on experience, is that you're seeing something other than BLOB cache abnormalities. The key observation for me is that a direct request for the CSS stylesheet returns only the first 100 bytes or so.
Do you, by any chance, have any intelligent network hardware (that is, intrusion detection hardware or anything that might be performing application/layer-7 filtering) between the WFE and you, the caller? Intrusion detection and IPS systems are the source of many of the types of problems you're seeing, and they're one of my first stops whenever I see "oddball" behavior like you're describing. In the case of one of my clients, I saw a problem meeting your description (CSS and JS files getting truncated) due to an intervening Juniper firewall with active IPS. Turning off IPS (to test) cleared things up immediately. After that, the networking team sought an update from Juniper to correct the issue to ensure that IPS could remain active.
Try turning off BLOB caching (or removing the CSS extension from the file pattern) to see if that makes a difference. If not, talk to your network team to see if something is happening to the response stream coming back to you. That's where I'd start; hopefully, one of those two things will do the trick for you.
Small side note: if you have a free moment and are up to it, I'd like to hear about your experience with the BlobCacheFarmFlush solution you pulled down from CodePlex. I authored it, and I'd love to hear your thoughts -- good or bad :-)
Sean (sean#sharepointinterface.com)

Can I use the ASP.NET 'OutputCache' control to cache images without a performance hit?

I have some ASP.NET MVC actions that generate images dynamically (although it could equally be an ASPX page).
I'm using [OutputCache] to cache these images. I'm just wondering if I need to worry about ASP.NET caching images in memory and taking up too many resources. These are product images of varying sizes for a shopping cart containing only a couple dozen products.
Will OutputCache use disk or just memory? How intelligent is it? Or should I just save the images to disk myself and implement my own caching system (which is actually the current implementation)?
For all intents and purposes, I believe that the output cache is completely in-memory - meaning that if the app pool is recycled, the image will need to be generated again.
I've had to do something similar in the past, and I actually implemented a two-tiered system that used the HTTP cache primarily, and used the filesystem as a fallback. If something didn't exist, I generated the image and saved it to disk AND put it in the cache. That way if it gets pushed out of the cache or the app pool recycles, I only have to load it off the disk (it appears you've done the same).
As far as "too much memory", if you explicitly use HttpContext.Cache instead of [OutputCache], you can control the priority of the item in the cache. You can then tweak the settings on your app pool to control how much memory it uses overall, but I'm not sure there's a whole lot to be done other than that. A couple images * 12 products doesn't seem like it would take up a whole lot of memory to me though.
Without knowing anything else about your application, it sounds to me like you could get away with just using the outputcache. However, if you need something more robust and scalable, I'd use the two-tiered system I described. Though, if you've already got that implemented and working, "if it ain't broke..."
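For completeness, here's a minimal sketch of the two-tiered approach described above, with a hypothetical GenerateImage step standing in for the real image generation (the cache folder and method names are placeholders, not the answerer's actual code):

using System.IO;
using System.Web;
using System.Web.Caching;

public static class TwoTierImageCache
{
    private const string CacheFolder = @"C:\ImageCache"; // placeholder

    public static byte[] GetImage(HttpContext context, string productId)
    {
        string key = "img:" + productId;

        // Tier 1: in-memory cache.
        byte[] data = context.Cache[key] as byte[];
        if (data != null) return data;

        // Tier 2: disk fallback (survives app pool recycles).
        string path = Path.Combine(CacheFolder, productId + ".jpg");
        if (File.Exists(path))
        {
            data = File.ReadAllBytes(path);
        }
        else
        {
            data = GenerateImage(productId); // hypothetical generator
            File.WriteAllBytes(path, data);
        }

        // Re-populate the memory tier; Low priority lets ASP.NET
        // evict it first under memory pressure.
        context.Cache.Insert(key, data, null,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
            CacheItemPriority.Low, null);
        return data;
    }

    private static byte[] GenerateImage(string productId)
    {
        // Placeholder for the dynamic image generation step.
        return new byte[0];
    }
}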
