To reduce the number of API calls we're making on our website, I built in a cache using ASP.Net's caching library.
The cache is currently set for 30 minutes.
The problem we're running into is that the download links seem to expire before then, or hit some kind of maximum, and users are getting a page that tells them they don't have access to the download.
Now, if I reset the cache, it works.
So I'm looking for help / advice on how to better handle this.
1) I could just not cache anything and have each page load make an individual API call, but that seems like unnecessary overhead.
2) I could reduce the cache lifetime.
3) I could build a separate service on our end that goes around the cache when the user clicks to download.
I'm a fan of #2 and #3, but I wondered if someone else could help offer some suggestions.
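To make #3 concrete, here is a rough sketch of what I have in mind (ApiClient and ProductInfo are just stand-ins for our real API wrapper, and I'm illustrating with System.Runtime.Caching): general lookups keep their 30-minute cache, but the download link is fetched fresh on every click so it can never be stale.

```csharp
using System;
using System.Runtime.Caching;

// Stand-ins for our real API wrapper and its return type.
public class ProductInfo
{
    public string Name { get; set; }
}

public static class ApiClient
{
    public static ProductInfo FetchProductInfo(string productId)
    {
        return new ProductInfo { Name = productId }; // placeholder
    }

    public static string FetchDownloadLink(string productId)
    {
        return "https://example.com/download/" + productId; // placeholder
    }
}

public class DownloadLinkProvider
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // General product data still goes through the 30-minute cache.
    public ProductInfo GetProductInfo(string productId)
    {
        var cached = Cache.Get(productId) as ProductInfo;
        if (cached != null)
            return cached;

        var info = ApiClient.FetchProductInfo(productId);
        Cache.Add(productId, info, DateTimeOffset.UtcNow.AddMinutes(30));
        return info;
    }

    // Download links bypass the cache entirely, so the user always
    // gets a link that hasn't expired on the provider's side.
    public string GetDownloadLink(string productId)
    {
        return ApiClient.FetchDownloadLink(productId);
    }
}
```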
Thanks!
-Eric
Related
I have set up Caching for my website, which expires after an hour.
So my problem is: if the cache is empty and multiple users access the website at the same time, I would like to avoid all of them making the same request simultaneously, as this pushes CPU usage to 100% for an extended period.
I am using System.Runtime.Caching.MemoryCache in an ASP.NET MVC application.
I have thought of a solution but I am not sure how best to implement it. My thinking is:
The first user to come in starts the request and sets a flag to say data is being fetched. Any users who arrive after that see that no cache has been set, but before starting their own request they check the flag to see whether a request has already been triggered. If it has, the application should wait until the response has come back (is this possible?) and then use the response from the cache.
This way only one request is sent, the response from the service comes back sooner, and CPU usage stays low.
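In code, what I mean is something like this double-checked lock (LoadDataFromService is a placeholder for the real, expensive request):

```csharp
using System;
using System.Runtime.Caching;

public static class CachedDataProvider
{
    private static readonly object SyncRoot = new object();
    private const string CacheKey = "expensive-data";

    public static string GetData()
    {
        // Fast path: cache already populated.
        var cached = MemoryCache.Default.Get(CacheKey) as string;
        if (cached != null)
            return cached;

        // Everyone who arrives while the first request is running
        // blocks here instead of firing a duplicate request.
        lock (SyncRoot)
        {
            // Re-check: another thread may have filled the cache
            // while we were waiting for the lock.
            cached = MemoryCache.Default.Get(CacheKey) as string;
            if (cached != null)
                return cached;

            var data = LoadDataFromService();
            MemoryCache.Default.Add(CacheKey, data,
                DateTimeOffset.UtcNow.AddHours(1));
            return data;
        }
    }

    private static string LoadDataFromService()
    {
        // Placeholder for the real, expensive call.
        return "data";
    }
}
```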
Please do suggest an alternative to this; my idea could be wrong.
Can someone please advise?
Thanks
I'm working on a website with a fairly good-sized database, parts of which get stored in memory. Because I don't yet understand how to do it otherwise, when the page is accessed the database is read and certain data is stored in memory for faster performance while the user is on the page. What I don't like about this is the delay during the initial rendering of the page.
I'm sure there is a way for the data to be maintained in memory when that particular page is not loaded, but I don't know how to do it and I have not had any success yet using Google University (search). Can anyone recommend a link or search terms that will help me find out what I need to know?
Thanks all.
Store it in the Cache. You're still going to run into first-load slowness as the data is read from the DB into the Cache, but that's impossible to avoid. You can put the load in the Application_Start event to guarantee it happens once, when the app first starts, rather than on a user's first request.
http://msdn.microsoft.com/en-us/library/dd997357.aspx
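A minimal sketch of that in Global.asax.cs, assuming LoadLookupData stands in for the real database read. Note HttpRuntime.Cache rather than HttpContext.Current.Cache, since the request context isn't reliably available inside Application_Start:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Do the expensive read once, at application startup,
        // instead of during a user's first page request.
        HttpRuntime.Cache.Insert(
            "lookup-data",
            LoadLookupData(),
            null,                        // no cache dependency
            Cache.NoAbsoluteExpiration,  // keep until the app recycles
            Cache.NoSlidingExpiration);  // (or memory pressure evicts it)
    }

    private static object LoadLookupData()
    {
        // Placeholder for the real database query.
        return new object();
    }
}
```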
I am just getting into the more intricate parts of web development. This may not be the best place to ask, but: when is it best to add load balancing to a web project? I understand that good design versus bad design affects how many users a site can take before performance REALLY suffers. However, I am planning to code a new project that could potentially have a lot of users, and I wondered if I should be thinking about load balancing right off the bat. Opinions welcome; thanks in advance!
I should note also that the project will most likely be ASP.NET (WebForms or MVC, not yet decided) with a backend of MongoDB or PostgreSQL (again, still deciding).
Load balancing can also be a form of high availability. What if your web server goes down? It can take a long time to replace it.
Generally, by the time you need to think about throughput you are already rich, because you have an enormous number of users.
Stack Overflow serves around 10m unique users a month with a few servers (6 or so). Think about how many requests per day you would have if you were constantly generating 10 HTTP responses per second for 8 peak hours: 10 * 3600 * 8 = 288,000 page impressions per day. You won't have that many users any time soon.
And if you do, you optimize your app to 20 requests per second per CPU core, which means you get about 80 requests per second out of a four-core commodity server. That is a lot.
Adding a load balancer later is usually easy. LBs can tag each user with a cookie so they get pinned to one particular target. Your app will not notice the difference. Usually.
Is this for an e-commerce site? If so, then the real question to ask is "for every hour that the site is down, how much money are you losing?" If that number is substantial, then I would make load balancing a priority.
One of the more important architecture decisions that I have seen affect this is the use of session variables. You need to be able to provide a seamless experience if your user ends up on different servers during their visit. Session variables won't transfer from server to server, so I would avoid using them.
I support a solution like this at work. We run four (used to be eight) .NET e-commerce websites on three Windows 2k8 servers (backed by two primary/secondary SQL Server 2008 databases), taking somewhere around 1300 (combined) orders per day. Each site is load-balanced, and kept "in the farm" by a keep-alive. The nice thing about this, is that we can take one server down for maintenance without the users really noticing anything. When we bring it back, we re-enable our replication service and our changes get pushed out to the other two servers fairly quickly.
So yes, I would recommend giving a solution like that some thought.
The parameters here that can affect one another and slow down performance are:
Bandwidth
Processing
Synchronization
These have to do with how many users you have, together with the media you want to serve.
So if you have a lot of video/files to deliver, you need many servers to deliver them. Let's say that you do not; the next things to check are the users and the processing.
From my experience, what slows down processing is the locking of the session. So one big step to speed up processing is to implement fully custom session handling, so that your pages do not lock one another and you can handle many users without issue.
Now for the next step: let's say you have a database that keeps all the data. To gain from load balancing across many machines, the trick is to keep a local cache of what you are going to show.
So the first idea is to avoid excessive locking that makes users wait on one another, and the second is to have a local cache on each machine that is kept up to date from the main database.
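If fully custom session handling is more than you need, a smaller version of the same idea (a sketch, assuming WebForms) is to let pages that only read session state opt out of the exclusive session lock:

```aspx
<%-- Pages that never write to Session can declare read-only session
     state, so their requests no longer queue behind one another. --%>
<%@ Page Language="C#" EnableSessionState="ReadOnly" %>
```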
ref:
Web app blocked while processing another web app on sharing same session
Replacing ASP.Net's session entirely
call aspx page to return an image randomly slow
Always online
One more point is that you can build a solution in a "one server for all, and all for one" :) style, where you can use the extra servers for backup as well. If one server goes offline for any reason (e.g. for an update and restart), the rest can still work and serve.
As you said, it depends when, if ever, load balancing should be introduced. It depends on performance and how many users you want to serve. LB also improves the reliability of your app: it will not stop when one system goes crashing down. If you can see your project growing to be really big and serving lots of users, I would suggest designing your application so it can be upgraded to LB later, so do not do anything non-standard. Try to steer away from home-made solutions and always follow good practice. If you really need LB later on, it should not require changes to your app.
UPDATE
You may need to think ahead, but not at the cost of complicating your application too much. Do not get paranoid and prepare everything to work lightning fast "just in case". For example, do not worry about sessions: session management can be easily moved to SQL Server at any time, and this is the way to go with LB. Caching will also help if you hit bottlenecks in the future, but you do not need to implement it straight away; good design (stable interfaces), separation and decoupling will allow the cache to be added later on. So again: stick to good practices, do not close doors, but also do not open all of them straight away.
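To illustrate, moving sessions to SQL Server is mostly a web.config change once the session schema has been installed (aspnet_regsql.exe -ssadd); the connection string below is a placeholder:

```xml
<system.web>
  <!-- Sessions now live in SQL Server, so they survive app recycles
       and are shared by every server behind the load balancer. -->
  <sessionState mode="SQLServer"
                sqlConnectionString="Data Source=YourDbServer;Integrated Security=True"
                timeout="20" />
</system.web>
```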
You may find this article interesting.
I've been working on a C# ASP.Net application that requires images to be customized by users. The images aren't very large, and so they are being stored in a database.
To facilitate loading them onto the pages, a single ASPX page has been created that, depending on what is posted to it, loads a different image from the database.
The problem I've been seeing is that if a single page makes multiple requests (usually more than 4), then each request starts incurring a half-second delay in the response.
I've added extra logging and run it through a performance analyzer, and I have not been able to find the source of the half-second delays.
Question is:
What is this delay and how can I get rid of it?
-OR-
What is a better way of doing what I am trying to do that would avoid this entirely?
You're probably hitting a session lock. Disable the session if possible for these concurrent requests. For more information see:
ASP.NET MVC and Ajax, concurrent requests?
Underpinnings of the Session State implementation in ASP.NET
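As a hedged sketch of the fix (GetImageBytes is a placeholder for your database read): serve the images from an IHttpHandler that does not implement IRequiresSessionState, so no session lock is taken and the requests stop queuing behind one another. If you keep the ASPX page instead, EnableSessionState="ReadOnly" in the @ Page directive has a similar effect.

```csharp
using System.Web;

// Register as e.g. image.ashx; because it does not implement
// IRequiresSessionState, requests are not serialized by the session lock.
public class ImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        int id;
        int.TryParse(context.Request.QueryString["id"], out id);

        context.Response.ContentType = "image/png"; // adjust to the stored type
        context.Response.BinaryWrite(GetImageBytes(id));
    }

    private static byte[] GetImageBytes(int id)
    {
        // Placeholder for the real database lookup.
        return new byte[0];
    }
}
```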
What "performance analyzer" are you referring to? Are you profiling your app? A profiler should tell you exactly where the time is going.
I have an asp.net web site with 10-25k visitors a day (peaks of over 60k before holidays). Pages/visit is also high, since it's a content site.
I have a few specific pages which generate about 60% of the traffic. These pages are a bit complex and are DB heavy (sql server 2008 r2 backend).
I was wondering if it's worth "caching" a static version of these pages (I hear this is possible) and only re-rendering them when something changes (about once every 48 hours).
Does this sound like a good idea? Where would be the best place to implement this?
(asp.net, iis, db)
Update: It looks like a good option for me is OutputCache with SqlDependency. I see references to some kind of SQL Server notification for invalidating the cache, but I only see it discussed for SQL Server 2005. Has this option been deprecated by Microsoft? Is there a newer way to handle this?
Caching is a broad term that can happen at a number of different points. The optimum solution may be a combination of some or all.
For example, you can add page or output caching, as described here, which caches rendered output on the web server; I think this is what you were referring to.
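On a WebForms page that can be a one-line directive; a rough sketch, using your 48-hour window (172800 seconds). The SqlDependency="CommandNotification" option relies on SQL Server query notifications, which as far as I know work on SQL Server 2005 and later, including 2008 R2:

```aspx
<%-- Cache the rendered page for up to 48 hours, and invalidate it
     early if the underlying SQL Server data changes. --%>
<%@ OutputCache Duration="172800" VaryByParam="none"
    SqlDependency="CommandNotification" %>
```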
In addition, you can cache the data in memory using something like memcached, so that your data is more available to the web server as it builds the page, but you need to look at cache hit rate to know for sure that you are caching the right data.
Also, although slightly off the topic of improving db heavy pages, you can cache static resources that change infrequently like images, css and include files using a content delivery network. Any CDN will almost certainly have a higher bandwidth and a cheaper data plan than your own connection because of the economies of scale, so the more of your content you can serve from there the better, in general.
Your first question was "I was wondering if it's worth "caching" a static version of these pages". I guess the answer to that depends on whether there is a performance problem at the moment, and where the cause of that problem is. If the pages are being served quickly and reliably, then quite possibly it's not worth implementing caching. If there is a performance problem, then where is it? Is it in db read time, or is it in the time spent building the page once the data has been returned?
I don't have much experience in caching, but this is what I would try to do:
I would look at your stats and run some profiles, see which are the most heavily visited pages that run the most expensive SQL queries. Pick one or two of the most expensive pages.
If the page is pseudo-static, that is, it has no per-user data on it such as your logged-in username, no comments, etc., you can cache the entire page. You can set a relatively long cache duration as well, anything from 1 minute to a few hours.
If the page has some dynamic, real-time content on it, such as comments, you can identify the static controls and cache those individually. Don't put a page-wide cache on.
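Fragment caching is the mechanism for that; a sketch of the directives at the top of one of the static user controls (.ascx):

```aspx
<%-- Cache just this control's output for 10 minutes; the rest of
     the page is rendered fresh on every request. --%>
<%@ Control Language="C#" %>
<%@ OutputCache Duration="600" VaryByParam="none" %>
```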
Good luck, sounds like a cache could improve performance.
Caching may or may not help. For example, if a site has low traffic and caching is enabled, the server does extra work to build the cache before serving a request. And because traffic is low, there can be enough delay between successive requests that the cached version expires and the server has to build a new one all over again. This can make responses even slower than normal.
Read more: Caching - the good, the bad.
I have myself experienced this issue.
If the traffic is good, caching may help you get better load times.
Cheers
Aditya