I need to store 2 values (counters) for my ASP.NET web app. The counters always grow; they should never return to 0. One option would be to save them in the DB, but storing them in a table seems disproportionate, so what other options do I have? Session is not an option because the counters have to survive app restarts.
Thanks :-)
Storing the counts in a DB table sounds perfectly appropriate here.
The other options that come to mind are a file, which would not be very reliable ACID-wise, or memory, using something like memcache, which would not survive a system reboot.
Don't worry about using a DB table. It will hardly take up any storage space or incur any significant overhead unless they are being written to very frequently. If that's the case, please add more info and we may be able to propose other solutions.
If these counters are rarely updated and can be machine-dependent (you're not in a cluster), then I'd use something simple, like writing their values to a settings file. Keep in mind you'll have to cope with multi-threading (see the sketch below).
If there's a lot of access to the counters, store them in a database.
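If you do go the file route, here's a minimal sketch, assuming a single server and a writable App_Data folder; CounterStore and the two-slot file layout are just illustrative names, not anything built in. A lock serializes concurrent requests, but note that the write is not atomic across a process crash, which is exactly why a DB is the safer default.

```csharp
using System;
using System.IO;
using System.Linq;

// Hypothetical helper: persists two ever-growing counters to a text file.
public static class CounterStore
{
    private static readonly object Sync = new object();
    private static readonly string PathToFile =
        Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "App_Data", "counters.txt");

    public static void Increment(int index) // index: 0 or 1
    {
        lock (Sync) // cope with multi-threading: one writer at a time
        {
            long[] counters = Read();
            counters[index]++;
            File.WriteAllLines(PathToFile, counters.Select(c => c.ToString()).ToArray());
        }
    }

    public static long[] Read()
    {
        lock (Sync) // Monitor locks are reentrant, so the call from Increment is fine
        {
            if (!File.Exists(PathToFile))
                return new long[2]; // both counters start at 0

            return File.ReadAllLines(PathToFile).Select(long.Parse).ToArray();
        }
    }
}
```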
Save them in a file. Fastest way.
A website is serving continuously updated content (think stock exchange), is required to generate reports on demand, and users download the generated files. Users can customize the downloaded report based on lots of parameters.
What is the best practice for handling highly customized report downloads as files (.xls)?
How to cache and improve performance?
It might be good to mention that the data is stored in RavenDB and the reports are expected to handle result sets of around 100K records.
Here are some pointers:
Make sure you have static indexes defined in RavenDB to match all possible reports. You don't want to use dynamically generated temp indexes for this.
Probably one or more parameters will drastically change the query, so you may need some conditional logic to choose which of several queries to run. This is especially true for different groupings, as they'll require different map-reduce indexes.
Choose whether you want to limit your result set using standard paging with Skip and Take operators, or whether you are going to stream unbounded result sets.
However you build the actual report, do it in memory. Do not try to write it to disk first. Managing file permissions, locks, and cleanup is not worth the hassle. Plus, you risk taking servers down if they run out of disk space.
Preferably, you should build the response and stream it out to your user in a single step, so as not to require large amounts of memory on the server. Make sure you understand the yield keyword in C#, and work with IEnumerable and IQueryable directly whenever possible. Don't use .ToList() or .ToArray() on the full result set, as they will pull everything into memory (see the sketch below).
With regard to caching, you could consider using a front-end cache like Memcached, but I'm not sure whether it will help you here. You probably want data that is as accurate as possible from your database, and introducing any sort of cache will require you to understand how and when to reset that cache. Keep in mind that Raven has several caching layers built in already. Build your solution without caching first, and then add caching if you need it.
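To make the paging and streaming pointers concrete, here's a minimal sketch. SaleRecord is a hypothetical document type and "Sales/ByRegion" a hypothetical static index name; the shape of the query is the point, not the exact model. It pages through the index with Skip/Take and yields rows a batch at a time, so the full 100K result set never sits in memory at once.

```csharp
using System.Collections.Generic;
using System.Linq;
using Raven.Client;

public class SaleRecord
{
    public string Region { get; set; }
    public decimal Amount { get; set; }
}

public class ReportReader
{
    // Yields report rows lazily; the caller can write each row straight
    // to the response stream as it arrives instead of buffering them all.
    public IEnumerable<SaleRecord> GetReportRows(IDocumentStore store, string region)
    {
        const int pageSize = 1024; // commonly RavenDB's default maximum page size
        var page = 0;

        while (true)
        {
            List<SaleRecord> batch;
            using (var session = store.OpenSession())
            {
                // ToList() here only materializes one page, not the whole set
                batch = session.Query<SaleRecord>("Sales/ByRegion") // static index
                               .Where(r => r.Region == region)
                               .Skip(page * pageSize)
                               .Take(pageSize)
                               .ToList();
            }

            if (batch.Count == 0)
                yield break; // no more results

            foreach (var row in batch)
                yield return row;

            page++;
        }
    }
}
```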
I'm creating an ASP.NET MVC app that uses EF to perform all DB tasks.
There are a couple of related tables in the database that never change, and I was thinking of creating a static collection that retrieves the data from those two tables (a few hundred records) the first time it is requested and just stores it in an object, to prevent hitting the database every time.
Since I've read several people saying that you should avoid static objects in ASP.NET, I was wondering if this is a bad practice or if it is acceptable for scenarios like this (read-only, and a small amount of data, which should prevent concurrency problems).
I would also like to know if there are better alternatives to do this.
Thanks.
I have done exactly what you are planning on doing for exactly the same reason. It has been working well for several years already.
Just make sure that you get the initialization of the data right and you should be fine. When initializing, keep in mind:
Don't use locking if at all possible (or your app will deadlock 2 minutes before you're going on vacation)
You MUST NOT under any circumstance let a static constructor fail
Make sure no consumer of your cache has the ability to modify it
If the data isn't really static and you would actually need to re-read it fairly often then this might not be the best solution.
Just in case you're wondering, I've used this approach to cache, for instance, country data, currency data (base data, not rates), and sales unit data (pcs, m, kg, etc.). These are all stored in a database but almost never change.
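For what it's worth, here's a minimal sketch of that initialization advice, with hypothetical Country and LoadCountriesFromDb names. Lazy&lt;T&gt; in PublicationOnly mode avoids taking locks (first point), moves the DB call out of any static constructor so a failed load just throws at the call site and can retry later (second point), and the exposed collection is read-only (third point).

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Threading;

public class Country
{
    public string Code { get; set; }
    public string Name { get; set; }
}

public static class CountryCache
{
    // PublicationOnly: no locks are taken; if two threads race, both load
    // but only one result is published. A failed load is NOT cached, so a
    // transient DB error doesn't poison the cache forever.
    private static readonly Lazy<ReadOnlyCollection<Country>> Countries =
        new Lazy<ReadOnlyCollection<Country>>(
            () => LoadCountriesFromDb().AsReadOnly(),
            LazyThreadSafetyMode.PublicationOnly);

    public static ReadOnlyCollection<Country> All
    {
        get { return Countries.Value; }
    }

    private static List<Country> LoadCountriesFromDb()
    {
        // Hypothetical: with EF this might be
        //   using (var ctx = new MyDbContext()) return ctx.Countries.ToList();
        throw new NotImplementedException();
    }
}
```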
It is not a very good approach to use static objects. I would use something like RavenDB, which can store your settings or DB data in-memory. It has a very small footprint, is very fast, and has full LINQ support.
Say I need to populate 4 or 5 dropdowns with items from a database. Each dropdown will have fewer than 15 items in it. These items almost never change.
Now, I could query the DB each time the page is accessed, or I could grab the values from a custom class that checks whether they already exist in ASP.NET's cache and queries the DB to update the cache only if they don't.
It's trivial for me to write, but I'm unsure whether the performance would be better or not. I think it would be (although likely nothing huge).
What do you think?
When dealing with performance issues you should always:
Do things the simplest way first (avoid premature optimisation)
Performance test your code with set performance goals (e.g. 200 ms response time under a load of N concurrent users)
Then, if your code doesn't perform, profile it to determine what is slow, and profile your proposed performance fixes to accurately measure what the real-world performance change will be.
Having said that: yes, what you are suggesting seems sensible (you would usually expect an in-memory cache to be quicker than a database). However, it also depends on what data is being returned, what the memory load of your application is, how expensive the query is, what the query parameters are, etc.
You should performance test your changes before and after to determine the actual effect of your changes (including things like memory load), and you should only really be doing things like this once you have identified that these dropdowns are the cause of an unacceptable performance problem.
That's what the System.Web.Helpers.WebCache class exists for.
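For example (a sketch; GetDropdownItems and LoadItemsFromDb are hypothetical names, and the day-long expiry is just a guess suited to items that almost never change):

```csharp
using System.Collections.Generic;
using System.Web.Helpers;

public static class DropdownData
{
    public static List<string> GetDropdownItems()
    {
        var items = WebCache.Get("dropdown-items") as List<string>;
        if (items == null)
        {
            items = LoadItemsFromDb(); // hits the DB only on a cache miss
            // key, value, minutes to cache, sliding expiration
            WebCache.Set("dropdown-items", items, 60 * 24, false);
        }
        return items;
    }

    private static List<string> LoadItemsFromDb()
    {
        // Hypothetical: query your lookup table here.
        return new List<string> { "Option A", "Option B" };
    }
}
```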
IO is usually more expensive than memory operations (by orders of magnitude), and if your database is on another machine you are using network resources as well, so it will definitely be faster to just use the cache.
But indeed, optimize only at the end, once you have actually identified this as a performance bottleneck by measuring.
Quick answer to your question:
Use the built in .Net cache.
Additional points to ponder:
Preferably, retrieve all master data in a single database round trip (think stored procedure and dataset), though I do not advocate the use of stored procs in all scenarios.
As you rightly said, ensure that your data access layer checks the cache before making a round trip to the database (see the sketch at the end of this answer).
Also, as your dropdown values do not change very often, remember to set a long expiry duration.
Finally, based on your page design, you could also look at Fragment Caching (partial page caching via user controls), which could give you bigger benefits since you then access neither the data cache nor the database.
Performance:
Again, the performance depends more on the application's overall load than on your direct round trips for fetching the master data. Put simply: as Thomas suggested, use the cache class!
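Here's a rough sketch of the "check the cache first" point, using the built-in cache directly; MasterData and LoadMasterDataFromDb are hypothetical names, and the 12-hour expiry is just one example of a "long" duration:

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class MasterData
{
    public static DataSet GetMasterData()
    {
        var data = HttpRuntime.Cache["master-data"] as DataSet;
        if (data == null)
        {
            data = LoadMasterDataFromDb(); // one round trip for all dropdowns
            HttpRuntime.Cache.Insert(
                "master-data", data,
                null,                          // no cache dependency
                DateTime.UtcNow.AddHours(12),  // long absolute expiry
                Cache.NoSlidingExpiration);
        }
        return data;
    }

    private static DataSet LoadMasterDataFromDb()
    {
        // Hypothetical: e.g. a single stored proc filling one DataSet
        // with a table per dropdown.
        return new DataSet();
    }
}
```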
I'm building a survey builder, and I'm thinking of storing the survey in session under a unique GUID key until the user finishes creating it and saves it. I expect it to be an array of 100-200 objects (a class with 8 properties).
That sounds like a fair use of Session.
Whether your data is too large depends on quite a few things, such as your web server's memory. The best thing to do is test the performance using Session. If you find your data is too heavy for Session, have a look at ASP.NET Profile.
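As a rough sketch of the plan in the question (assuming an MVC controller; SurveyDraft is a hypothetical class standing in for the array of 100-200 question objects):

```csharp
using System;
using System.Web.Mvc;

public class SurveyDraft { /* ~100-200 question objects, 8 properties each */ }

public class SurveyController : Controller
{
    // Store the in-progress survey under its GUID key.
    private void SaveDraft(Guid draftId, SurveyDraft draft)
    {
        Session["survey-" + draftId.ToString("N")] = draft;
    }

    // Null means the session expired or the app recycled; be ready to
    // start afresh or reload from a persistent store, as other answers suggest.
    private SurveyDraft LoadDraft(Guid draftId)
    {
        return Session["survey-" + draftId.ToString("N")] as SurveyDraft;
    }
}
```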
That doesn't sound like that much data unless we are talking about reams of text for each answer. I would not worry about it unless I was working on a web site expected to have thousands of these open at any given point in time.
IMHO, the data should be stored in something other than the Session.
Session objects can disappear for a myriad of reasons. Would your users be annoyed if their answers were not persisted and they needed to start afresh?
Remember to write the data to a persistent store (DB, files, etc.) as soon as possible, unless the users don't mind starting over.
I agree with ggonsalv. I would store the data somewhere just in case the session is lost. I have been to sites where I fill stuff out and then it loses it near the end. It's not fun to start over.
What is preferable, keeping a dataset in session or filling the dataset on each postback?
That would depend on many factors. It is usually best not to keep too many items in session memory if your session is inproc or on a state server because it is less scalable.
If your session resides on the database, it might be better to just requery and repopulate the dataset unless the query is costly to execute.
Don't use the session!!! If the user opens a second tab with a different request, the session will be reused and the results will not be the ones he/she expects. You can use a combination of ViewState and Session, but still measure how much you can handle without any sort of caching before you resort to it.
It depends how heavily you will do that and where your session data is stored on the server.
Keeping datasets in session certainly affects memory usage. All of the session variables are stored in memory, or in SQL Server if you use SQL Server state. You can also offload the load to a different machine using the State Server mode.
There's also the serialization/deserialization aspect: if you use insanely large datasets, it could hurt server-side performance badly.
On the other hand, having very large viewstates on pages can be a real performance killer. So I would keep datasets in session or in the cache, and look for an out-of-proc way to store that session data.
Since I want as few database operations as possible, I will keep the data in session, but I'm not sure what the best way to do it is. How do I handle concurrency? And since the data is shared, when two users use the page simultaneously, how can I make sure each of them has their own set of data in the session?
I usually keep it in session if it is not too big and/or the DB is far away and slow. If the data is big, in general it's better to reload.
Sometimes I use the Cache: I load from the DB the first time and put the data in the cache. During postback I check the cache, and if it is empty I reload.
That way the server manages the cache by itself.
The trouble with the session is that if it's in-proc it could disappear, which isn't great; if it's state server then you have to serialize; and if it's SQL state you're doing a round trip anyway!
If the result set is large, do custom paging so that you are only returning a small subset of the total results.
Then, if you think more than one user will see this result set, put each page in the cache as the user pages, making sure that the cache is renewed when the data changes or after a while of it not being accessed (see the sketch below).
If you don't want to make many round trips and you think you've got the memory, then bung the whole dataset in the cache, but be careful it doesn't melt the web server.
Using the cache means users don't need their own set of data; rather, they share the global set.
As far as concurrency goes, when you load up the insert/edit page you need to populate it with fresh data, and renew the cache after the add/update.
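A rough sketch of the page-by-page caching idea; Order and LoadPageFromDb are hypothetical, and the ten-minute sliding window stands in for "after a while of it not being accessed":

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public class Order { /* fields omitted */ }

public static class OrderPages
{
    public static List<Order> GetPage(int pageIndex, int pageSize)
    {
        string key = "orders-page-" + pageIndex;
        var page = HttpRuntime.Cache[key] as List<Order>;
        if (page == null)
        {
            page = LoadPageFromDb(pageIndex, pageSize); // custom paging in the DB
            HttpRuntime.Cache.Insert(
                key, page,
                null,                       // no cache dependency
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromMinutes(10));  // drops out if nobody pages here for a while
        }
        return page;
    }

    // Call after an add/update so the next read gets fresh data.
    public static void Invalidate(int pageIndex)
    {
        HttpRuntime.Cache.Remove("orders-page-" + pageIndex);
    }

    private static List<Order> LoadPageFromDb(int pageIndex, int pageSize)
    {
        // Hypothetical: e.g. SQL with ROW_NUMBER() or OFFSET/FETCH.
        return new List<Order>();
    }
}
```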
I'm a big believer in decoupling, and I rarely, if ever, see the need to throw a full dataset out to the user interface.
You should really only pass the objects the UI needs to use, so unless you're showing a big diagram or some data structure that needs to display relationships between data, it's not worth the cost.
Smaller subsets of data, when applicable, are far more efficient. Is your application actually using all the features of a dataset in the UI? If not, the best way to proceed is to pass out only the data you're displaying.
If you're binding data to a control and sorting/paging through it, you can implement a lot of the interfaces that give you the same support the dataset provides, in a much smaller piece of code.
On that note, I'd keep data which is largely static (i.e. it doesn't update that often) in the cache. So you need to look at how often the data is updated before you can really make a decision here.
Lastly, I'll say this again: I see the need to use a dataset in the UI very, very rarely. It's a very heavy data object, and the cost benefits of looking at your actual data requirements, versus the ease of implementation, could far outweigh the need to cache a dataset.
IMHO, datasets are rather bad to use if you're worried about performance or memory utilisation.
Your question doesn't hint at the use of the data. Is it reference data? Is the user actively updating it? Is more than one user meant to have update access at any one time?
Without any better information than you provided, go with the accepted wisdom that says keeping large amounts of data in session is a way to guarantee that your application will not scale and will require hefty resources to serve a handful of people.
There are usually better ways to manage large datasets without resorting to loading all the data into memory.
Failing that, if your application's data needs are truly monstrous, then consider a heavy client with a web service back-end. It may be better suited than a web page.
As other answers have noted, the answer is "it depends". However, I will assume that you are talking about data that is shared between users. In that case you should not store the dataset in the user's session, but repopulate it on postback.
In addition, you should populate a cache near the data access layer of your application to increase interactive performance and reduce load on the database. The design of your cache will again depend on the type of data (read-only vs. read/write vs. read-mostly).
This approach decouples the user session (a UI-layer concept) from data management, and provides the added benefit of supporting shared use of cached data (in cases where data is common across users).
Neither: don't "keep" anything! Display only what a user needs to see, and build an efficient paging scheme to page through rows backwards or forwards.
Keep it in session, and have an option to repopulate if you need to, for example if the session memory is wiped.