ASP.NET: Best way to hold global settings

I need to store settings about my website that update from time to time, and I was wondering what the most efficient way to do this is. Note that once these settings are changed, I need them to be accessible even after an IIS restart, so a simple application variable is not the answer. I also know I can store application settings in my config file, but changing them at runtime doesn't update the actual XML, so I would have to re-write the XML behind the scenes each time a setting changes.
As for writing to a file, such as an INI, my concern is that one request might read the file while another is writing it; I have hit IO locking errors doing that before. I could also use database storage, but I am trying to keep database calls low.
Right now I am leaning toward an INI file that I write to on change and read once at application load, so that I can always access a local variable afterwards. That would make IO issues pretty unlikely, since I am only reading the file once. I am basically just looking for input on what is likely the most efficient way of doing this.
Thanks in advance,
Anthony F Greco

If you want to minimize database calls, I'd say put the values in the database, then cache them in your application. Depending on your needs, maybe expire the cache every 30 minutes, so when you change the DB value, it will be applied in no more than 30 minutes, and you will only make a DB call every 30 minutes (or when needed, like when your app restarts). Or you can use a SQL Cache Dependency, but I've never done that, so I don't know the pitfalls of that technique.
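A minimal sketch of that pattern, assuming a hypothetical SettingsRepository.LoadAll() helper that reads all settings from the DB in a single call:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class SiteSettings
{
    public static string Get(string key)
    {
        var settings = HttpRuntime.Cache["SiteSettings"] as Dictionary<string, string>;
        if (settings == null)
        {
            settings = SettingsRepository.LoadAll();   // hypothetical: one DB call for all settings
            HttpRuntime.Cache.Insert("SiteSettings", settings, null,
                DateTime.UtcNow.AddMinutes(30),        // absolute expiration: re-read after 30 minutes
                Cache.NoSlidingExpiration);
        }
        return settings[key];
    }
}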

We moved all of our application settings from the web.config to the database as can be seen HERE.
Ours was done to get around the problems with promoting code from Dev to Test to QA to Prod, but it can be used for other reasons.
Ours is cached on application startup and then every 5 minutes it checks to see if a single counter in a table was updated. If so, it refreshes all of the settings.
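A sketch of that counter check, with hypothetical table and helper names:

private static int _settingsVersion;

// called from a 5-minute timer
public static void RefreshSettingsIfStale()
{
    // hypothetical helper that runs: SELECT Version FROM dbo.SettingsVersion
    int dbVersion = Db.GetSettingsVersion();
    if (dbVersion != _settingsVersion)
    {
        ReloadAllSettings();          // hypothetical: re-reads every setting into the cache
        _settingsVersion = dbVersion;
    }
}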

I'd go with @Joe Enos's approach, except that I'd use an XML file or an INI file, since you'd like to limit your DB calls.
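If you go the file route, here is a minimal sketch that addresses the locking concern from the question by serializing all reads and writes behind one lock; the file location and XML shape are illustrative:

using System.Collections.Generic;
using System.Linq;
using System.Web.Hosting;
using System.Xml.Linq;

public static class FileSettings
{
    private static readonly object Sync = new object();
    private static readonly string Path =
        HostingEnvironment.MapPath("~/App_Data/settings.xml");   // assumed location
    private static Dictionary<string, string> _settings;

    private static Dictionary<string, string> Load()
    {
        if (_settings == null)
            _settings = XDocument.Load(Path).Root.Elements("setting")
                .ToDictionary(e => (string)e.Attribute("key"),
                              e => (string)e.Attribute("value"));
        return _settings;
    }

    public static string Get(string key)
    {
        lock (Sync) { return Load()[key]; }   // readers and writers never overlap in-process
    }

    public static void Set(string key, string value)
    {
        lock (Sync)
        {
            var settings = Load();
            settings[key] = value;
            new XDocument(new XElement("settings",
                settings.Select(kv => new XElement("setting",
                    new XAttribute("key", kv.Key),
                    new XAttribute("value", kv.Value))))).Save(Path);
        }
    }
}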

Related

Handle ActionResults as cacheable, "static content" in ASP.NET MVC

I have a couple of action methods that return content from the database that does not change very often (e.g. a polygon list of available ZIP areas, returned as JSON; it changes twice per year).
I know there is the [OutputCache(...)] attribute, but it has some disadvantages (long client-side caching is not good; if the server/IIS/process gets restarted, the server-side cache is also lost).
What I want is for MVC to store the result in the file system, calculate a hash, and, if the hash hasn't changed, return HTTP status code 304 --> like it is done with images by default.
Does anybody know a solution for that?
I think it's a bad idea to try to cache data on the file system because:
It is not going to be much faster to read your data from the file system than to get it from the database, even if you already have it in JSON format.
You are going to add a lot of logic to calculate and compare the hash, and to read the data from a file. That means new bugs and more complexity.
If I were you, I would keep it as simple as possible: store your data in the Application container. Yes, you will have to reload it every time the application starts, but that should not be a problem at all, since an application is not supposed to restart often. Also consider using a distributed cache like AppFabric if you have a web farm, so you don't end up with different data in the Application containers on different servers.
And one more important note: caching means really fast access, and you can't achieve that with file-system or database storage; it is in-memory storage you should consider.
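A minimal in-memory version of the action along the lines of this answer, using MemoryCache from System.Runtime.Caching (the loader name is hypothetical):

using System;
using System.Runtime.Caching;
using System.Web.Mvc;

public ActionResult ZipAreas()
{
    var json = MemoryCache.Default.Get("zip-areas") as string;
    if (json == null)
    {
        json = LoadZipAreasFromDatabase();           // hypothetical DB call
        MemoryCache.Default.Add("zip-areas", json,
            DateTimeOffset.UtcNow.AddHours(24));     // rebuilt at most once a day
    }
    return Content(json, "application/json");
}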

ZEO: How can I keep clients read-only but make changes through the console?

I'm running a script by pasting it in a console like this:
bin/client2 debug
... my script ...
The script normalizes the titles of the files. Since there are more than 20k files, it really takes too much time, so I need users to still be able to use the site, but in a read-only fashion.
But I assume that setting read-only to true in zeo.conf would not let me run my normalization script, would it?
How can I solve this?
Best regards,
Manuel.
There isn't a way, I'm afraid.
If your users alter the site when logged in, disable logging in for them until you are done.
Generally, for tasks like these, I run the changes in batches, to minimize conflicts and allow end-users to continue to use the site as normal. Break your work up in chunks, and commit after every n items processed.
You can add another ZEO client that is not read-only; the ZEO server does not have to be read-only for the clients to be.
So make all the clients that users hit read-only, add an additional read-write client that is used by nothing but your script, and leave the ZEO server read-write.

Synchronizing local cache with external application

I have two separate web applications:
The "admin" application where data is created and updated
The "public" application where data is displayed.
The information displayed on the "public" changes infrequently, so I want to cache it.
What I'm looking for is the "simplest possible thing" to update the cache on the public site when a change is made in the admin site.
To throw in some complexity, the application is running on Windows Azure. This rules out file and sql cache dependencies (at least the built in ones).
I am running both applications on a single web role instance.
I've considered using Memcached for this purpose, but since I'm not really after a distributed cache, and its performance is not as good as an in-memory cache (System.Runtime.Caching), I want to try to avoid it.
I've also considered using NServiceBus (or the Azure equivalent) but again, this seems overkill just to send a notification to clear the cache.
What I'm thinking (maybe a little hacky, but simple):
1. Have a controller action on the public site that clears the in-memory cache. I'm not bothered about clearing specific cached items; the data doesn't change enough for me to worry about that. When the "admin" application makes a change, it makes an HttpWebRequest to the clear-cache action on the public site.
2. Since the database is the only shared resource between the two applications, just add a table with the datetime of the last update. The public site makes a query on every request and compares the database's last-update datetime to one held in memory. If they don't match, clear the cache.
Any other recommendations or problems with the above options? The key thing here is simple and high performance.
Option 1, where you have a controller action to clear the cache, won't work if you have more than one instance; if you know you have one and only one instance, it should work just fine.
Option 2, where you have a table that stores the last update time, would work fine for multiple instances, but incurs the cost of a SQL database query per request -- and for a heavily loaded site, that can be an issue.
Probably the fastest and simplest approach is to use option 2, but store the last update time in table storage rather than a SQL database. Reads from table storage are very fast; under the covers it's a simple HTTP GET.
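A sketch of option 2 against table storage. This uses the current Azure.Data.Tables client purely as an illustration (the SDK looked different in the AppFabric era), and the table, keys, and property names are made up:

using Azure.Data.Tables;

var table = new TableClient(connectionString, "CacheState");
var entity = table.GetEntity<TableEntity>("site", "lastUpdate").Value;   // a single HTTP GET
var stamp = entity.GetDateTimeOffset("UpdatedAt");

if (stamp != _lastSeenStamp)       // compare against the value held in memory
{
    ClearLocalCache();             // hypothetical
    _lastSeenStamp = stamp;
}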
Having a public controller that you can call to tell the site to clear its cache will work as long as you only have one instance of the main site. As soon as you add a second instance, as calls go through the load balancer, your one call will only go to one instance.
If you're not concerned about how soon the update makes it from the admin site to the main site, the best performing and easiest (but not the cheapest) solution is to use the Azure AppFabric Cache, configured to use a local (in-memory) cache with a short-ish timeout (say 10 minutes).
The first time your client tries to access an item, this is what happens:
1. Look for the item in the local cache.
2. It's not there, so look for the item in the distributed cache.
3. It's not there either, so load the item from persistent storage.
4. Add the item to the cache with a long-ish time to live (48 hours is the default, I think).
5. Return the item.
Steps 1 and 2 are taken care of for you by the library; the other bits you need to write. Any subsequent calls in the next X minutes will return the item from the in-memory cache. After X minutes it falls out of the local cache; the next call loads it from the distributed cache back into the local cache, and you can carry on.
All your admin app needs to do is update the database and then remove the item from the distributed cache. The next time the item falls out of the local cache on the client, it will simply reload the data from the database.
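The whole flow, sketched with MemoryCache as the local level and the distributed cache abstracted behind delegates (the AppFabric client API is not shown here):

using System;
using System.Runtime.Caching;

public static T Get<T>(string key,
                       Func<string, T> distributedGet,    // e.g. AppFabric DataCache.Get
                       Action<string, T> distributedPut,  // e.g. DataCache.Put with a long TTL
                       Func<T> loadFromStorage) where T : class
{
    var item = MemoryCache.Default.Get(key) as T;         // step 1: local cache
    if (item != null) return item;

    item = distributedGet(key);                           // step 2: distributed cache
    if (item == null)
    {
        item = loadFromStorage();                         // step 3: persistent storage
        distributedPut(key, item);                        // step 4: long-ish time to live
    }

    MemoryCache.Default.Add(key, item,
        DateTimeOffset.UtcNow.AddMinutes(10));            // short-ish local timeout
    return item;                                          // step 5
}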
If you like this idea but don't want the expense of using the caching service, you could do something very similar with your database idea. Keep the cached data in a static variable and just check for updates every x minutes rather than with every request.
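The database variant might look like this sketch, where the timestamp query and the loader are hypothetical:

private static object _cached;
private static DateTime _dbStampUtc;
private static DateTime _lastCheckUtc;
private static readonly object Gate = new object();

public static object GetData()
{
    lock (Gate)
    {
        // only hit the database every 5 minutes, not on every request
        if (_cached == null || DateTime.UtcNow - _lastCheckUtc > TimeSpan.FromMinutes(5))
        {
            var stamp = ReadLastUpdateStamp();   // hypothetical: SELECT LastUpdate FROM dbo.CacheState
            if (_cached == null || stamp > _dbStampUtc)
            {
                _cached = LoadData();            // hypothetical full reload
                _dbStampUtc = stamp;
            }
            _lastCheckUtc = DateTime.UtcNow;
        }
        return _cached;
    }
}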
In the end I used Azure Blobs as cache dependencies. I created a file change monitor to poll for changes to the files (full details at http://ben.onfabrik.com/posts/monitoring-files-in-azure-blob-storage).
When a change is made in the admin application I update the blob. When the file change monitor detects the change we clear the local cache.

Global settings for ASP.NET website

I'm trying to create a sort of global settings for a website and store the data in the database; however, I keep thinking that may not be very efficient, as these settings will have to be read on every request.
The settings are things like 'how many records to show per page' and flags to enable/disable features. I plan to store them in the database, but I don't want the overhead of a database call on every request to get settings that rarely change. Surely this is done all the time in CMSs; how do you think it should be done? I am thinking SqlCacheDependency, but I have never set that up. Is there another way?
Also on the cards is the possibility of storing those settings in web.config and creating a GUI for them. The problem is that the administration of the site runs in its own namespace and has its own web.config, so the question here is whether it's possible to manipulate a web.config outside the application namespace.
Thanks guys.
I would suggest reading the values from the DB in Application_Start and storing them in the Application object. That way you will not need to go to the DB to read the values every time; they are read and stored only once, when the application starts.
void Application_Start(object sender, EventArgs e)
{
    Application["name"] = "";   // assign the value read from the DB here
    // ... remaining settings ...
}
Note: It is not recommended to manipulate values in web.config from a UI, because as soon as a user modifies any value from the UI and web.config changes, all your user sessions will be terminated.
Another note: Every time you change the information and update your DB, you will need to update the Application-level object as well.
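For example, a save path that touches both (the repository helper is hypothetical):

void SaveSetting(string key, string value)
{
    SettingsRepository.Save(key, value);   // hypothetical DB write
    Application.Lock();                    // HttpApplicationState writes should be locked
    Application[key] = value;
    Application.UnLock();
}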
First, I wouldn't start worrying about counting database calls until you have an idea that your database calls are actually hurting performance. Modern databases are quick, especially for well-designed queries for scalar values. Lots of popular packages read config on every request and they seem to scale pretty well.
As for updating these values, you can pretty easily update a web.config file from another app, presuming you've got the right permissions -- this will definitely require Full Trust, which rules out most hosting scenarios. The thing to remember is that, while VS treats the file as special, it is just an XML file, so the normal tricks for updating an XML file apply. Because you are doing this from a different application, it will work. But Muhammad's warning about dumping user sessions applies to the "victim" of the update, which might be OK depending on what you are changing.
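A sketch of that XML trick with LINQ to XML; the path and key are hypothetical, and saving will recycle the target application (dropping its sessions, as noted above):

using System.Linq;
using System.Xml.Linq;

var path = @"C:\inetpub\wwwroot\MainSite\web.config";   // hypothetical target
var doc = XDocument.Load(path);
var setting = doc.Root.Element("appSettings").Elements("add")
                 .First(e => (string)e.Attribute("key") == "RecordsPerPage");
setting.SetAttributeValue("value", "25");
doc.Save(path);   // this write triggers the app-domain recycle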

Long-running HTTP process - how to put it in a separate process?

I know that similar questions have been asked all over the place, but I'm having trouble finding one that relates directly to what I'm after.
I have a website where a user uploads a data file, then that file is transformed and imported into SQL. The file can be up to 50 MB in size, and sometimes this process takes 30 minutes or even longer.
I realise I need to palm off the actual work to another process, and poll that process from the web page. I'm wondering what the best approach would be, though. Being a web developer by trade, I find all this Windows Service stuff a bit confusing, and I just wanted somewhere to start.
So:
Can I / should I be doing this with a Windows service? If so, how?
Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
clarifications
The data is imported into SQL; there's no file distribution taking place.
If there is a failure, it absolutely MUST be reported to the user. The web page will poll every, let's say, 5 seconds from the time the async task begins, to get the 'status' of the import. Once it's finished, another response will tell the page to stop polling for status updates.
queries on final decision
OK, so as I thought, it seems that a Windows service is the best idea. As to HOW to get it to work, it seems the 'put the file there and wait for the service to pick it up' idea is the generally accepted way. Is there a way I can start a process run by the service without it having to constantly check a database table / folder? As I said earlier, I don't have any experience with Windows services; I wondered, if I put a public method in the service, can I call it somehow?
well ...
var thread = new Thread(() =>
{
    // your action
});
thread.Start();
but you will have problems with that:
What if the import to SQL fails? Should there be any response to the client?
If it fails, how do you make sure the file gets processed on a later request?
What if the application shuts down? This newly created and started thread will be killed too.
...
It's not always a good idea to store everything in SQL (especially files...). If you want to make the files available to several servers, why not distribute them via FTP...?
I believe that your whole concept is a bit messed up (sorry for assuming this), and it might be helpful if you elaborate and give us more information about your intentions!
edit:
Can I / should I be doing this with a Windows service? If so, how?
You can :) I advise you to create a simple console program and convert it with srvany and sc. You can get a rough overview of how here (note: insert a blank after each = sign ... that's a silly pitfall).
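For example (the service name and path are made up); note the mandatory space after binPath=, which is the pitfall mentioned:

sc create MyImportService binPath= "C:\services\MyImportService.exe"
sc start MyImportService

With srvany, binPath= would point at srvany.exe and the real console program is configured in the registry.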
The term "should" is relative, because you did not answer the most important question:
What if a record is persisted to the database, telling a consumer that the file test.img should be persisted, but your service hasn't picked it up or transformed it yet?
so ... next on
Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
You could probably create a WCF service which receives some binary data and then stores it to a database. This request could be async, yes. But what for?
Once again:
Please give us more insight into your workflow: what exactly are you trying to achieve? Which "environmental conditions" do you have (e.g. app A polls the DB and expects the file records referenced in table x to be persisted)?
edit:
So you want to import a .csv file. Well, that changes everything :)
But I won't advise you to use a WCF service (there could be a use: e.g. a WCF service which has a method to insert a single row, with your iteration through the file implemented in another app ... not that good, though).
I would suggest the following:
At first, do everything in your web app (as you've already done), but use some sort of bulk insert and do your transformation/logic on the database (see the sketch below).
If you then hit some sort of bottleneck, I would suggest something like a minor job service, e.g.: the web app uploads the file and inserts a row into a job table. The job service continuously polls the table, or gets informed via WCF by the web app (hey, hey, finally some sort of use for WCF in your scenario ... :) ), then does the import job and writes a finish note to a table / sets the state of the job to finished ...
But this is a bit overkill :)
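A sketch of the bulk-insert step with SqlBulkCopy; the connection string, staging table, and the DataTable built from the parsed CSV are assumed:

using System.Data;
using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var bulk = new SqlBulkCopy(connection))
    {
        bulk.DestinationTableName = "dbo.ImportStaging";   // hypothetical staging table
        bulk.BatchSize = 5000;                             // send rows in chunks
        bulk.WriteToServer(csvDataTable);                  // DataTable parsed from the upload
    }
}
// then run the transformation as set-based SQL (e.g. a stored procedure) on the staging table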
Please see if my comments below help you resolve your issue:
• Can I / should I be doing this with a Windows service? If so, how?
Yes, you can do this with a Windows service, and I think that is the way you should be doing it. You can implement your own service to process the requests, or you can use the open-source Job Processor code.
Basically the idea is:
1. You submit a request for processing the CSV file in a database table, with a status of Not Started.
2. Your Windows service picks up the requests from the database table that are Not Started and updates them to In Progress.
3. Once the processing completes successfully or unsuccessfully, your service updates the database table with a status of Completed / Failed.
4. Your ASP.NET page can poll the database table for the current status every 5 seconds or so.
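A sketch of that loop inside the service; the job-table access layer and the status values are hypothetical:

private System.Timers.Timer _timer;

protected override void OnStart(string[] args)
{
    _timer = new System.Timers.Timer(10000);   // poll every 10 seconds
    _timer.Elapsed += (s, e) => ProcessPendingJobs();
    _timer.Start();
}

private void ProcessPendingJobs()
{
    foreach (var job in JobTable.GetByStatus("NotStarted"))    // hypothetical DAL
    {
        JobTable.SetStatus(job.Id, "InProgress");
        try
        {
            ImportCsv(job.FilePath);                           // the actual transform + import
            JobTable.SetStatus(job.Id, "Completed");
        }
        catch (Exception ex)
        {
            // the failure reason ends up in the table, where the page's poll can see it
            JobTable.SetStatus(job.Id, "Failed", ex.Message);
        }
    }
}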
• Should I use WCF? If this runs under IIS, will I have problems with aspnet_wp.exe recycling and timing out my process?
You should not be using WCF for this purpose.
