Basically I have a website that will run a background task to perform some "maintenance" duties while it's not idle. When it is idle, these processes do not need to run.
Now I have a secondary website (virtual directory under main site) that needs to execute these tasks as well. However they can't both run them at the same time or it will cause issues.
Now the more correct solution would probably be to either merge the sites, break out the tasks into a different application, or change the tasks so that they do not conflict with each other if they're both running at the same time. For one reason or another, these are not (currently) options.
So basically when the secondary website is active, what would be the best way to make sure the primary website is awake and running these tasks?
I'm thinking the easiest solution would be to include a reference to the main website from the secondary website, so that any page load on the secondary website would force the first website to serve a request. Something like a 1px image (a rough sketch of what I mean is below).
But would this be better solved through IIS? Should they share the same application pool? Both applications are relatively stable, so I'm not too worried about one website bringing down the other.
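For reference, the 1px image idea would mean something like the handler below living on the primary site, with the secondary site's pages referencing it, so every page load over there also wakes the primary site. The handler name and markup are just placeholders to show the shape of it.

    // KeepAlive.ashx on the PRIMARY site (hypothetical name).
    // The secondary site would reference it from its master page, e.g.:
    //   <img src="http://primary-site/KeepAlive.ashx" width="1" height="1" alt="" />
    using System.Web;

    public class KeepAliveHandler : IHttpHandler
    {
        // A 1x1 transparent GIF, so the reference renders as an invisible pixel.
        private static readonly byte[] OnePixelGif =
        {
            0x47, 0x49, 0x46, 0x38, 0x39, 0x61, 0x01, 0x00, 0x01, 0x00, 0x80, 0x00,
            0x00, 0x00, 0x00, 0x00, 0xFF, 0xFF, 0xFF, 0x21, 0xF9, 0x04, 0x01, 0x00,
            0x00, 0x00, 0x00, 0x2C, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00, 0x01, 0x00,
            0x00, 0x02, 0x02, 0x44, 0x01, 0x00, 0x3B
        };

        public void ProcessRequest(HttpContext context)
        {
            // Simply being hit is enough to keep the primary app's worker process alive.
            context.Response.ContentType = "image/gif";
            context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
            context.Response.BinaryWrite(OnePixelGif);
        }

        public bool IsReusable { get { return true; } }
    }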
Your question is a little confused. At one point you say that the two sites can't run the task at the same time, but then you say that when the secondary is active the primary should also be active?
Assuming your goal is to have the task run in only one place at a time, you basically have a locking problem. Some solutions would be:
Maintain a lock (e.g. physical file, database entry, ...) and only run the task if the other site isn't holding the lock (a rough sketch of this follows the list).
Make the task callable in site A and then have site B call it rather than running the task itself. Site A can then track if it is already running the task.
All the solutions you listed yourself (especially separating background tasks away from websites altogether) which are better solutions than the above.
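To sketch the locking idea: since both sites live on the same machine (one is a virtual directory of the other), a system-wide named mutex is probably the simplest lock. The class and method names below are made up; a lock file or database row would follow the same pattern.

    using System;
    using System.Threading;

    public static class MaintenanceRunner
    {
        // Both sites must use the same name; "Global\" makes it machine-wide across
        // application pools. (If the pools run under different identities you may
        // also need to grant access via a MutexSecurity rule.)
        private const string LockName = @"Global\MySite.MaintenanceTask";

        public static void RunIfNotAlreadyRunning()
        {
            using (var mutex = new Mutex(false, LockName))
            {
                // Try to take the lock without waiting; if the other site is
                // already running the task, just skip this round.
                if (!mutex.WaitOne(TimeSpan.Zero))
                    return;

                try
                {
                    RunMaintenanceTask(); // hypothetical: the existing maintenance code
                }
                finally
                {
                    mutex.ReleaseMutex();
                }
            }
        }

        private static void RunMaintenanceTask()
        {
            // ... existing maintenance work ...
        }
    }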
Hope that helps.
It's hard to depend on running an automated task inside a web application. If the application pool is recycled, that task will be lost and the process could lose its integrity. There is no guarantee that the process will shut down gracefully when the application pool is recycled.
A dedicated task is better handled by a system that is set up to be dedicated, namely the server hosting the site. I think you would be better off having that task run directly on the server as a local process instead of exposing it through the web application.
However, if that is not an option, then you will probably benefit from merging the sites as you suggest. That way, as long as the child site's application pool is alive, the parent site will be available. Similarly, you could have them share an application pool. The issue in either case is, as stated above, that the integrity of the dedicated process could be compromised.
ASP.NET is not really designed to run dedicated background tasks; that is better suited to a standalone application or Windows service that supports the website.
Related
I've got a web application that is a front-end to a very memory-intensive, multi-threaded image processor. The web application is running out of memory when it kicks-off large image processing tasks.
How can I run these tasks in a way (e.g. in the background) that doesn't affect the memory allocated to the web application?
I'm unfamiliar with this sort of problem, but I would imagine there are techniques in .NET for doing this very thing. My web server is running ASP.NET MVC4, to give you a sense of the technology I'm targeting.
P.S. Optimizing the image processor is not a concern, as the command-line interface to it works just fine.
Hey, we have run into similar situations; it is never a good idea to load down your web server with memory-intensive tasks. The web server is very good at one thing, serving up web pages; if you ask it to do more, you are carrying a lot of overhead (the web server) to do something that doesn't need a web server to be accomplished.
What we did was set up a message queue using Redis. I'm not sure this is the exact library you'd want, as different queues offer different features, but it is enough to get you started down the path (a rough sketch of the setup follows the list below):
https://github.com/ServiceStack/ServiceStack/wiki/Messaging-and-redis
This is not the only queue you can use; I believe MS has one as well. Then we set up a much smaller server in the cloud that listened on that same queue. Once an event came through, it processed it, and:
It was much quicker because there was no IIS overhead
One user did not impact the experience of another user
I could create two smaller machines to accomplish what I was trying to do with one larger one.
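To give you a feel for it, below is roughly what a ServiceStack Redis MQ setup looks like: the web app just publishes a message, and the small worker box consumes it and shells out to the image processor CLI. The message type, host name, and exe path are made up for the sketch, and the API calls are from memory, so check the wiki link above for the exact details.

    using System.Diagnostics;
    using ServiceStack.Messaging;
    using ServiceStack.Redis;

    // Message DTO shared by the web app and the worker (hypothetical shape).
    public class ProcessImage
    {
        public string ImagePath { get; set; }
    }

    public class Worker
    {
        public static void Main()
        {
            // On the small worker box: listen on the Redis-backed queue.
            var redisFactory = new PooledRedisClientManager("redis-host:6379");
            var mqServer = new RedisMqServer(redisFactory, retryCount: 2);

            mqServer.RegisterHandler<ProcessImage>(m =>
            {
                // Run the existing command-line image processor, outside IIS.
                Process.Start("ImageProcessor.exe", m.GetBody().ImagePath).WaitForExit();
                return null;
            });

            mqServer.Start();
        }
    }

    // In the web app (with its own client manager), just enqueue and return:
    //   using (var mqClient = mqServer.CreateMessageQueueClient())
    //       mqClient.Publish(new ProcessImage { ImagePath = pathOnDisk });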
HTH
I'm developing a .NET 4 application that requires a backend worker thread to be running. This thread consists mostly of the following code:
    while (true)
    {
        // Check stuff in database
        // Do stuff
        // Write to database / filesystem
        Thread.Sleep(60000);
    }
The ASP.NET app is just a frontend for the database.
My question is around where the best place to put this worker loop would be. It seems my immediate two choices would be (1) to spin it off from the Application_Start method, and just let it run, or (2) bundle it in a separate process (Windows service?)
(1) would obviously need some logic in the ASP.NET code to check it's still running, as IIS might kill it (a rough sketch of this option is at the end of the question). It's also quite neat in that the whole application logic is in one easily deployable package.
(2) is much more segregated, but feels a lot messier.
What's the best approach?
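For context, here is roughly what I have in mind for option (1). This is only a sketch; the IRegisteredObject/HostingEnvironment.RegisterObject part is just one way to at least get notified before IIS tears the thread down.

    using System;
    using System.Threading;
    using System.Web.Hosting;

    // Option (1): start the loop from Application_Start and register it with the
    // hosting environment so IIS can tell it to stop during shutdown/recycle.
    public class MaintenanceWorker : IRegisteredObject
    {
        private readonly ManualResetEvent _stop = new ManualResetEvent(false);
        private Thread _thread;

        public void Start()
        {
            HostingEnvironment.RegisterObject(this);
            _thread = new Thread(Loop) { IsBackground = true };
            _thread.Start();
        }

        private void Loop()
        {
            // WaitOne doubles as the 60-second sleep and the shutdown signal.
            while (!_stop.WaitOne(60000))
            {
                // Check stuff in database, do stuff, write to database / filesystem.
            }
        }

        public void Stop(bool immediate)
        {
            _stop.Set();
            HostingEnvironment.UnregisterObject(this);
        }
    }

    // In Global.asax:
    //   protected void Application_Start() { new MaintenanceWorker().Start(); }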
I would strongly opt for the Windows Service if possible. Background threading in ASP.NET comes with a lot of baggage.
The lifetime of your background process is at the mercy of IIS. If IIS decides it's time to recycle the App Pool, your background process will restart. If IIS decides to stop the App Pool due to inactivity, your background process will not run.
If IIS is configured to run as a Web Garden (multiple processes per AppPool), then your background thread could run more than once.
Later on, if you decide to load balance your website (multiple servers running the site), then you may have to change your application to ensure the background threading is only happening on one server.
And plenty more.
Consider something simple like Hangfire and then think about the design points in this related answer.
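As a rough illustration only (assuming the Hangfire.SqlServer package and an OWIN startup class; adapt to your own storage and hosting), a recurring Hangfire job replaces the hand-rolled while/Sleep loop:

    using Hangfire;
    using Owin;

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            // Hangfire persists jobs in storage, so they survive app pool recycles
            // and a recurring job fires once per schedule even with several servers.
            GlobalConfiguration.Configuration.UseSqlServerStorage("HangfireDb"); // connection string name (assumed)

            app.UseHangfireServer();

            // Run the maintenance work every minute instead of while(true) + Sleep.
            RecurringJob.AddOrUpdate("maintenance", () => MaintenanceJob.Run(), Cron.Minutely());
        }
    }

    public static class MaintenanceJob
    {
        public static void Run()
        {
            // Check stuff in database, do stuff, write to database / filesystem.
        }
    }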
Our company releases updates of our rich client application (written mainly with ASP.NET, WCF services, and ASP.NET AJAX) on our client's Windows Server 2008 IIS 7 web server. Once in a while we have large releases of updates, and sometimes there are bugs that users catch right after the release that were not caught during automation testing or stage testing.
Is there a way to smoothly deploy ASP.NET code on IIS 7 while users are still on, without disrupting workflows that involve code that was not affected? I've found that if I just copy the code from stage (without the web.config) manually and paste it into the production web root folder, nobody is really kicked off. But I'm wondering if there are any side effects to this strategy for users diligently working in the application. I'm just wondering whether any other connections may get interrupted, and how they're even handled in this situation (i.e., SQL connections, WCF service calls, whether users keep the same session and whether that will have any impact, etc.).
If I chose this method, I'd have something in the web.config that would display a message to every user (in the master page), like a banner that says "Please log off, and clear your cache", so they would see updates to the issues addressed. But this would only be relevant to the impacted users.
If someone doesn't think this is a good strategy for minor updates and has a better one, such as changing a web.config setting that forces users to a different server while the deployment is taking place, or some other methodology, my ears are listening. Obviously the latter sounds safer, but I just don't know how it could be done. I've read about load-balanced servers, but I thought that type of server setup was done for different purposes, like when a server goes down, isn't it? Or would this be the best solution, since you take one site down at a time? I'm welcome to any ideas.
I used to stress about minimal-impact releases too, but now we just take the site down. The reality is twofold:
You cannot guarantee that everything someone is working on right now is NOT something you're about to update. Consider this: A user is working on x.aspx and is in the middle of a postback. You drop a new x.aspx.
With enough notice, maintenance windows are a way of life. Users should expect that, from time to time, you need exclusive access to the application to make updates, etc.
It's just too hard to keep all the plates in the air when you really don't know what someone might be working on while you deploy. Especially if database updates are in the mix!
If there is a load balancer in the mix: you would remove the server(s) with the old code and add the server(s) with the new code. This lets the traffic die out on the old-code servers without kicking people out. The new server(s) pick up the traffic.
We do this with a new release to one server at a time until we have replaced all of the code in our server farm. It gives the application time to bake in the real world. If issues come up, and they have, you only have to revert a single server. Using the load balancer makes it easier.
It is (usually) a seamless transition. Of course, you need to make sure your app can handle any database changes, etc.
I am writing a web application in ASP.NET 3.5 that takes care of some basic data entry scenarios. There is also a component to the application that needs to continuously poll some data and perform actions based on business logic.
What is the best way to implement the "polling" component? It needs to run and check the data every couple of minutes or so.
I have seen a couple of different options in the past:
The web application starts a background thread that will always run while the web application does. (The implementation I saw started the thread in the Application_Start event.)
Create a windows service that is always running
What are the benefits to either of these options? Are there additional options?
I am leaning toward a Windows service because it is separated and can run on a different server (more scalable), and there is more control over when it is started/stopped, etc. However, I feel like the compactness of having the "background" logic running in the web application's process might make the entire solution more understandable.
I'd go for the separate Windows service primarily for the reasons you give:
You can run it on a different server if necessary.
You can start and stop it independently of the web site.
I'd also add that it could well have some impact on the performance of the web site itself - something you want to avoid.
The buzz-word here is "separation of concerns". The web site is concerned with presenting the data to the user, the service with checking the integrity of the data.
You can also update the web site and service independently of each other should you need to.
I was going to suggest that you look at a scheduled task and let Windows control when the process runs, but I re-read your question and noted that you wanted the checks to run every couple of minutes. The overhead of starting the process might be too great in this case - though some experimentation would probably prove this one way or the other.
If you use a scheduled task there's also the possibility that you could start the next check before the current one has finished - something you can code for if you're in complete control.
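For what it's worth, a minimal polling service along these lines is not much code. This is only a sketch (installer, logging and error handling omitted; the service name and interval are placeholders), and the busy flag is one way to avoid the overlapping-checks problem mentioned above:

    using System.ServiceProcess;
    using System.Timers;

    public class PollingService : ServiceBase
    {
        private readonly Timer _timer = new Timer(2 * 60 * 1000); // every couple of minutes
        private volatile bool _busy;

        public PollingService()
        {
            ServiceName = "DataPollingService"; // placeholder name
            _timer.Elapsed += (s, e) => Poll();
        }

        protected override void OnStart(string[] args) { _timer.Start(); }
        protected override void OnStop() { _timer.Stop(); }

        private void Poll()
        {
            if (_busy) return; // skip this tick if the previous check is still running
            _busy = true;
            try
            {
                // Poll the data and apply the business logic here.
            }
            finally
            {
                _busy = false;
            }
        }

        public static void Main()
        {
            ServiceBase.Run(new PollingService());
        }
    }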
Why not just use a console app that has no UI? It can do all that the Windows service can and is much easier to debug and maintain. I would not do a Windows service unless you absolutely have to.
You might find that the SQL Server job scheduler is sufficient for what you want.
A console application does not do well in this case. I wrote a TAPI application that had to stay in the background and intercept incoming calls, but it only worked once, because the TAPI manager got GCed and was never available for the second incoming call.
I want to add a scheduled task to a client's ASP.NET app. These posts cover the idea well:
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
What is the Best Practice to Kick-off Maintenance Process on ASP.NET
"Out of Band" Processing Techiniques for asp.net applications
My question has two parts: First, will IIS unload the application if there isn't enough request activity despite the Cache activity? My client doesn't enjoy as much traffic as stackoverflow so they can't rely on user requests to keep the app 'active'. Obviously, I can't schedule tasks in an unloaded app.
Second, if so, is there a way to prevent IIS from unloading the app outside of configuration or external 'stay-alive' requests? My client's host doesn't allow much configuration tweaking and a stay-alive utility introduces the deployment complexity I'm trying to avoid with an ASP.NET Cache solution.
Thanks a bunch.
Edit/Conclusion: TheXenocide's solution is exactly correct given the question. However, I've decided it is a really bad question. The temptation to cut corners is always looming. I've regained my senses and told my client to use a website monitoring tool to keep the site active. In addition, the scheduled task is going in a windows service despite the extra deployment hassle.
Unfortunately, outside of changing the timeout configuration (which I believe is possible in Web.config, though I don't know what is and isn't allowed on hosting providers, most of which use Medium Trust), I don't believe there is any other way to keep the application from ending apart from web requests. One thing you might try, which may be a little simpler than running a keep-alive service on a local machine, is to add some logic to Session_Start/Session_End that ensures there is always at least one session active; you can use the WebRequest class from within your application to call your own site, and it should still start a new session.
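A rough sketch of that idea in Global.asax, with a placeholder URL (and remember Session_End only fires with in-process session state):

    using System;
    using System.Net;

    public class Global : System.Web.HttpApplication
    {
        protected void Session_End(object sender, EventArgs e)
        {
            // When a session expires, request our own site so a fresh session
            // starts and the application stays loaded. (Placeholder URL.)
            try
            {
                var request = WebRequest.Create("http://www.example.com/keepalive.aspx");
                using (request.GetResponse()) { }
            }
            catch (WebException)
            {
                // Best effort only; swallow the failure so session cleanup stays clean.
            }
        }
    }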
Good luck, and let us know what you do :)
UPDATE: these details now very much depend on which version of IIS and which version of .NET you're running. Newer versions of each have ways of configuring "always running" applications (for example, the IIS Application Initialization module with startMode="AlwaysRunning").