I am developing a site in ASP.net (C#) and have the following requirement:
The site should fetch data from an RSS feed every night, perform some calculations and update the DB with the calculated values. How can I achieve this in a shared hosting environment?
The answer that I usually get is to have a Windows Service that does this, but I cannot use that approach, as I am not allowed to run Windows Services in my shared hosting environment.
The other alternative that I found was to use the HttpRuntime.Cache as described in https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
However, that approach seems to have a lot of cons. Is there any other approach?
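To make the cache option concrete, a minimal sketch of the technique that post describes would look something like the following (the key name, interval, and helper method are placeholders of mine, not from the post):

    using System;
    using System.Web;
    using System.Web.Caching;

    // A dummy item is inserted into HttpRuntime.Cache with a removal callback;
    // when the item expires, the callback does the work and re-inserts the item
    // so the cycle repeats.
    public static class NightlyTask
    {
        private const string CacheKey = "NightlyTaskTrigger"; // placeholder key name

        public static void Register()
        {
            HttpRuntime.Cache.Insert(
                CacheKey,
                DateTime.UtcNow,
                null,
                DateTime.UtcNow.AddHours(24),       // roughly once a day
                Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable,
                OnRemoved);
        }

        private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
        {
            FetchFeedAndUpdateDatabase(); // placeholder for the RSS fetch + calculations
            Register();                   // re-register so it fires again tomorrow
        }

        private static void FetchFeedAndUpdateDatabase()
        {
            // placeholder: read the RSS feed, run the calculations, update the DB
        }
    }

Register() would be called once from Application_Start in Global.asax. The obvious caveat is that nothing fires unless the application is actually loaded, which is part of why the approach has the cons I mentioned.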
Some (most?) shared hosting providers, at least the ones I have used, allow you to schedule tasks through their control panel. I know DiscountASP.NET does, and Rackspace as well.
If your provider does have that capability, have it call/load a certain ASP.NET web page at the designated time and do the work there. For a lot of small tasks, this will be the path of least resistance.
If it doesn't provide that capability, and you are not willing to switch providers, you can always run a scheduled task (use the built-in one in Windows Server) from a machine that is NOT at your ISP, as long as you can reach the database from it. I have used this method in the past as well. Any machine that is reliably on at the right time will do.
Windows Scheduled Tasks:
Impossible in shared hosting if you don't have any dedicated resources, and very limited even when you do. Hacks like Windows Scheduled Tasks and control panel scheduling features are not nice solutions; they are painful most of the time and have been a headache for us over the years.
Scheduling Web Service:
You can use the ATrigger scheduling service in shared hosting. A .NET library is also available to create scheduled tasks without overhead.
Disclaimer: I was part of the ATrigger team. It's freeware and I have no commercial interest in recommending it.
Related
I am looking for options to execute recurring background tasks. The background task would call an external REST GET endpoint and update the status accordingly in the application database.
Which of the following would be appropriate, considering that we do not want to maintain a separate web.config between the application and the scheduler/task app? I am looking for a simple option in the .NET/ASP.NET Web API context, not a separate installation or 3rd-party product.
Scheduled task - I believe we would need to create as many scheduled tasks on a server as there are databases to point to, so maintainability is a concern?
windows service
Asp.Net background task options
any other better option?
Please provide your insights for this question.
I highly recommend looking at Hangfire to implement background tasks.
It works better than a Windows service in a cloud environment, supports fire-and-forget as well as recurring tasks, and integration is really seamless.
I just noticed your comment about not wanting anything 3rd-party; I'm not sure if you mean a commercial component, but Hangfire is free and available via NuGet, if that helps.
see: https://www.hangfire.io
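A minimal sketch of wiring it up, assuming SQL Server storage and a made-up job method (neither detail comes from your question):

    using Hangfire;

    public class Global : System.Web.HttpApplication
    {
        private static BackgroundJobServer _jobServer;

        protected void Application_Start()
        {
            // Hangfire keeps its own tables in the database named by this connection string
            GlobalConfiguration.Configuration.UseSqlServerStorage("HangfireDb");

            // hosts the workers inside the web process, no separate installation needed
            _jobServer = new BackgroundJobServer();

            // hypothetical recurring job: call the external REST endpoint and update the DB
            RecurringJob.AddOrUpdate(
                "poll-external-status",
                () => StatusPoller.PollAndUpdate(),
                Cron.Hourly());
        }

        protected void Application_End()
        {
            if (_jobServer != null)
                _jobServer.Dispose();
        }
    }

    public static class StatusPoller
    {
        public static void PollAndUpdate()
        {
            // placeholder: call the external REST GET and update the status in the app database
        }
    }

You'd need the Hangfire and Hangfire.SqlServer NuGet packages plus a connection string; the dashboard is optional.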
I'm wondering whether the Enterprise Library Caching block using isolated storage (disk, not DB) can be accessed by multiple apps in IIS. That is, can they all share the same instance of it?
I have various WCF services running on one machine, set up in different web apps (and potentially in different app pools, if that makes a difference). They all need access to a shared cache.
I had been told that this is possible with EntLib, but after doing some reading I'm not entirely sure that's the case. All of the services are running under the NETWORK SERVICE user, but since they are all different apps in IIS, does this prevent the sharing? I know having a different user certainly would.
So, can the same user use the same cache across multiple apps, or is it limited to within one app?
Any guidance would be appreciated!
If you want to share your cache across several services, it would be better to go with AppFabric Caching. See: http://msdn.microsoft.com/en-us/windowsserver/ee695849.aspx
I ended up not using EntLib for this and just used isolated storage.
In case anybody has the same problem, you can see the following question where I posted the code I used, as well as an issue I hit while using it plus the resolution.
Can't share isolated storage file between applications in different app pools
Basically we have many servers running many ASP.NET sites in different app pools. We roll updates every 2 weeks. My basic question comes down to this:
Is using the GAC codebase feature, with a URL pointing to an independent server where we maintain the latest versions of the class libraries, a good approach to simplifying updates of all of these sites on all of these servers?
Are there any general pitfalls or potential issues that might arise with this?
I read that the download cache is used on a per-user basis. In this case, would all the sites on the server simply use the version in the ASP.NET user's download cache?
Would updates occur only when a site starts? What if one site is restarted and all the other sites are using the version in the download cache?
Is there any way to also manage the .aspx/.js/.css/.ascx files in this manner?
I wouldn't like to trust that sort of update mechanism. It would be better to write a set of scripts, or use Web Deploy to push the updates to the servers in a much more controlled manner.
I have a website that's running on a Windows server and I'd like to add some scheduled background tasks that perform various duties. For example, the client would like users to receive emails that summarize recent activity on the site.
If sending out emails was the only task that needed to be performed, I would probably just set up a scheduled task that ran a script to send out those emails. However, for this particular site, the client would like a variety of different scheduled tasks to take place, some of them always running and some of them only running if certain conditions are met. Right now, they've given me an initial set of things they'd like to see implemented, but I know that in the future there will be more.
What I am wondering is if there's a simple solution for Windows that would allow me to define the tasks that needed to be run and then have one scheduled task that ran daily and executed each of the scheduled tasks that had been defined. Is a batch file the easiest way to do this, or is there some other solution that I could use?
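To make it concrete, what I'm imagining is roughly something like this single daily runner (all the names are just placeholders):

    using System;
    using System.Collections.Generic;

    // One scheduled task runs this once a day; each job decides whether it
    // actually needs to run (always, only on certain days, only if a condition holds).
    public interface IScheduledJob
    {
        string Name { get; }
        bool ShouldRun(DateTime now);
        void Run();
    }

    public static class JobRunner
    {
        public static void Main()
        {
            var jobs = new List<IScheduledJob>
            {
                // new ActivitySummaryEmailJob(),   // hypothetical jobs registered here
                // new StaleAccountCleanupJob(),
            };

            foreach (var job in jobs)
            {
                if (!job.ShouldRun(DateTime.Now))
                    continue;

                try
                {
                    job.Run();
                }
                catch (Exception ex)
                {
                    Console.Error.WriteLine("Job {0} failed: {1}", job.Name, ex);
                }
            }
        }
    }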
To keep life simple, I would avoid building one big monolithic exe; break the work into individual tasks and have a Windows scheduled task for each one. That way you can maintain the codebase more easily and change functionality at a more granular level.
You could, later down the line, build a Windows service that dynamically loads plugins for each different task based on a schedule. This may be more reusable for future projects.
But to be honest if you're on a deadline I'd apply the KISS principle and go with a scheduled task per task.
I would go with a Windows Service right out of the gates. This is going to be the most extensible method for your requirements, creating the service isn't going to add much to your development time, and it will probably save you time not too far down the road.
We use the Windows Scheduler Service, which launches a small console application that just passes parameters to the web service.
For example, if a user has scheduled reports #388 and #88, a scheduled task is created with a command line looking like this:
c:\launcher\app.exe report:388 report:88
When the scheduler fires, this app just executes a web method on the web service, for example InternalService.SendReport(int id).
Usually you already have all the required business logic available in your web application. This approach lets you reuse it with minimal effort, so there is no need to create any complex .exe or Windows service with pluggable modules, etc.
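A rough sketch of what such a launcher app can look like (the endpoint URL below is just a placeholder; in practice it would call the InternalService.SendReport web method through a service reference):

    using System;
    using System.Net;

    class Launcher
    {
        static void Main(string[] args)
        {
            using (var client = new WebClient())
            {
                foreach (var arg in args)                          // e.g. report:388 report:88
                {
                    var parts = arg.Split(':');
                    int id;
                    if (parts.Length == 2 && parts[0] == "report" && int.TryParse(parts[1], out id))
                    {
                        // placeholder endpoint wrapping InternalService.SendReport(id)
                        client.DownloadString("http://localhost/InternalService.asmx/SendReport?id=" + id);
                    }
                }
            }
        }
    }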
The problem with doing the operations from the scheduled EXE, rather than from inside a web page, is that the operations may benefit from, or even outright require, resources that the web page would have -- IIS cache and an ORM cache are two things that come to mind. In the case of ORM, making database changes outside the web app context may even be fatal. My preference is to schedule curl.exe to request the web page from localhost.
Use the Windows Scheduled Tasks or create a Windows Service that does the scheduling itself.
Jeff has previously blogged about using the cache to perform "out of band" processing on his websites, however I was wondering what other techniques people are using to process these sorts of tasks?
Years ago, I saw Rob Howard describe a way to use an HttpModule to process tasks in the background. It doesn't seem as slick as using the Cache, but it might be better for certain circumstances.
This blog post has the details, and there are many others that capture the same information if you look around.
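The general shape of the technique is roughly this (the interval and the work method below are placeholders; see the post for the full details):

    using System;
    using System.Threading;
    using System.Web;

    // The module starts a single timer when the application loads; the timer then
    // fires the background work periodically, independent of incoming requests.
    public class BackgroundTaskModule : IHttpModule
    {
        private static Timer _timer;
        private static readonly object _sync = new object();

        public void Init(HttpApplication context)
        {
            lock (_sync)
            {
                if (_timer == null)
                {
                    _timer = new Timer(DoWork, null,
                                       TimeSpan.FromMinutes(15),    // first run
                                       TimeSpan.FromMinutes(15));   // interval
                }
            }
        }

        private static void DoWork(object state)
        {
            // placeholder for the out-of-band work (cache cleanup, emails, etc.)
        }

        public void Dispose() { }
    }

The module gets registered in web.config like any other HttpModule, and it shares the usual caveat with the Cache trick: nothing runs while the application is unloaded.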
Windows Service
You may want to look at how DotNetNuke does it. I know it is written in VB.NET, but I retrofitted the code into C#. I was perusing the source and noticed they had a feature in their admin area to set up scheduled tasks. These tasks are set up through the admin interface and stored in the database. When the site starts, through the Global.asax file, they create another thread to run this scheduling service, which then runs the scheduled tasks at their scheduled times. I can't remember the exact logic; it's been a while, but it is definitely a good resource on how other people have done out-of-band processing for ASP.NET applications. This technique still keeps the logic within the ASP.NET application, but in my opinion it runs out of band.
If it's primarily data processing tasks and you're using MSSQL, how about scheduled SSIS tasks?
Scheduled tasks using http://www.codeproject.com/KB/cs/tsnewlib.aspx or schtasks.exe.
Quartz.NET
MSMQ
SQL Server jobs
Windows service
System.Threading.Timer or System.Timers.Timer
System.ComponentModel.BackgroundWorker
Asynchronous calls and callbacks
Scheduled tasks, or cron jobs.
The problem with scheduled tasks or cron jobs is that they don't share memory space with the web server. You could set up a scheduled task that requested pages from the web server, but that might create problems with long-running tasks. It would be nice to have some low-priority threads running on the actual ASP.NET application stack to do simple utility tasks like cleaning up caches, monitoring resources, and general housekeeping.
Simple queue files along with a separate agent. For each type of out-of-band process, write a separate agent .exe that watches a directory for queue files containing whatever data is needed to perform the specified process.
This may seem dirty, but in the real world I find it gives a lot of flexibility: you aren't doing a lot of processing in the ASP.NET process space, and you could easily adapt this style to farm the processing out to cheap Linux servers running the agent process on Mono when you start needing more RAM/CPU/disk.
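A bare-bones agent can be as small as this sketch (the queue directory and the payload handling are placeholders):

    using System;
    using System.IO;
    using System.Threading;

    // The web app drops a small file into the queue directory; this separate agent
    // process picks it up and does the out-of-band work.
    class QueueFileAgent
    {
        static void Main()
        {
            const string queueDir = @"C:\queues\send-email";        // placeholder queue directory

            using (var watcher = new FileSystemWatcher(queueDir, "*.queue"))
            {
                watcher.Created += (sender, e) => ProcessQueueFile(e.FullPath);
                watcher.EnableRaisingEvents = true;

                // sweep on startup so files queued while the agent was down aren't missed
                foreach (var file in Directory.GetFiles(queueDir, "*.queue"))
                    ProcessQueueFile(file);

                Thread.Sleep(Timeout.Infinite);                      // keep the agent alive
            }
        }

        static void ProcessQueueFile(string path)
        {
            var payload = File.ReadAllText(path);                    // whatever data the task needs
            // placeholder: perform the work described by the payload
            File.Delete(path);                                       // remove the file once handled
        }
    }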
If you are most comfortable with ASP.NET pages, you can write a small app to handle your job and then "ping" the app with an outside service that monitors your web site. This will keep the app alive.