I am planning a project to schedule scripts on multiple Windows and Linux servers. I'm leaning towards building this from scratch because I have requirements that existing software doesn't seem to meet (such as running tasks on the completion or failure of other tasks, and being able to schedule at non-standard intervals).
I was thinking about having a web interface which will allow users to add/modify/delete schedules for each machine to a database.
A Windows service will then check the database for any jobs that need to run at that point and connect over SSH for Linux or PowerShell for Windows. All the scripts will write their progress back to the database so that it can be checked by the user.
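For what it's worth, the polling side of that service isn't much code. Below is a rough sketch assuming the SSH.NET (Renci.SshNet) library for the Linux side and a hypothetical Jobs table; all names and the schema are illustrative only, and the hard parts (error handling, retries, the "run on completion/failure" dependency logic) are omitted.

// Rough sketch of the polling loop; Jobs table and column names are hypothetical.
using System;
using System.Data.SqlClient;
using Renci.SshNet;

class JobRunner
{
    static void RunDueLinuxJobs(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var cmd = new SqlCommand(
                "SELECT Id, Host, Script FROM Jobs WHERE NextRun <= GETDATE() AND TargetOs = 'Linux'", conn);
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    using (var ssh = new SshClient((string)reader["Host"], "jobuser", "password"))
                    {
                        ssh.Connect();
                        var result = ssh.RunCommand((string)reader["Script"]);
                        // Write the exit status back so the web UI can show progress/failures
                        // (the UPDATE of the Jobs table is omitted for brevity).
                        Console.WriteLine("Job {0} exited with {1}", reader["Id"], result.ExitStatus);
                    }
                }
            }
        }
    }
}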
Basically I just wanted some advice from people who know better ways, or things I may need to look out for that could cause problems, because I don't have much experience.
Thanks.
Oracle Scheduler has all the options you are looking for and probably more. See Overview of Oracle Scheduler for some general info. It comes down to having a central scheduler database that submits jobs to remote job agents, which do the work pretty much independently of the central scheduler repository. They report back status etc. when the repository is accessible after a job has finished.
It's a very powerful tool and it takes a lot of complex work off your hands by giving you a framework that you can start using right out of the box.
Related
In an ASP.NET MVC4 project, I want to update some data in SQL Server (2012) at 00:00 every day.
I can see three choices:
1. Write a Windows service that runs on the server and executes the database update.
2. Write a SQL Server stored procedure that executes at 00:00 every day.
3. Use third-party tools, like Quartz.Net or Fluent.
Which one is the best choice? Why?
If you're comfortable with T-SQL and the task can be completed entirely within the database, then it makes sense to implement it as a stored proc and schedule it using SQL Server Agent. This reduces the number of external dependencies, as everything happens inside SQL Server, and there are fewer points of failure (e.g. it will still run if IIS or your web solution is down).
If your update needs to interact with other resources, such as importing a text file from a known location, etc., then you might also consider implementing it in SSIS (SQL Server Integration Services). This has the advantage of still being SQL hosted, while giving you access to additional functionality not easily achieved with a stored proc.
I would only implement it as a batch process in .NET if I felt that the functionality required would be difficult or awkward to implement in a proc or SSIS package. This is especially relevant since SQL Server 2012 allows you to build an SSIS package using .NET code, but it "lives" in the SSIS package that is registered on the SQL Server.
I wouldn't implement it within your ASP.NET solution at all unless it actually needs a web based user interface for some reason. The fact that you want to execute this process at precisely the same time every night tells me that this does not require human interaction.
Where a process can be fully automated, avoid putting a user interface or anything which can potentially hang or become a single point of failure while waiting for some kind of UI input. Remember, a presentation layer is for human interaction - consider if you need one. Better to implement it as a batch of some sort, which also makes it easier to execute via automation. There are exceptions to this rule, e.g. if you wanted to implement your update as a web API with a REST interface or the like, but as a general rule it holds true.
As a side note, if your aim is to run your process overnight, consider scheduling it in the early hours of the morning (between 3 and 4 am) rather than at midnight. That is generally when most people are asleep, so your update is least likely to impact the availability of your app and its database, and if your update is long running it is less likely to hit an edge case like a daylight savings change or a conflict with other overnight processes.
Check out Hangfire. Scheduled tasks in ASP.NET. Super easy and reliable.
http://hangfire.io/
We just started using it and I like it.
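To give a feel for it, a minimal setup for a nightly job looks roughly like the sketch below. This assumes the OWIN-based setup and the Hangfire.SqlServer storage package; NightlyUpdate and the connection string are placeholders for your own code.

// Minimal Hangfire sketch: jobs are persisted in SQL Server and run inside the web app.
using Hangfire;
using Owin;

public static class NightlyUpdate
{
    public static void Run()
    {
        // TODO: perform the nightly database update here.
    }
}

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Job state is stored in SQL Server, so schedules survive app restarts.
        GlobalConfiguration.Configuration.UseSqlServerStorage("YourConnectionString");

        app.UseHangfireDashboard();  // optional monitoring UI at /hangfire
        app.UseHangfireServer();     // processes jobs inside the web application

        // Register a recurring job that runs once a day (midnight UTC by default).
        RecurringJob.AddOrUpdate("nightly-update", () => NightlyUpdate.Run(), Cron.Daily());
    }
}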
I have used all three approaches you describe; all have their pros and cons. But I suggest you use Quartz.Net. It is very easy to implement and very easy to use.
You can see here a wonderful article by Mike on scheduled tasks in ASP.NET.
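For illustration, a minimal Quartz.Net setup for a job that runs at 00:00 every day looks roughly like this (2.x-style API; UpdateJob is a placeholder for your own update logic):

using Quartz;
using Quartz.Impl;

public class UpdateJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // TODO: perform the database update here.
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        // Create and start the scheduler (typically called from Application_Start).
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<UpdateJob>().Build();
        ITrigger trigger = TriggerBuilder.Create()
            .WithCronSchedule("0 0 0 * * ?")  // seconds minutes hours ... => 00:00 every day
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}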
I am developing a site in ASP.net (C#) and have the following requirement:
The site should fetch data from an RSS feed every night, perform some calculations and update the DB with the calculated values. How can I achieve this in a shared hosting environment?
The answer that I usually get is to have a Windows Service that does this but I cannot use this as I am not allowed to run Windows Services in my shared hosting environment.
The other alternative that I found was to use the HttpRuntime.Cache as described in https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
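Roughly, that approach boils down to the sketch below (the pattern described in the linked post, not a recommendation): a dummy item is put into HttpRuntime.Cache with an absolute expiration, and the work happens in the removal callback, which then re-adds the item for the next run.

using System;
using System.Web;
using System.Web.Caching;

public class Global : HttpApplication
{
    private const string TaskKey = "NightlyRssTask";
    private static CacheItemRemovedCallback _onRemove;

    protected void Application_Start(object sender, EventArgs e)
    {
        ScheduleTask(3600);  // run again after this many seconds
    }

    private static void ScheduleTask(int seconds)
    {
        _onRemove = CacheItemRemoved;
        HttpRuntime.Cache.Insert(TaskKey, seconds, null,
            DateTime.Now.AddSeconds(seconds), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, _onRemove);
    }

    private static void CacheItemRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // Fetch the RSS feed, do the calculations and update the DB here...
        ScheduleTask((int)value);  // ...then schedule the next run.
    }
}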
However that approach seems to have a lot of cons. Is there any other approach?
Some (most?) shared hosting providers, at least the ones I have used, allow you to schedule tasks through their control panel. I know discountasp.net does, and Rackspace as well.
If your provider does have that capability, have it call/load a certain ASP.NET web page at the designated time and do the work there. For a lot of small tasks, this will be the path of least resistance.
If it doesn't provide that capability, and you are not willing to switch providers, you can always run a scheduled task (use the built-in Task Scheduler in Windows Server) from a machine that is NOT at your ISP, as long as you can reach the database from it. I have used this method as well in the past. Any machine that is reliably on at the right time will do.
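For example, a minimal console app you could run from Task Scheduler on such a machine might look roughly like this; the feed URL, connection string, table and SQL are placeholders for your own.

using System;
using System.Data.SqlClient;
using System.Net;

class NightlyRssJob
{
    static void Main()
    {
        // Download the feed (URL is a placeholder).
        string rss = new WebClient().DownloadString("http://example.com/feed.rss");

        // ... parse the feed and perform your calculations here ...
        int calculatedValue = rss.Length;  // placeholder calculation

        using (var conn = new SqlConnection("Server=...;Database=...;User Id=...;Password=...;"))
        {
            conn.Open();
            var cmd = new SqlCommand("UPDATE Stats SET Value = @v WHERE Name = 'RssSummary'", conn);
            cmd.Parameters.AddWithValue("@v", calculatedValue);
            cmd.ExecuteNonQuery();
        }
    }
}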
Windows Scheduled Tasks:
Impossible in shared hosting if you don't have any dedicated resources. Very limited.
Using hacks like Windows Scheduled Tasks and control panel features is not a nice solution. They suck most of the time and have been a headache for us over the years.
Scheduling Web Service:
You can use the ATrigger scheduling service in shared hosting. A .Net library is also available to create scheduled tasks without overhead.
Disclaimer: I was on the ATrigger team. It's freeware and I have no commercial motive.
I have an ASP.NET application that is consistently using 75% - 100% of the CPU on a production server. How can I profile the application to figure out what part of the code is using up the most CPU? I have looked at a couple of different tools (Xte Profiler, EQATEC, dotTrace), but they all seem to want you to load and run the application within their tool. It seems to me that they want you to load up the application in their tool and run tests locally (not in production). I want to profile the application while it is running in production with people hitting it to see what is actually going on. Is this possible?
I am a newbie to application profiling so forgive me if I have missed something obvious or am not thinking about this correctly.
Thanks,
Corey
Sam Saffron (one of the Stack Overflow creators) wrote a great command-line tool a while ago, but unfortunately has since abandoned it.
A friend of mine forked the code to make it work in 2015:
https://github.com/jitbit/cpu-analyzer
(the page has a link to Sam's post explaining how to use it)
The great thing about this tool (besides its "no install required" portability, command-line interface, etc.) is that APM packages like New Relic only monitor HTTP requests. If your app has some background threads, they won't help much.
You should consider taking a memory dump on the production server while it's experiencing high CPU. Check out ADPlus and take a hang dump of the ASP.NET worker process. This can then be analyzed with WinDbg or other tools.
I just went through a similar experience where our production servers were experiencing excessive CPU load - a scenario we could not recreate locally or in test/staging environments. It had nothing to do with the database (database CPU was normal). Analyzing the dump file is what clued us in on what was causing the problem (excessive compilation of regex objects by some library we were using).
This answer would be incomplete without Tess' blog, so here's the link.
My guess is that it has to do with long-running database queries rather than the ASP.NET application itself. In my experience, 9 times out of 10 this is what I see, and it brings the application server down to a crawl as resources are consumed and the app has to wait for each query to finish before moving on. Take a look at SQL Profiler on the DB server and see if there are any queries that are taking a long time to execute.
It could be as simple as adding an index to a column or some other minor optimization. Once you know the query, you can then also go back to your code and tweak that section as well.
For those who stumble upon this question still, it really depends on what you are trying to accomplish.
If a server is running that high on CPU, odds are a standard profiler will bring it to a grinding halt due to its additional overhead.
There are actually three different types of profilers. Standard profilers, lightweight transaction profilers, and APM tools. You can read more about this in my blog post that discusses all 3:
.NET Profilers: 3 types and why you need all of them
It's certainly possible to profile ASP.NET with the EQATEC Profiler. See:
Profiling ASP.NET websites with EQATEC Profiler
EQATEC Profiler instruments your app in a separate step that enables the app itself to collect its own profiling info, and the profiler then merely displays that timing data afterwards.
That means that you can run your instrumented ASP.NET app completely independent of the profiler itself.
You could e.g. instrument your app, mail it to your test site in India, have them run it on their server for a few days where it will generate timing reports all on its own, and have them mail those reports back to you, which you can then view in the profiler. Pretty neat.
Note: To have the profiled app generate timing snapshots "on its own" it must know when to generate them. By default this happens when the method Application_End is called in an ASP.NET app. You can programmatically dump snapshots when it suits you by using the EQATEC Profiler API. See the user guide or check out this thread.
You can read about this on Microsoft Developer Network.
You can select documentation according to the version of your Visual Studio. You should verify that profiling functionality is provided for your edition of Visual Studio.
How to: Profile a Web Site or Web Application Using the Performance Wizard
Your best bet is to profile your code on your own machine to identify where it is spending time.
Grab a ten day free trial of this:
http://www.jetbrains.com/profiler/
Here are some links to get you going:
Link
http://msdn.microsoft.com/en-us/library/ms178643(v=VS.100).aspx
http://www.codeproject.com/KB/aspnet/10ASPNetPerformance.aspx
I have a website that's running on a Windows server and I'd like to add some scheduled background tasks that perform various duties. For example, the client would like users to receive emails that summarize recent activity on the site.
If sending out emails was the only task that needed to be performed, I would probably just set up a scheduled task that ran a script to send out those emails. However, for this particular site, the client would like a variety of different scheduled tasks to take place, some of them always running and some of them only running if certain conditions are met. Right now, they've given me an initial set of things they'd like to see implemented, but I know that in the future there will be more.
What I am wondering is if there's a simple solution for Windows that would allow me to define the tasks that needed to be run and then have one scheduled task that ran daily and executed each of the scheduled tasks that had been defined. Is a batch file the easiest way to do this, or is there some other solution that I could use?
To keep life simple, I would avoid building one big monolithic exe and instead break the work into individual tasks, with a Windows scheduled task for each one. That way you can maintain the codebase more easily and change functionality at a more granular level.
You could, later down the line, build a windows service that dynamically loads plugins for each different task based on a schedule. This may be more re-usable for future projects.
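A rough sketch of that plugin idea, under the assumption that each task implements a shared interface and the service discovers them from a plugins folder (all names here are illustrative):

using System;
using System.IO;
using System.Linq;
using System.Reflection;

public interface IScheduledTask
{
    string Name { get; }
    bool ShouldRun(DateTime now);   // each task owns its own schedule/conditions
    void Run();
}

public static class TaskLoader
{
    public static void RunDueTasks(string pluginFolder)
    {
        // Discover task implementations in every assembly dropped into the plugins folder.
        var tasks = Directory.GetFiles(pluginFolder, "*.dll")
            .SelectMany(file => Assembly.LoadFrom(file).GetTypes())
            .Where(t => typeof(IScheduledTask).IsAssignableFrom(t) && !t.IsAbstract && !t.IsInterface)
            .Select(t => (IScheduledTask)Activator.CreateInstance(t));

        foreach (var task in tasks.Where(t => t.ShouldRun(DateTime.Now)))
        {
            task.Run();
        }
    }
}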
But to be honest if you're on a deadline I'd apply the KISS principle and go with a scheduled task per task.
I would go with a Windows Service right out of the gates. This is going to be the most extensible method for your requirements, creating the service isn't going to add much to your development time, and it will probably save you time not too far down the road.
We use the Windows scheduler service, which launches a small console application that just passes parameters to the web service.
For example, if a user has scheduled reports #388 and #88, a scheduled task is created with a command line looking like this:
c:\launcher\app.exe report:388 report:88
When the scheduler fires, this app just executes a web method on the web service, for example InternalService.SendReport(int id).
Usually you already have all the required business logic available in your web application. This approach lets you reuse it with minimal effort, so there is no need to create any complex .exe or Windows service with pluggable modules, etc.
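A sketch of such a launcher is below. It parses the "report:NNN" arguments and calls the web service; InternalService stands for a generated web-service proxy, so adjust the names to your own.

using System;

class Launcher
{
    static void Main(string[] args)
    {
        var service = new InternalService();   // generated proxy for the web service

        foreach (string arg in args)
        {
            if (arg.StartsWith("report:"))
            {
                int id = int.Parse(arg.Substring("report:".Length));
                service.SendReport(id);        // all business logic stays in the web app
            }
        }
    }
}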
The problem with doing the operations from the scheduled EXE, rather than from inside a web page, is that the operations may benefit from, or even outright require, resources that the web page would have -- IIS cache and an ORM cache are two things that come to mind. In the case of ORM, making database changes outside the web app context may even be fatal. My preference is to schedule curl.exe to request the web page from localhost.
Use the Windows Scheduled Tasks or create a Windows Service that does the scheduling itself.
Jeff has previously blogged about using the cache to perform "out of band" processing on his websites, however I was wondering what other techniques people are using to process these sorts of tasks?
Years ago, I saw Rob Howard describe a way to use an HttpModule to process tasks in the background. It doesn't seem as slick as using the Cache, but it might be better for certain circumstances.
This blog post has the details, and there are many others that capture the same information if you look around.
Windows Service
You may want to look at how DotNetNuke does it. I know it is written in VB.NET, but I retrofitted the code into C#. I was perusing the source and noticed they had a feature in their admin area to set up scheduled tasks. These tasks get set up through the admin interface and stored in the database. When the site starts, through the Global.asax file, they create another thread to run a service that then runs the scheduled tasks at their scheduled times. I can't remember the exact logic, it's been a while, but it is definitely a good resource on how other people have done out-of-band processing in ASP.NET applications. This technique still keeps the logic within the ASP.NET application, but it runs out of band, in my opinion.
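A rough sketch of that general pattern (not DotNetNuke's actual code): Global.asax starts a timer on a background thread that periodically checks the database for tasks that are due.

using System;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    private static Timer _schedulerTimer;

    protected void Application_Start(object sender, EventArgs e)
    {
        // Check for due tasks once a minute; this runs on a thread-pool thread, out of band
        // from request processing, but still inside the web application's process.
        _schedulerTimer = new Timer(_ => RunDueTasks(), null,
            TimeSpan.Zero, TimeSpan.FromMinutes(1));
    }

    private static void RunDueTasks()
    {
        // Load the task definitions from the database and run the ones that are due.
    }
}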
If it's primarily data-processing tasks and you're using MSSQL, how about scheduled SSIS tasks?
Scheduled tasks using http://www.codeproject.com/KB/cs/tsnewlib.aspx or schtasks.exe.
Quartz.NET
MSMQ
SQL Server jobs
Windows service
System.Threading.Timer or System.Timers.Timer
System.ComponentModel.BackgroundWorker
Asynchronous calls and callbacks
Scheduled tasks, or cron jobs.
The problem with scheduled tasks or cron jobs is that they don't share memory space with the web server. You could set up a scheduled task that requested pages from the web server, but that might create problems with long running tasks. It would be nice to have some low priority threads running on the actual ASP.Net application stack to do simple utility tasks like cleaning up caches, monitoring resources, and just to deal with general housekeeping.
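For illustration, such a low-priority housekeeping thread could be started from Application_Start with something like the sketch below; the work inside the loop is a placeholder.

using System;
using System.Threading;

public static class Housekeeping
{
    public static void Start()
    {
        var worker = new Thread(() =>
        {
            while (true)
            {
                // e.g. trim caches, log resource usage, delete stale temp files...
                Thread.Sleep(TimeSpan.FromMinutes(5));
            }
        });
        worker.IsBackground = true;              // don't keep the app domain alive
        worker.Priority = ThreadPriority.Lowest; // stay out of the way of requests
        worker.Start();
    }
}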
Simple queue files along with a separate agent. For each type of out of band process write a separate agent .exe which watches a directory for queue files that include whatever data is needed to perform the specified process.
This may seem dirty, but in the real world I find it gives a lot of flexibility: you aren't doing a lot of processing in the ASP.NET process space, and you could easily adapt this style to farm processing out to cheap Linux servers running the agent process on Mono when you start needing more RAM/CPU/disk.
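A minimal sketch of such an agent: it watches a drop folder for queue files and processes each one. The folder, filter and file format are whatever you define, and a real agent would handle files that are still being written, retries, and a "done"/"failed" folder.

using System;
using System.IO;

class QueueFileAgent
{
    static void Main()
    {
        const string queueDir = @"C:\queues\send-email";

        var watcher = new FileSystemWatcher(queueDir, "*.job");
        watcher.Created += (sender, e) =>
        {
            string payload = File.ReadAllText(e.FullPath);
            // ... perform the out-of-band work described by the payload ...
            File.Delete(e.FullPath);
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching {0}. Press Enter to exit.", queueDir);
        Console.ReadLine();
    }
}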
If you are most comfortable with ASP.NET pages, you can write a small app to handle your job and then "ping" the app with an outside service that monitors your web site. This will keep the app alive.