I am running the Quartz.NET server as a Windows service, as described in the documentation. I am trying to understand how I can create new jobs for Quartz to schedule without having to rebuild the Quartz.NET server application every time.
I would like to be able to add new jobs from an exe or dll (other options welcome), so that I can add jobs dynamically. From what I can tell, all jobs must be defined up front and built into the server; from there the user can pass parameters and enable triggers via an XML file. I am using MS SQL Server instead of an XML file as the persistence layer.
My use case is that I need to generate reports at particular times, but users can create new reports after my application has launched. I am using DevExpress for my reporting (not sure if this matters).
Any guidance is very much appreciated.
You should check out the work Tolis Bekiaris did on the eXpand Framework's JobScheduler. It's a module for DevExpress's XAF and Quartz.NET which should give you plenty of sample code, especially if you are already using XPO for your data.
You can get the source code here.
Alternatively, it's on GitHub.
You'll find the job scheduler code in eXpand/Xpand/Xpand.ExpressApp.Modules/JobScheduler.
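If all you need is to register jobs against a running scheduler at runtime, Quartz.NET's own API supports that directly, as long as a generic job type (for example, one that runs a report by id) is already compiled into the server. Here is a minimal sketch against the Quartz.NET 2.x fluent API; the RunReportJob class, its job-data key, and the cron expression are illustrative assumptions, not part of your project:

```csharp
using Quartz;

// A single generic job type compiled into the server; which report it runs
// is chosen by job data, so new reports don't require a rebuild.
public class RunReportJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        int reportId = context.MergedJobDataMap.GetInt("ReportId");
        // ... load the report definition from the database and render it ...
    }
}

public static class ReportScheduling
{
    // Call this when a user creates a new report (id and cron are hypothetical).
    public static void ScheduleReport(IScheduler scheduler, int reportId, string cron)
    {
        IJobDetail job = JobBuilder.Create<RunReportJob>()
            .WithIdentity("report-" + reportId, "reports")
            .UsingJobData("ReportId", reportId)
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("report-trigger-" + reportId, "reports")
            .WithCronSchedule(cron)   // e.g. "0 0 6 * * ?" = every day at 6am
            .Build();

        // With the AdoJobStore configured, this is persisted to SQL Server.
        scheduler.ScheduleJob(job, trigger);
    }
}
```

Since you already use SQL Server as the job store, anything scheduled this way survives a service restart.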
I am in the process of converting our legacy custom database deployment process, with its custom-built tools, into a full-fledged SSDT project. So far everything has gone very well. I have composite projects that can deploy a base database, as well as projects that deploy sample and test data.
The problem I am having now is finding a solution for running some sort of code that can call a web service to get an activation code and add it to the database as the final step of the process. Can anyone point me to a hook that I might be able to use?
UPDATE: To be clearer I am doing this to make it easier to maintain and deploy our sample and test data to a local machine. We can easily use Jenkins to activate the sites when they are deployed nightly to our official testing environments. I'm just hoping to be able to do this in a single step to replace the homegrown database deploy tool that we use now.
In my deployment scenario I wrapped the database deployment process in PowerShell scripts that handle the necessary prerequisites. For example:
the PowerShell script starts and stops some services
next it runs sqlpackage.exe or pre-generated SQL deployment scripts
finally the PowerShell script restarts the services
You can pass parameters from PowerShell to SQL scripts or sqlpackage.exe as sqlcmd variables. So you can call the web service first, then pass the activation code as a sqlcmd variable and use that variable in a post-deployment script.
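To sketch the same flow in C# rather than PowerShell (the service URL, dacpac name, and variable name below are placeholders): the /v: switch on sqlpackage.exe sets a sqlcmd variable that a post-deployment script can consume as $(ActivationCode).

```csharp
using System.Diagnostics;
using System.Net;

class DeployWithActivation
{
    static void Main()
    {
        // 1. Fetch the activation code from the web service (URL is hypothetical).
        string activationCode;
        using (var client = new WebClient())
        {
            activationCode = client.DownloadString("http://example.com/activation").Trim();
        }

        // 2. Hand it to sqlpackage.exe as a sqlcmd variable. The post-deployment
        //    script can then use it, e.g.:
        //    INSERT INTO dbo.Settings (Name, Value) VALUES ('ActivationCode', '$(ActivationCode)');
        var psi = new ProcessStartInfo
        {
            FileName = "sqlpackage.exe",
            Arguments = "/Action:Publish /SourceFile:MyDatabase.dacpac " +
                        "/TargetConnectionString:\"Server=.;Database=MyDb;Integrated Security=true\" " +
                        "/v:ActivationCode=" + activationCode,
            UseShellExecute = false
        };
        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
        }
    }
}
```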
Particularly if it's the final step, I'd be tempted to do this separately, using whatever tool you're using to do the deployment: PowerShell scripts, MSBuild, TFS, Jenkins, whatever. Presumably there's also a front-end of some description that gets provisioned this way?
SSDT isn't a do-everything tool; it's a set of tools for managing database changes.
I suspect if the final step were "provision a Google App Engine Instance and deploy a Python script", for example, it wouldn't appear to be a natural candidate for inclusion in an SSDT post-deploy script, and I reckon this falls into the same category.
I've got a state machine workflow implemented using Workflow Foundation 4.1. I'm using the SQL Server persistence store and the WorkflowApplication class for loading and running workflows.
I'm making changes to the state machine workflow model and am finding that existing instances break very easily. I've written code that can replay a workflow back into the correct state (basically a migration), and that works fine; however, I also need to be able to clear out the old workflow instance.
The main issue is that if the workflow is invalid, I can't even load it, so I can't terminate or cancel it either.
Is there a way to use the workflow API to remove a workflow without loading it (i.e., some command on the SqlPersistenceStore), or do I have to clean the database manually?
The SqlWorkflowInstanceStore doesn't allow you to do this directly; you will need to go into the database and delete the record there. If you are using AppFabric, there is a command to delete a workflow instance without loading it first, for exactly this purpose, and there should be a corresponding PowerShell cmdlet you can invoke from code.
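If you do end up cleaning the database manually, the core of it is deleting the instance row. This is only a hedged sketch: the schema and table name below are assumptions based on the default SqlWorkflowInstanceStore schema, which varies between versions, so verify the object names against your instance store database (and back it up) before running anything like this.

```csharp
using System;
using System.Data.SqlClient;

static class WorkflowCleanup
{
    // instanceId is the workflow's Guid; the table name is an assumption
    // taken from the default SqlWorkflowInstanceStore schema.
    public static void DeleteInstance(string connectionString, Guid instanceId)
    {
        const string sql =
            @"DELETE FROM [System.Activities.DurableInstancing].[InstancesTable]
              WHERE InstanceId = @InstanceId";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@InstanceId", instanceId);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```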
How can I automate the web application build process, including the following steps:
Change connection string.
Recreate database by scripts.
Deploy web-site by ftp.
Copy some files to server in addition to application.
And maybe perform some initialization operations.
Should I write a script or program, use Visual Studio, or use some other tool?
Personally I use a Continuous Integration tool to do this kind of work.
The one I mainly use is Team City by JetBrains.
This kind of software can watch your source control repo for new check-ins, perform builds, publish builds to servers, and run pre/post-build events.
You should start learning MSBuild. It is VERY simple and straightforward, so just start and you'll see ;)
In addition to its built-in features it has the MSBuild Community Tasks pack with many useful additions, so you will be able to:
Replace the connection string in a config file using a regex, or replace the whole config with one containing a predefined connection string (FileUpdate or Copy task)
Execute database scripts (MSBuild.Community.Tasks.SqlServer.ExecuteDDL)
Deploy site using Copy task
And many others. A sketch of what such a deployment script might look like follows this list.
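For example, a deployment target along these lines; the paths, regex, and connection strings are placeholders, and the Community Tasks import path may differ on your machine:

```xml
<!-- Deploy.proj: a sketch only; paths and connection strings are placeholders -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Deploy">
  <Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

  <ItemGroup>
    <SiteFiles Include="Site\**\*.*" />
  </ItemGroup>

  <Target Name="Deploy">
    <!-- 1. Swap the connection string in the config -->
    <FileUpdate Files="Site\web.config"
                Regex="connectionString=&quot;[^&quot;]*&quot;"
                ReplacementText="connectionString=&quot;Server=prod;Database=MyDb;Integrated Security=true&quot;" />

    <!-- 2. Recreate the database from scripts -->
    <ExecuteDDL ConnectionString="Server=prod;Integrated Security=true"
                Files="Scripts\recreate-database.sql" />

    <!-- 3. Copy the site and any extra files to the server -->
    <Copy SourceFiles="@(SiteFiles)"
          DestinationFolder="\\server\wwwroot\%(RecursiveDir)" />
  </Target>
</Project>
```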
You can run pre- and post-build events in Visual Studio. To do this, right-click on the project and, in the project properties, navigate to the 'Build Events' options. Here you can specify the pre- and post-build events (you can also specify when the event runs: on a successful build or otherwise).
Once the project has been built successfully, the post-build event will perform the tasks specified. You can detail the steps either in a separate script file or in the Visual Studio project's build events itself.
More information
Pre/Post Build event command line arguments
How to: Specify Build Events (C#)
Much along the lines of the continuous integration concept Jamie mentions, we use BuildMaster internally for all of our applications, since we develop it :)
Now that we have a version offered for free, I'll share some thoughts on each of your bullet points:
Change connection string
This is something that is handled uniquely by the tool. Each environment gets its own "instance" of a configuration file, and in a deployment plan you can use the "deploy configuration files" action to put it in any environment. This means there are no transforms to worry about, since the config file is stored and versioned within the tool.
Recreate database by scripts
This is another major feature we have. Object code (stored procs, views, etc.) can be run every time with a DROP/CREATE combo, but changes such as adding indexes or dropping columns can only be done once (you can't bring a column's data back without a restore!).
BuildMaster handles these types of change scripts differently - they can only be run at most once against an environment's instance of your database. This makes it super easy to bring any new or existing initialized database schema up-to-date.
Deploy web-site by FTP
Just add an action to your deployment plan; when you click Create Build or Promote Build, it will do that.
Copy some files to server in addition to application
If the process is repeatable you can do this easily; if need be, use a manual action that will remind you to do it.
And may be perform some initialize operations
This sounds like a "change control" to me, a one-time change when you release. We support these as well but not in the free version unfortunately.
I am trying to create a setup procedure which installs my entire web application. I am using Visual Studio's Setup and Deployment Project. So far I've gotten it to deploy my website to the Inetpub folder, and I've also added some custom actions which allow it to run some SQL and setup my database.
The last thing I have to integrate into the setup process is my two SSIS packages. Not only do these need to be installed, they also need to be scheduled to run nightly.
The packages are simple, and don't reference anything unusual. They are just 2 .dtsx files.
So far, I've seen that I can use something called dtutil in order to create a dtsinstall.exe file which can be run to install the SSIS packages to either the file system or the database.
First of all, is this the easiest way to do it? And secondly, how would I go about scheduling the packages to be run nightly?
One caveat is that I need this to install silently, without prompting the user for any input.
First, create a new job in SQL Agent. In the steps panel, click New, give it a name, and for the job type, select SQL Server Integration Services Package. For the Package source, select File system, and point it to where you want the file to live.
Then you can select Schedules from the left panel, and configure how often you want it to run.
After that, you should be able to handle deployment of the SSIS package by copying the .dtsx file to the location you specified when you created the job.
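Since the setup has to run silently, the job creation itself can also be done in code rather than through the SSMS dialogs. Here is a sketch using the SQL Server Management Objects (SMO) agent classes; the server name, package path, job name, and schedule are placeholders:

```csharp
using System;
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Smo.Agent;

static class SsisJobInstaller
{
    public static void CreateNightlyJob()
    {
        var server = new Server("localhost");   // target instance

        var job = new Job(server.JobServer, "Nightly SSIS Packages");
        job.Create();

        // One step that runs the package from the file system;
        // the Command string takes dtexec-style arguments.
        var step = new JobStep(job, "Run package")
        {
            SubSystem = AgentSubSystem.Ssis,
            Command = "/FILE \"C:\\Packages\\MyPackage.dtsx\""
        };
        step.Create();

        // Every night at 2:00 AM.
        var schedule = new JobSchedule(job, "Nightly")
        {
            FrequencyTypes = FrequencyTypes.Daily,
            FrequencyInterval = 1,
            ActiveStartTimeOfDay = new TimeSpan(2, 0, 0)
        };
        schedule.Create();

        job.ApplyToTargetServer(server.Name);   // required before the job can run
    }
}
```

A custom action in your setup project could call something like this after copying the .dtsx file into place, so the user is never prompted.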
The site I'm working on is running Windows Server 2003, SQL Server 8.0 (i.e., SQL Server 2000), and ASP.NET 3.5.
I need to have some sort of script or application run to import data from an FTP'd text file, into the database. There is already a site running on the machine, that uses the current database. Can I use a scheduled task to reliably kick off some sort of .aspx page that will import the data? Or is there a better approach?
What about making sure that no one else can access the page that runs the import? I don't want random users running the import!
Thanks in advance!
P.S. Some processing needs to occur on the data before it's inserted (lookups, conditionals, etc.), so the DB tools aren't robust enough, I think. I hate DTS, and SSIS is not available in this version, I think.
If you want a C# app to handle your import, I would suggest a Windows application (exe) without a form (better than a console app because it does not pop up any UI whenever it runs). Have a scheduled task run it every so often (e.g., every minute).
Why would you use ASP.NET? Depending on the complexity of the job you could either load it directly to the database (BULK LOAD) or use DTS (SQL Server 2000) or SSIS (SQL Server 2005/2008) if more complex processing is needed.
DTS and stored procedures in a job.
BCP and stored procedures in a job.
You say you need to do a lot of lookups and conversions? SQL is good at that, and good at doing it fast. It can seem a little intimidating at first, but it's not hard.
Run a BULK INSERT or bcp to import the data instead; see http://msdn.microsoft.com/en-us/library/aa173839(SQL.80).aspx
I'll echo other people here: you don't want a scheduled task hitting a web page. SQL Server provides some good data import options, or you could just write a simple Windows program and run it as a scheduled task.
Another option would be to write a Windows service that watches your FTP directory and does the import.
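A bare-bones sketch of that watcher idea, shown as a console app for brevity; the drop directory and import logic are placeholders, and in a real Windows service the same code would go in OnStart:

```csharp
using System;
using System.IO;

class ImportWatcher
{
    static void Main()
    {
        // Watch the FTP drop directory for newly uploaded text files.
        var watcher = new FileSystemWatcher(@"C:\ftproot\drop", "*.txt");
        watcher.Created += (sender, e) =>
        {
            // NOTE: the file may still be mid-upload when Created fires;
            // production code should retry until it can open the file exclusively.
            ImportFile(e.FullPath);
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for files. Press Enter to stop.");
        Console.ReadLine();
    }

    static void ImportFile(string path)
    {
        foreach (string line in File.ReadAllLines(path))
        {
            // ... lookups, conditionals, then insert via SqlCommand ...
        }
    }
}
```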
As others have said, a separate console application (triggered by a scheduled task) or a Windows service is probably the best option for this scenario.
On the other hand, if you already have all the required functionality available in the web app running on the server, then you could set up a scheduled task that starts a script (VBScript, JScript), which in turn calls a page of the web app.
To add some security (e.g., to prevent any user from calling that page), you could add code to the page that checks whether it was called via http://localhost. This would at least prevent the page from being called from a remote client.
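In code, that check is only a few lines at the top of the import page (RunImport is a hypothetical method holding your import logic):

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // Reject anything that didn't originate on the server itself;
    // Request.IsLocal is true only for loopback (same-machine) requests.
    if (!Request.IsLocal)
    {
        Response.StatusCode = 403;
        Response.End();
        return;
    }

    RunImport();  // hypothetical method containing the import logic
}
```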