We have a situation where there are two XAML workflows (WF1 and WF2). WF1 is a superset of WF2. So while WF2 executes independently of WF1 within the system, we want WF2 to execute within the scope of WF1 when WF1 runs. Basically, towards the end of WF1 there is a bookmark, and on resumption of that bookmark WF2 has to execute.
They both operate on shared resources and we can not have different instances of these WFs running at the same time on our shared resources. We have a queuing mechanism in place that takes care of this.
What I would like to do is to somehow have this execution built into WF1 at design time. So maybe code an activity that loads up WF2, or convert WF2 to a coded activity somehow and drop it in WF1, etc. What I don't want is to have to pretty much copy WF2 and drop it into the WF1 designer. Also, we don't want a separate host process to be started from within WF1 to execute WF2.
Basically, in our case WF1 and WF2 are both run under WorkflowServiceHost. They are not traditional service WFs (so no Send/Receive activities) but are normal WFs exposed as WCF services. We have a good deal of customization that has gone into our WorkflowServiceHost regarding its persistence, tracking etc., which both our WFs benefit from. I would like to ensure that whatever mechanism we use to launch WF2 within WF1, we don't lose these benefits.
PS: If you would like to see how we have customized the running of these WFs, you can download sample code from my blog here.
Technically speaking, a workflow is just an activity. So if you compile the project containing WF1 and WF2 and open WF1, you will see WF2 in the activity toolbox. Just drag it onto WF1, wire up any arguments, and you are good to go.
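The same "a workflow is just an activity" fact also works in code, if you ever need to compose the two at run time rather than in the designer. A minimal sketch, assuming WF2 is available as a loose XAML file (the "WF2.xaml" path is a placeholder, not your project's actual layout):

```csharp
// Sketch: load WF2's XAML as an ordinary Activity so WF1 can schedule it
// as a child, instead of copying WF2's contents into the WF1 designer.
using System.Activities;
using System.Activities.XamlIntegration;

public static class WorkflowComposition
{
    public static Activity LoadWf2()
    {
        // ActivityXamlServices.Load turns a XAML file into a regular
        // Activity; wire up its arguments like any other child activity.
        return ActivityXamlServices.Load("WF2.xaml");
    }
}
```

Because WF2 then runs as a child activity of the same workflow instance, it stays inside the same WorkflowServiceHost, so your persistence and tracking customizations should continue to apply.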
I have web app code in VS2010 that is manually executed every night. One of the developers manually runs the code in VS: when the web page opens, he presses a few buttons and executes the code to get our required results. How can we automate this process so as to eliminate the human element? Ideally, I am looking for a way to have the code execute automatically at a given time during the day. What is involved in getting something like this to work?
A WCF service is a possible solution:
http://msdn.microsoft.com/en-us/library/ms734712.aspx
Windows has scheduled tasks which is good at... scheduling tasks.
Do you have (or plan to have) a big set of GUI tests? There are entire tools dedicated to GUI automation testing. I'd recommend looking into one of those if this is going to be a big part of your overall test strategy.
If this is a one-time thing, you could schedule the running of a simple C# application that hosts a Web Browser control, that points to your web site. In this sense, the Web Browser control acts as your browser. You can send JavaScript commands to it etc.
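A minimal sketch of that approach, assuming a WinForms host; the URL and the button id ("runButton") are placeholders for your actual page:

```csharp
using System;
using System.Windows.Forms;

class NightlyRunner
{
    [STAThread] // the WebBrowser control requires a single-threaded apartment
    static void Main()
    {
        var browser = new WebBrowser { ScriptErrorsSuppressed = true };
        browser.DocumentCompleted += (s, e) =>
        {
            // Simulate the developer's button press once the page has loaded.
            var button = browser.Document.GetElementById("runButton");
            if (button != null)
                button.InvokeMember("click");
        };
        browser.Navigate("http://localhost/nightly-job.aspx");
        Application.Run(); // keep the message pump alive for the control
    }
}
```

Schedule this executable with Windows Task Scheduler to run at the desired time each night.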
Web Browser documentation:
http://msdn.microsoft.com/en-us/library/system.windows.forms.webbrowser.aspx
Another alternative is to schedule an AutoHotKey script to simulate the key pressing and mouse clicks. This works well. I used to use it to rack up Farmville points (no joke).
http://www.autohotkey.com/
A Windows service seems better in the sense of "scheduled" and "no intervention." A WCF service is still, by default, listening and waiting for interaction (hosting the service). Or just schedule a simple app or script to do what you need.
Can you tell us a little more about this process? My initial recommendation would be to transition that code from a web application into a command line utility and then use windows task scheduler. If for some reason there are heavy dependencies within that web app that make that impossible I would consider taking those button click events and turning them into web services that you could then call programmatically from a command line application.
We use Windows Workflow Foundation internally on a large data manipulation and loading operation. Some of the workflows are complicated and, with their sub-workflows, take significant time: a total runtime of around 3-4 hours for the main workflow at the moment. This is expected, but it would be nice to be able to see where a workflow is.
We have an ASP.NET front end for the operational users. We would love for them to be able to open a page showing a specific running workflow instance in a visualization, basically presenting a way to see the status of the activities (i.e. which activities have executed and which is currently executing). No editing is required here; the idea is purely one of not having a black-box 3-hour run. We are writing log entries internally which can be seen, but as always, a picture says more than a thousand words, and a visual presentation would definitely be better for them.
Does anyone know of a suite of components for ASP.NET (preferably MVC) that can visualize a server-side running workflow? Editing is NOT required.
This is perhaps not exactly what you are looking for...
I am not aware of any ready-made components, but perhaps you can build one yourself using the WF designer. See this sample that hosts the workflow designer on the server side, captures an image, and shows it on the page. It uses custom tracking services to depict which activities have run. Unfortunately, it's a dated article and I am not sure whether a similar approach can work in .NET 4 (AFAIK, the workflow designer changed in .NET 4).
If the designer cannot be hosted, you may try painting the workflow activities/nodes with custom code. You have to use tracking to see the workflow state.
We need the ability to send out automatic emails when certain dates occur or when some business conditions are met. We are setting up this system to work with an existing ASP.NET website. I've had a chat with one of the other devs here and had a discussion of some of the issues.
Things to note:
All the information we need is already modelled in the ASP.NET website
There is some business-logic that is required for the email generation which is also in the website already
We decided that the ideal solution was to have a separate executable that is scheduled to run overnight and do the processing and emailing. This solution has 2 main problems:
If the website was updated (business logic or model) but the executable was accidentally missed, the executable could stop sending emails or, worse, send them based on outdated logic.
We are hoping to use something like this to use UserControls to template the emails, which I don't believe is possible outside of an ASP.NET website
The first problem could have been avoided with build and deployment scripts (which we're looking into at the moment anyway), but I don't think we can get around the second problem.
So the solution we decided on is to have an ASP.NET page that is called regularly by SSIS; the page does a fixed amount of processing (say 30 seconds) and then returns. I know an ASP.NET page is not the ideal place to be doing this kind of processing, but this seems to best meet our requirements. We considered spawning a new thread (not from the worker pool) to do the processing, but decided that if we did that we couldn't use the returned page to signify success or failure. By processing within the page's life-cycle we can use the page content to give an indication of how the processing went.
So the question is:
Are there any technical problems we might have with this set-up?
Obviously if you have tried something like this any reports of success/failure will be appreciated. As will suggestions of alternative set-ups.
Cheers,
Don't use the ASP.NET thread to do this. If the site is generating some information that you need in order to create or trigger the email send, have the site write that information to a file or database.
Create a Windows service or scheduled process that collects the information it needs from that file or DB and runs the email-sending process on a completely separate process/thread.
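A sketch of such a service, with a placeholder polling interval and a stubbed-out SendPendingEmails (the real body would read whatever the site wrote to the file or DB):

```csharp
using System.ServiceProcess;
using System.Timers;

public class EmailerService : ServiceBase
{
    // Poll the file/DB the site writes to; the one-minute interval is
    // a placeholder.
    private readonly Timer timer = new Timer(60000);

    protected override void OnStart(string[] args)
    {
        timer.Elapsed += (s, e) => SendPendingEmails();
        timer.Start();
    }

    protected override void OnStop()
    {
        timer.Stop();
    }

    private void SendPendingEmails()
    {
        // Read queued work written by the site, send, mark as done.
    }

    static void Main()
    {
        ServiceBase.Run(new EmailerService());
    }
}
```

The service runs entirely outside the web process, so an app-pool recycle or a site crash cannot take the emailer down with it.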
What you want to avoid is crashing your site or crashing your emailer due to limitations within the process handler. Based on your use of the word "bulk" in the question title, the two need to be independent of each other.
I think you should be fine. We have used a similar approach in our company for several years and haven't had many problems. Sometimes it takes over an hour to finish the process. Recently we moved the second thread (as you said) to a separate server.
Having the emailer and the website coupled together can work, but it isn't really a good design and will be more maintenance for you in the long run. You can get around the problems you state by doing a few things.
Move the common business logic to a web service or common library. Both your website and your executable/WCF service can consume it, and it centralizes the logic. If you're copying and pasting code, you know there's something wrong ;)
If you need a template mailer, it is possible to invoke ASP.NET classes to create pages for you dynamically (see the BuildManager class, and blog posts like this one). If the mailer doesn't rely on Page events (which it doesn't seem to), there shouldn't be any problem for your executable to load a Page class from your website assembly, build it dynamically, and fill in the content.
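A hedged sketch of that idea, using BuildManager plus Server.Execute to run the page life-cycle and capture its markup as the mail body. The virtual path is a placeholder, and this assumes the code runs inside the web application's AppDomain so a valid HttpContext is available:

```csharp
using System.IO;
using System.Web;
using System.Web.Compilation;
using System.Web.UI;

public static class EmailTemplating
{
    public static string RenderTemplate(string virtualPath)
    {
        // Compile/instantiate the template page from its virtual path,
        // e.g. "~/EmailTemplates/ActivitySummary.aspx".
        var page = (Page)BuildManager.CreateInstanceFromVirtualPath(
            virtualPath, typeof(Page));

        using (var writer = new StringWriter())
        {
            // Execute runs the page life-cycle and writes its output
            // into the supplied writer instead of the response stream.
            HttpContext.Current.Server.Execute(page, writer, false);
            return writer.ToString();
        }
    }
}
```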
This obviously represents a significant amount of work, but would lead to a more scalable solution for you.
Sounds like you should be creating a worker thread to do that job.
Maybe you should look at something like https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
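For reference, the trick in that post boils down to a cache item whose removal callback does a chunk of work and then re-registers itself. A sketch; the key name, five-minute interval, and the DoPendingWork stub are all placeholders:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class BackgroundTask
{
    private const string Key = "background-email-task";

    // Call once from Application_Start in Global.asax.
    public static void Register()
    {
        // A dummy item whose expiration drives the schedule.
        HttpRuntime.Cache.Insert(Key, new object(), null,
            DateTime.Now.AddMinutes(5), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRemoved);
    }

    private static void OnRemoved(string key, object value,
        CacheItemRemovedReason reason)
    {
        DoPendingWork(); // placeholder for the real work
        Register();      // re-schedule the next run
    }

    private static void DoPendingWork() { }
}
```

Note the caveat from the other answers still applies: an app-pool recycle silently stops the loop until the next request restarts the application.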
You can, and should, build your (templated) message body within your domain logic (i.e. your ASP.NET application) when the business conditions are met, and hand it to an external service whose only job is to send messages. That way every message is built with the correct information.
For the "when certain dates occur" scenario you can use a simple background-task solution (see Craig's answer) and do the same as above: parse the template, build the message, and hand it off to the sending service.
Of course you should do this safely, so that app-pool restarts do not break your tasks.
I'm not sure if this is technically a web service or not but I have a Flash file that periodically needs to make a round trip to a DB. As it stands, AS3 uses the URLLoader class to exchange XML with an ASP.NET/VB file on the server. The aspx code then goes to the DB and returns whatever information is requested back to the Flash file.
As my program grows and I need to execute a larger variety of tasks on the server, I'm wondering if I should just keep placing functions in that same aspx file and specify in AS3 which function I should load for any given task. OR, is it better to break up my functionality into several different aspx files and call the appropriate file for the task?
Are there any obvious pros and cons to either method that I should consider?
(note: I have put all of VB functions on the aspx pages rather than the code behind files because I was having trouble accessing the i/o stream from the code behind.)
Thanks.
T
When you say you need to execute a large variety of tasks, you should think about breaking the code down into multiple files. This question cannot be answered in general, as the solution is always specific to the problem, but this might help:
Reasons to keep all code in one file on the server side:
Code for different tasks heavily depends on each other
Effort for separating the tasks into files is too high
The variety/count of different tasks is manageable
You are the only developer
Every task works correctly and is fail-safe (since they are all in one file, I assume one error will break all tasks)
Reasons to separate tasks into different files:
The file is getting too big, unreadable and unmaintainable
Different tasks should not depend on each other (Separation of concerns)
There are multiple developers working on different tasks
Many new tasks will be added
A task could contain errors and should not break every other task
That is all I can think of right now; you will surely find more reasons yourself. As said, I would separate the tasks, as I think the effort is not too high.
I have a website that's running on a Windows server and I'd like to add some scheduled background tasks that perform various duties. For example, the client would like users to receive emails that summarize recent activity on the site.
If sending out emails was the only task that needed to be performed, I would probably just set up a scheduled task that ran a script to send out those emails. However, for this particular site, the client would like a variety of different scheduled tasks to take place, some of them always running and some of them only running if certain conditions are met. Right now, they've given me an initial set of things they'd like to see implemented, but I know that in the future there will be more.
What I am wondering is if there's a simple solution for Windows that would allow me to define the tasks that needed to be run and then have one scheduled task that ran daily and executed each of the scheduled tasks that had been defined. Is a batch file the easiest way to do this, or is there some other solution that I could use?
To keep life simple, I would avoid building one big monolithic exe and break the work to do into individual tasks and have a Windows scheduled task for each one. That way you can maintain the codebase more easily and change functionality at a more granular level.
You could, later down the line, build a windows service that dynamically loads plugins for each different task based on a schedule. This may be more re-usable for future projects.
But to be honest if you're on a deadline I'd apply the KISS principle and go with a scheduled task per task.
I would go with a Windows Service right out of the gates. This is going to be the most extensible method for your requirements, creating the service isn't going to add much to your development time, and it will probably save you time not too far down the road.
We use the Windows scheduler service, which launches a small console application that just passes parameters to the web service.
For example, if a user has scheduled reports #388 and #88, a scheduled task is created with a command line like this:
c:\launcher\app.exe report:388 report:88
When the scheduled task fires, this app just executes a web method on the web service, for example InternalService.SendReport(int id).
Usually you already have all the required business logic available in your web application. This approach lets you reuse it with minimal effort, so there is no need to create a complex .exe or a Windows service with pluggable modules, etc.
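A sketch of what such a launcher might look like, parsing the report:NNN arguments from the command line above. The WCF proxy call is commented out because the real client type depends on your service reference:

```csharp
using System;

class Launcher
{
    static void Main(string[] args)
    {
        foreach (var arg in args)
        {
            // Arguments arrive in the form "report:388".
            if (arg.StartsWith("report:"))
            {
                int id = int.Parse(arg.Substring("report:".Length));
                // using (var client = new InternalServiceClient())
                //     client.SendReport(id);
                Console.WriteLine("Sending report {0}", id);
            }
        }
    }
}
```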
The problem with doing the operations from the scheduled EXE, rather than from inside a web page, is that the operations may benefit from, or even outright require, resources that the web page would have -- IIS cache and an ORM cache are two things that come to mind. In the case of ORM, making database changes outside the web app context may even be fatal. My preference is to schedule curl.exe to request the web page from localhost.
Use the Windows Scheduled Tasks or create a Windows Service that does the scheduling itself.