Is there a way to use an activity to stop the execution of its workflow?
I have multiple TryCatch and If activities, and it would be nice to be able to stop the workflow after catching an exception or when certain criteria in my If activities aren't met.
You can use the TerminateWorkflow activity to stop a workflow.
Just a word of warning for anyone else who runs into this: TerminateWorkflow won't actually terminate your workflow if you put it inside a CompositeActivity (it will only terminate the composite).
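For anyone wanting a concrete starting point, here is a minimal sketch (not the asker's actual workflow; the criteriaMet flag and the messages are made up) that puts TerminateWorkflow in the Else branch of an If and hosts the workflow with WorkflowApplication so the termination can be observed:

```csharp
// Minimal sketch (assumes .NET Framework with System.Activities referenced):
// an If whose Else branch uses TerminateWorkflow to stop the whole instance.
using System;
using System.Activities;
using System.Activities.Statements;

class Program
{
    static Activity BuildWorkflow(bool criteriaMet) => new Sequence
    {
        Activities =
        {
            new If
            {
                Condition = new InArgument<bool>(ctx => criteriaMet),
                Then = new WriteLine { Text = "Criteria met, continuing." },
                Else = new TerminateWorkflow
                {
                    Reason = "Required criteria were not met."
                }
            },
            new WriteLine { Text = "This only runs if the workflow was not terminated." }
        }
    };

    static void Main()
    {
        var app = new WorkflowApplication(BuildWorkflow(criteriaMet: false));
        app.Completed = e =>
        {
            // A terminated workflow completes in the Faulted state with a TerminationException.
            Console.WriteLine($"Completion state: {e.CompletionState}");
            if (e.TerminationException != null)
                Console.WriteLine($"Terminated: {e.TerminationException.Message}");
        };
        app.Run();
        Console.ReadLine(); // keep the host alive while the workflow runs
    }
}
```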
As part of our regulatory requirements, we would like to capture task reassignment in the workflow history on the workflow summary page.
To achieve this, when a task is being reassigned, I complete the current task as system, set the outcome to Task Reassigned, and assign the task to the new person.
This approach works fine for a simple review-and-approve workflow, i.e. bpm_assignee (a single user).
I couldn't find the correct approach for parallel review workflow types, i.e. when the task is assigned to multiple users (bpm_assignees).
Can you please suggest how I can capture task reassignment for a parallel review approval workflow?
Alfresco version: 5.0.2.5 Enterprise Edition.
Here is the workflow structure showing how we implemented task reassignment for a single reviewer and for multiple reviewers.
Moreover, we have created our own workflow pages (start, task-edit, task-details and workflow summary) to render the values, but in principle the workflow values should display properly in the OOTB pages as well.
To support multiple reviewers, you need to use a multi-instance sub-process, either sequential or parallel. To achieve this you need to configure the Loop Cardinality, Collection, Element Variable, Sequential and Completion Condition values.
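For reference, here is an illustrative multi-instance fragment loosely modelled on Alfresco's OOTB parallel review definition; the element IDs, form key and completion condition are assumptions you would adapt to your own model:

```xml
<!-- Illustrative sketch only, not the asker's actual definition. -->
<userTask id="reviewTask" name="Review Task"
          activiti:formKey="wf:activitiReviewTask"
          activiti:assignee="${reviewAssignee.properties.userName}">
  <multiInstanceLoopCharacteristics isSequential="false">
    <!-- bpm_assignees supplies the collection; reviewAssignee is the per-instance element variable -->
    <loopDataInputRef>bpm_assignees</loopDataInputRef>
    <inputDataItem name="reviewAssignee" />
    <!-- complete when every reviewer instance has finished -->
    <completionCondition>${nrOfCompletedInstances == nrOfInstances}</completionCondition>
  </multiInstanceLoopCharacteristics>
</userTask>
```

Because each reviewer gets their own task instance bound to the element variable, you can complete or reassign each reviewer's task individually and record the reassignment outcome per instance.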
Please let me know if you need any help with this.
Happy coding
I am trying to create a DAG which has only one task. Can I mark the task with a particular status, such as skipped or no status?
Requirement: every minute I check an S3 bucket, and if files are available I do some processing; otherwise, I do nothing. I want this to be visible in the UI, so I was trying to mark the task status as skipped.
Is this the right way to do it? Is there any other way to achieve this?
Thanks
If you want to mark a task as skipped, you can raise an AirflowSkipException. When raised, execution of the task stops and the task gets marked as skipped.
For example, a DummySkipOperator-style operator gets marked as skipped simply by raising the above exception, as in the sketch below.
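Here is a minimal sketch along those lines (not the exact DAG from that example): a single-task DAG whose operator raises AirflowSkipException when nothing is found, so the run shows up as skipped in the UI. The S3 check is only a placeholder, and the DAG and task names are made up:

```python
# Minimal sketch: a one-task DAG that skips itself when there is nothing to process.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.exceptions import AirflowSkipException
from airflow.models import BaseOperator


class S3CheckOperator(BaseOperator):
    def execute(self, context):
        files = []  # placeholder: list the keys in your S3 bucket here (assumption)
        if not files:
            # Raising AirflowSkipException stops execution and marks the task 'skipped'.
            raise AirflowSkipException("No files found, skipping this run.")
        # ... otherwise process the files ...


with DAG(
    dag_id="s3_polling_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval=timedelta(minutes=1),
    catchup=False,
) as dag:
    S3CheckOperator(task_id="check_s3_for_files")
```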
I have a process similar to this, with an Approve user task and a multi-instance parallel Review user task. The business rule is that whenever the approver approves, all remaining Review task instances should be cancelled, even if there are more reviewers left to review the (multi)task (e.g. <completionCondition>${approved == true}</completionCondition>). How should I implement this scenario? Thanks.
You could add a signal boundary event on the multi-instance Review user task. After the Approve user task, add an intermediate signal throw event that triggers the boundary event. That way the multi-instance Review task will be terminated when the Approve user task is completed.
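As a rough sketch (the IDs, the signal name and the reviewers variable are assumptions, not taken from your model), the relevant BPMN fragments look something like this:

```xml
<!-- Illustrative fragment: a signal thrown after Approve cancels the multi-instance Review task. -->
<signal id="cancelReviewSignal" name="cancelReviewSignal"/>

<process id="reviewAndApprove">
  <userTask id="approveTask" name="Approve"/>
  <intermediateThrowEvent id="throwCancelReview">
    <signalEventDefinition signalRef="cancelReviewSignal"/>
  </intermediateThrowEvent>

  <userTask id="reviewTask" name="Review">
    <multiInstanceLoopCharacteristics isSequential="false"
        activiti:collection="${reviewers}" activiti:elementVariable="reviewer"/>
  </userTask>
  <boundaryEvent id="cancelReviewBoundary" attachedToRef="reviewTask" cancelActivity="true">
    <signalEventDefinition signalRef="cancelReviewSignal"/>
  </boundaryEvent>
</process>
```

Note that Activiti broadcasts signals globally by default; if you only want to cancel reviews in the same process instance, scoping the signal to the process instance (activiti:scope="processInstance" on the signal definition) is usually what you want.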
One word of warning when using the signal approach (which is IMO the right answer).
But notice in the image below that I am splitting the flow with a parallel gateway. If I simply use a parallel join, the process instance will never complete, because the parallel join never receives all the tokens it expects. You should instead use an inclusive join (as shown below), which recalculates the number of expected tokens and allows flow through to the "Done" task.
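In BPMN terms, that means the join in front of "Done" should be an inclusiveGateway rather than a parallelGateway; a sketch (reusing the hypothetical IDs from the fragment above) might look like:

```xml
<!-- Sketch: join with an inclusive gateway so the process can complete
     even when the Review path was cancelled by the signal. -->
<inclusiveGateway id="joinAfterReview"/>
<sequenceFlow id="fromApprovePath" sourceRef="throwCancelReview" targetRef="joinAfterReview"/>
<sequenceFlow id="fromReviewPath" sourceRef="reviewTask" targetRef="joinAfterReview"/>
<sequenceFlow id="toDone" sourceRef="joinAfterReview" targetRef="doneTask"/>
```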
I have a box job that is dependent on another job finishing. The first job normally finishes by 11pm and my box job then kicks off and finishes in about 15 minutes. Occasionally, however, the job may not finish until much later. If it finishes later than 4am, I'd like to have it send an alert.
My admin told me that since it is dependent on a prior job, and not set to start at a specific time, it is not possible to set a time-based alert. Is this true? Does anybody have a workaround they can suggest? I'd rather not set the alert on the prior job (suggested by my admin) as that may not always catch those instances when my job runs longer.
Thanks!
You can set a max run alarm time, which will raise an alert if that time is exceeded.
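In JIL terms this is the max_run_alarm attribute, which is given in minutes; a rough sketch (the job name and threshold are placeholders):

```jil
/* Raise a MAX_RUN_ALARM if the box runs longer than 300 minutes. */
update_job: my_box_job
max_run_alarm: 300
```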
We ended up adding a job to the box with a start time of 4am that checks for the existence of the files the rest of the job creates. We also did this for the job's predecessors, to make sure we are notified if we are at risk of not finishing by 4am.
I have successfully saved the state machine and applied bookmarks to it after loading it multiple times.
But what happens when it reaches a final state?
Why are instances removed from the persistence data store ([System.Activities.DurableInstancing].[InstancesTable]) after they finish?
Is that normal, or am I making a mistake in persisting finished state machines?
Workflow is code. You define the logic using larger pieces, but it executes and returns a result. It is not the result itself.
Imagine you had a class with methods you call that determine approval or denial. You would spin up that class, pass in argument values, and let the code execution determine the outcome. What do you do after this code executes?
You wouldn't store the code of that method, that's for sure. You would store who approved, who denied, and the final result.
So you shouldn't be storing the code of the workflow but the results.
I would accomplish the goals of this workflow by creating custom activities extending NativeActivity, using one or more workflow extensions to communicate with the outside world and send notifications about approvals or denials waiting for action. Along the way I'd record who did what as my bookmarks resume execution, and when the workflow completes I'd record the final result as well.
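As a rough illustration of that approach (the IApprovalNotifier extension and the WaitForApproval activity are hypothetical, not part of WF itself), a bookmark-based activity might look like this:

```csharp
// Sketch (assumes System.Activities): a NativeActivity that creates a bookmark,
// notifies the outside world through a hypothetical extension, and records the
// decision when the bookmark is resumed.
using System.Activities;

public interface IApprovalNotifier   // hypothetical workflow extension
{
    void NotifyPendingApproval(string approver);
    void RecordDecision(string approver, bool approved);
}

public sealed class WaitForApproval : NativeActivity<bool>
{
    public InArgument<string> Approver { get; set; }

    // The workflow must be able to go idle (and be persisted) while the bookmark is pending.
    protected override bool CanInduceIdle => true;

    protected override void Execute(NativeActivityContext context)
    {
        string approver = Approver.Get(context);
        context.GetExtension<IApprovalNotifier>()?.NotifyPendingApproval(approver);

        // Execution stops here until the host resumes the bookmark with a bool.
        context.CreateBookmark(approver, OnApprovalReceived);
    }

    private void OnApprovalReceived(NativeActivityContext context, Bookmark bookmark, object value)
    {
        bool approved = (bool)value;
        context.GetExtension<IApprovalNotifier>()?.RecordDecision(Approver.Get(context), approved);
        Result.Set(context, approved);   // store the outcome, not the workflow itself
    }
}
```

The host registers the extension via WorkflowApplication.Extensions.Add(...) and calls ResumeBookmark with the approver's name and a true/false value when the person acts; only the recorded decisions and the final result need to be stored.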