Is there a way to add SqlTrackingService to WorkflowApplication in WF4?

I've been searching for a way to add the SqlTrackingService to the WorkflowApplication in WF4 similar to the way it can be done in WF3 as follows:
WorkflowRuntime wr = new WorkflowRuntime();
SqlTrackingService ts = new SqlTrackingService("Initial Catalog=Tracking;Data Source=localhost;Integrated Security=SSPI;");
ts.UseDefaultProfile = true;
wr.AddService(ts);
wr.StartRuntime();
Thanks in advance!

WF4 has the concept of a TrackingParticipant. You can attach tracking participants to your workflow execution and they will receive its tracking records.
You can implement a SQL tracking participant yourself. Check the WF samples, which include a custom SqlTrackingParticipant in the \WF\Basic\Tracking\SqlTracking folder.
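As a rough sketch of how a tracking participant is wired up in WF4 (the console-based participant and the WriteLine activity below are just placeholders; the SDK sample referenced above persists the records to a SQL database instead):
using System;
using System.Activities;
using System.Activities.Statements;
using System.Activities.Tracking;

// Placeholder participant: writes every tracking record to the console.
public class ConsoleTrackingParticipant : TrackingParticipant
{
    protected override void Track(TrackingRecord record, TimeSpan timeout)
    {
        Console.WriteLine(record);
    }
}

public static class Program
{
    public static void Main()
    {
        var app = new WorkflowApplication(new WriteLine { Text = "Hello from the workflow" });

        // In WF4, tracking participants are added as workflow extensions.
        app.Extensions.Add(new ConsoleTrackingParticipant());

        app.Run();
        Console.ReadLine(); // WorkflowApplication.Run is asynchronous
    }
}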

Related

Axon Framework Execute EventUpcaster

I need to upgrade an event to a new version. This entails removing an old property that keeps growing (due to a design decision) and is no longer needed.
I have read the documentation available at https://docs.axoniq.io/reference-guide/axon-framework/events/event-versioning and implemented the required parts:
@Bean
public EventUpcasterChain eventUpcasterChain() {
    return new EventUpcasterChain(
        new CurrentLoanBalanceAcceptedEventUpcaster(),
        new InterestCalculatedEventUpcaster()
    );
}
I cannot seem to get the upcasters to fire the doUpcast method. My understanding is that on startup the application will find all old events and convert them to the new version. Is this correct? Or does it only upcast the old events when they are replayed? I am hoping there is a way to do this without replaying all the events.
Versions:
Spring Boot: 2.4.11
Axon Framework: 4.5.4
Update:
After trying to figure out how to remove nodes from the event XML using the upcasters, I failed miserably. I must point out that this is not an Axon issue; it was a poor design decision and a lack of understanding of what Axon does with the events. The primary lesson is to keep events as simple as possible.
I managed to remove the unnecessary XML from the domain event store using the following query:
update domainevententry
set payload = lo_from_bytea(0, decode(regexp_replace(
        subquery.output,
        '\<nodeToReplace\>(.*)\<\/nodeToReplace\>',
        ''
    ), 'escape'))
from (
    SELECT eventidentifier, payloadtype, encode(lo_get(payload::oid), 'escape') as output
    FROM domainevententry
    WHERE eventidentifier in (
        '<id>'
    )
    AND payloadtype = '<payloadType>'
) as subquery
where domainevententry.eventidentifier = subquery.eventidentifier;
Every upcaster in the upcaster chain is called only when it finds an 'old' event to convert to the new one. Newly published events are already the new version, so the upcaster will not be used for them.
Upcasters are used when:
An aggregate is loaded (without snapshot) and an old event is encountered
A projection is replayed and an old event is encountered
In all other cases, it will be the new event, so the upcaster won't have to be fired. The upcaster is there so aggregates can keep being loaded, and projections can be replayed from the beginning of the event store.
If this is not the case, we need to take a look at the Revision parameter on the old event definition and the new event definition. For example, if the old event had no @Revision annotation, the SimpleSerializedType needs a revision of null or it won't match.
Please include the code of the upcasters if we need to dig further, that would help greatly!

Automatic activity not performing

I created a basic workflow as described below.
I created a class library, used a ProgId, set ComVisible to true, and registered the assembly on the Tridion server.
This is how I tested it:
Created a component.
Finished the activity from the work list.
Navigated to the Global Work list and finished the Reviewer activity myself by choosing the "Back to Author" step and clicking the "Finish" button.
The item is not moved to the author, but when I finish the activity again from the Global Work list, the item moves to the author.
It seems that my code is not performing the activity. To check, I removed my script and tried the default automatic script code instead:
' Script for Automatic Activity Content Manager Workflow
FinishActivity "Automatic Activity Finished"
It behaves the same as above, so I concluded that my code is not being executed. Can anyone please help with this?
Below is the VBScript I used in the script box of "Back to Author":
Option Explicit
Dim workflowHandler
Set workflowHandler = CreateObject("CoreComponentWorkflow.WorkflowHandler")
If Not workflowHandler Is Nothing Then
    Call workflowHandler.MoveBackToActivity(CStr(CurrentWorkItem.ID), "Create or Edit Component")
End If
Set workflowHandler = Nothing
Below is the C# Code:
public void MoveBackToActivity(string workitemid, string strActivitytoMove)
{
    try
    {
        Session session = new Session();
        WorkItem workitem = new WorkItem(new TcmUri(workitemid), session);
        ActivityInstance currentactivity = workitem.Activity as ActivityInstance;
        ProcessInstance procInstance = currentactivity.Process as ProcessInstance;
        IEnumerable<ActivityInstance> ieActivities = procInstance.Activities
            .Where(w => w.Title.IndexOf(strActivitytoMove) != -1)
            .OrderByDescending(w => w.StartDate);
        if (ieActivities != null && ieActivities.Count() > 0)
        {
            ActivityInstance targetactivity = ieActivities.ElementAt(0);
            User lastperformuser = targetactivity.Performers.ElementAt(targetactivity.Performers.Count() - 1);
            ActivityFinish finish = new ActivityFinish(targetactivity.FinishMessage, lastperformuser, workitem.Session);
            currentactivity.Finish(finish);
        }
    }
    catch (Exception)
    {
        // rethrow without resetting the stack trace
        throw;
    }
}
Be aware that you are using an API that is NOT supported in Automatic Activities. The only processes where you are allowed to use TOM.NET are Event System handlers and Template Building Blocks as documented here.
Automatic Workflow Activities - if not developed with VBScript - must use the CoreService interface.
The good news is that I know for a fact this works; plenty of people have got it to work in many implementations. The bad news (for you) is that the error is in your code. Have you tried debugging/stepping through your code yet? You can attach to the workflow process (cm_wf_svc.exe) and figure out what's wrong with the code much faster than we can.
Here's a really simple snippet to finish an activity with CoreService:
ActivityFinishData activityFinish = new ActivityFinishData
{
    Message = "Automatically Finished from Expiration Workflow Extension"
};
ActivityInstanceData activityInstance = (ActivityInstanceData)processInstance.Activities[0];
client.FinishActivity(activityInstance.Id, activityFinish, readOptions);
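For context, here is a rough sketch of the client setup around that snippet; the netTcp_2011 endpoint name and the process-instance URI placeholder are assumptions, not part of the original answer:
using Tridion.ContentManager.CoreService.Client;

// Assumes an endpoint named "netTcp_2011" in the client's config file.
var client = new SessionAwareCoreServiceClient("netTcp_2011");
var readOptions = new ReadOptions();

// TCM URI of the running workflow process instance (placeholder).
string processInstanceId = "<process instance TCM URI>";
var processInstance = (ProcessInstanceData)client.Read(processInstanceId, readOptions);

// ...then finish the activity exactly as in the snippet above.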
BTW - If you intended to use TOM.NET anyway, why did you bother asking which API to use?
Following Nuno's answer: yes, you should change the code to use TOM or the Core Service. TOM.NET is not supported because it uses a different threading apartment than the underlying technology we use for workflow (COM).
About the issue, I checked and you are calling the activity like this:
Call workflowHandler.MoveBackToActivity(CStr(CurrentWorkItem.ID), "Create or Edit Component")
It looks like the activity name is not matching; there are some strange characters between "Edit" and "Component".
I hope this helps.
Automatic activities are executed by the Workflow agent service. An Assigned state may indicate that it's just not being picked up by the service. Is your service running correctly, and are things like queue notifications set up properly?

Determining Workflow Arguments at Runtime Prior to Execution

Is there a way to determine the arguments to a workflow prior to executing it?
I've developed an application that rehosts the designer, so end users can develop their own workflows. In doing this, a user is able to add their own arguments to the workflow.
I'm looking for a way to inspect the workflow prior to execution, and try to resolve the arguments. I've looked at the WorkflowInspectionServices class, but I can't seem to ask for a particular type of item from it.
Ideally, I'd like to construct a workflow from metadata stored in the database using something like:
var workflow = ActivityXamlServices.Load(new StringReader(xamlText));
var metadata = SomeUnknownMagicClass.Inspect(workflow);
var inputs = new Dictionary<string, object>();
foreach (var argument in metadata.Arguments)
{
    inputs.Add(argument.Name, MagicArgumentResolver.Resolve(argument.Name));
}
WorkflowInvoker.Invoke(workflow, inputs);
I might be missing something, but WorkflowInspectionServices doesn't seem to do this. It has the CacheMetadata method, which sounds promising when you read the MSDN docs, but it basically turns up nothing.
Thanks for any help.
I guess that when you talk about metadata stored in the database you're referring to the XAML from the designer.
You can load that XAML as a DynamicActivity like this:
using (var reader = new StringReader(xamlString))
{
    var dynActivity = ActivityXamlServices.Load(reader) as DynamicActivity;
}
Then you have access to all of its arguments through DynamicActivity.Properties.
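For example, a minimal sketch of inspecting those properties to build the inputs dictionary before invoking the workflow; ResolveValue here is a hypothetical stand-in for however the host resolves argument values:
using System;
using System.Activities;
using System.Activities.XamlIntegration;
using System.Collections.Generic;
using System.IO;

public static class WorkflowArgumentInspector
{
    public static void Run(string xamlText)
    {
        DynamicActivity workflow;
        using (var reader = new StringReader(xamlText))
        {
            workflow = ActivityXamlServices.Load(reader) as DynamicActivity;
        }

        var inputs = new Dictionary<string, object>();
        foreach (DynamicActivityProperty property in workflow.Properties)
        {
            // Input arguments surface as properties typed InArgument<T>.
            if (property.Type.IsGenericType &&
                property.Type.GetGenericTypeDefinition() == typeof(InArgument<>))
            {
                Type argumentType = property.Type.GetGenericArguments()[0];
                inputs.Add(property.Name, ResolveValue(property.Name, argumentType));
            }
        }

        WorkflowInvoker.Invoke(workflow, inputs);
    }

    // Hypothetical resolver: look the value up from configuration, a database, etc.
    private static object ResolveValue(string name, Type type)
    {
        return type.IsValueType ? Activator.CreateInstance(type) : null;
    }
}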

Threading for binding multiple gridview

I am binding 4 GridViews on a button click, like this:
gv1.DataSource = GetData("Mill");
gv1.DataBind();
gv2.DataSource = GetData("Factory");
gv2.DataBind();
gv3.DataSource = GetData("Garage");
gv3.DataBind();
gv4.DataSource = GetData("Master");
gv4.DataBind();
They all use the same method to get their data, and they take time to load. Is there any way I can run them in parallel? I'm worried because they use the same method to get the data.
Is it possible to use threading for this? How?
Please help.
You may take a look at articles about asynchronous ASP.NET pages.
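As a rough illustration of running the four queries concurrently, here is a minimal sketch using Task.Run. It assumes .NET 4.5 or later and that GetData is thread-safe (for example, it opens its own connection on every call); the btnLoad_Click handler name is just an example. The asynchronous-pages approach would instead use Async="true" on the @ Page directive together with RegisterAsyncTask.
using System;
using System.Threading.Tasks;

protected void btnLoad_Click(object sender, EventArgs e)
{
    // Start the four queries concurrently instead of one after another.
    var millTask    = Task.Run(() => GetData("Mill"));
    var factoryTask = Task.Run(() => GetData("Factory"));
    var garageTask  = Task.Run(() => GetData("Garage"));
    var masterTask  = Task.Run(() => GetData("Master"));

    // Block until all four have finished.
    Task.WaitAll(millTask, factoryTask, garageTask, masterTask);

    // Bind on the page's own thread once the data is available.
    gv1.DataSource = millTask.Result;    gv1.DataBind();
    gv2.DataSource = factoryTask.Result; gv2.DataBind();
    gv3.DataSource = garageTask.Result;  gv3.DataBind();
    gv4.DataSource = masterTask.Result;  gv4.DataBind();
}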

Quartz.NET trigger not firing

I am using Quartz.NET in my ASP.NET web application. I put the following code in a button click handler to make sure that it executes (for testing purposes):
Quartz.ISchedulerFactory factory = new Quartz.Impl.StdSchedulerFactory();
Quartz.IScheduler scheduler = factory.GetScheduler();
Quartz.JobDetail job = new Quartz.JobDetail("job", null, typeof(BackupJob));
Quartz.Trigger trigger = Quartz.TriggerUtils.MakeDailyTrigger(8, 30); // i edit this each time before compilation (for testing purposes)
trigger.StartTimeUtc = Quartz.TriggerUtils.GetEvenSecondDate(DateTime.UtcNow);
trigger.Name = "trigger";
scheduler.ScheduleJob(job, trigger);
scheduler.Start();
Here's BackupJob:
public class BackupJob : IJob
{
    public BackupJob()
    {
    }

    public void Execute(JobExecutionContext context)
    {
        NSG.BackupJobStart();
    }
}
My question: why is BackupJobStart() not firing? I've used similar code before and it worked fine.
EDIT: @Andy White, I would have it in Application_Start in Global.asax. That doesn't work, which is why I moved it to a button click handler to narrow down the problem.
Do you have the Quartz.NET logging hooked up? I once had a problem with a job not executing (I forget why), but once I got the Quartz.NET logging going, the problem was obvious.
It's worth a try (if you're not already doing it):
https://www.quartz-scheduler.net/documentation/quartz-2.x/quick-start.html
http://netcommon.sourceforge.net/
http://netcommon.sourceforge.net/documentation.html
Update: Simply add this to your program.cs to enable console logging:
Common.Logging.LogManager.Adapter = new Common.Logging.Simple.ConsoleOutLoggerFactoryAdapter { Level = Common.Logging.LogLevel.Info};
Maybe it's a problem of time.
I've had the same problem as you, and I live in a country whose time is UTC+2. When I set the trigger's StartTimeUtc I used DateTime.Now, so the trigger wasn't due to fire until two hours later, even though I thought it would fire the moment my code started.
Look carefully at the trigger's StartTimeUtc and the time you expect it to fire.
Another possibility is the way you're running the scheduler. I'm not totally sure, but you may run into problems trying to run scheduling threads in an ASP.NET application. Putting the SchedulerFactory/Scheduler objects in a button click handler doesn't seem like it would give you the desired results.
You may need to create the scheduler at a more "global" level, so that it can run in the "background" of the application. It might also make sense to move any scheduled work into a separate windows service, so that you don't have to maintain the scheduler in the web app.
When you had success in the past, how were you invoking the scheduler?
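To make the "more global level" suggestion concrete, here is a minimal sketch of creating and starting the scheduler once in Global.asax, using the same Quartz 1.x style API as the question. The "Scheduler" application key is just an arbitrary name for keeping a reference alive; scheduling the actual job and trigger would happen in the same place.
using System;
using Quartz;
using Quartz.Impl;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        ISchedulerFactory factory = new StdSchedulerFactory();
        IScheduler scheduler = factory.GetScheduler();
        scheduler.Start();

        // Keep a reference so the scheduler lives for the lifetime of the application.
        Application["Scheduler"] = scheduler;
    }

    protected void Application_End(object sender, EventArgs e)
    {
        IScheduler scheduler = Application["Scheduler"] as IScheduler;
        if (scheduler != null)
        {
            scheduler.Shutdown(false); // don't wait for running jobs
        }
    }
}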
In my case, there was an issue with IoC - there were some Interfaces that weren't implemented. I could see what was wrong with mine by adding logging:
Common.Logging.LogManager.Adapter = new Common.Logging.Simple.ConsoleOutLoggerFactoryAdapter { Level = Common.Logging.LogLevel.Info};
to Program.cs as suggested by Andy White
