Enqueue controller action process - symfony

Symfony 2.8
I'm using https://github.com/j-guyon/CommandSchedulerBundle to manage periodic Command executions.
Each of these Command executions invokes a specific Service based on the Command arguments.
From within those Services (all of them implementing the same Interface and extending an Abstract class), the plan is to create and execute sub-processes (asynchronously if possible).
Based on your experience, what would be the best way to deal with those sub-processes?
Create a Process object (based on a Controller Action) for each sub-process and run them synchronously (https://symfony.com/doc/2.8/components/process.html) - a rough sketch follows below
Use some kind of queue bundle to deal with all of them (Processes or Messages or whatever), such as https://php-enqueue.github.io/symfony or https://github.com/armetiz/LeezyPheanstalkBundle (any other suggestions?)
Cheers!
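For what it's worth, option 1 with the Process component could look roughly like the sketch below (Symfony 2.8 style, where the command is passed as a string; the console command name and argument are placeholders, not taken from the question). Using start() instead of run() keeps the call non-blocking:

use Symfony\Component\Process\Process;

// Launch one sub-process per unit of work.
// start() returns immediately; run() would block until the command finishes.
$process = new Process('php app/console acme:process-item --id=42');
$process->start();

// ...later, if the Service needs the result:
$process->wait();
$output = $process->getOutput();

Option 2 trades this per-request process management for a broker-backed queue, which tends to cope better once the number of sub-processes grows or they need retries.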

Related

Add job inside another job for APScheduler

I am using APScheduler in a FastAPI application. I have a table where I store my async tasks. Then, using APScheduler, I would like to read this table every hour and add new jobs (one per row of the table) to the queue. These jobs are light, so I feel using Celery would be overkill. However, I am having difficulty starting a job inside another job with APScheduler.
So, the question is: how can a job be added inside another job? Any ideas or help are appreciated.
I've also had a problem with this, and a simple solution is to have the scheduler as a global variable available in the job's module. You might also have to use the memory job store, as it allows passing unserializable objects.
You can also pass the scheduler to the job, as long as the job is being run on an executor that doesn't require the arguments to be serialized.
For example, you can have a periodic job running on the asyncio executor that gets the scheduler as an argument and adds further jobs to it, which are to be run with the ProcessPool executor.
For more info, refer to this discussion in the Gitter channel.

How Can You Run CRaSH Commands Or A Script On Node Startup?

I need to initialise my Corda nodes by running a few flows to create certain states.
At the moment I am doing it via the CRaSH shell.
e.g.
flow start IOUFlow iouValue: 50, counterparty: Bank1
Is it possible to have the node run a script or some commands on node startup to do this automatically?
If not, how can I write a bash script to automate these CRaSH commands?
Corda 4.4 introduces a new feature to register actions to be performed on node startup.
You could register an action to be performed on node startup using a CordaService.
appServiceHub.register(
    AppServiceHub.SERVICE_PRIORITY_NORMAL,
    event -> {
        // Your custom code to be run on startup.
    }
);
You might want to check the event type to keep it future-proof, but currently ServiceLifecycleEvent has just a single enum value, STATE_MACHINE_STARTED.

How to get workflow taskId in Alfresco process service using script task

I want to get the workflow taskId into a script task variable (JavaScript/Groovy) and display it on a user form.
Please let me know if you have any ideas regarding this.
We are using Alfresco Process Services version 1.9.
Thanks in advance.
Store the taskId in a process variable using an ExecutionListener. Create a Spring bean that implements the Activiti ExecutionListener interface, and in the overridden method notify(DelegateExecution execution) set your variable like:
execution.setVariable("your_var", your_var_value);
In the script task you can then access process variables using the execution, e.g.:
execution.getVariable("your_var");
Follow the developer series for more details.

How to get the user who initiated the process in IBM BPM 8.5?

How can I get the user who initiated the process in IBM BPM 8.5? I want to reassign my task to the user who actually initiated the process. How can this be achieved in IBM BPM?
There are several ways to find who initiated a task, but finding who initiated a process instance is somewhat different.
You can do one of the following:
Add a private variable and assign it tw.system.user_loginName in the post assignment of the Start event. You can then read that variable to get the user who initiated the process. (It will be null or undefined if the instance is started by a REST API call or a UCA.)
Place a tracking group after the Start event. Add an input variable to it, e.g. username, and assign it the value of tw.system.user_loginName. Whenever the process is started, an entry will be inserted into the DB table, and you can retrieve the value from the corresponding view in the Performance DB.
There might also be a table logging process instance details where you can find the user_id directly.
I suggest you look at the getStarter() method of the ProcessInstanceData API.
Official Documentation on API
This link on IBM Developerworks should help you too: Process Starter
Unfortunately there's not an Out Of The Box way to do this - nothing is recorded in the Process Instance that indicates "who" started a process. I presume this is because there are many ways to launch a process instance - from the Portal, via a Message Event, from an API call, etc.
Perhaps the best way to handle this is to add a required input parameter to your BPD, and supply "who" started the process when you launch it. Unfortunately you can't supply any inputs from the OOTB Portal "New", but you can easily build your own "launcher".
If you want to route the first task in process to the user that started the process the easiest approach is to simply put the start point in the lane, and on the activity select routing to "Last User In Lane". This will take care of the use case for you without requiring that you do the book keeping to track the user.
It's been a while since I've implemented this, so I can't remember if it will work elegantly if you have system steps before the first task, but that can easily be handled by moving the system steps into the human service so they are executed as part of that call, rather than as a separate step in the BPD.
Define a variable of string type and use a script task to capture the user who started the process, assigning it to your defined variable so the initiator is available throughout the process.
You can use this line of code to achieve the same:
tw.system.user_loginName

Doctrine2: Cannot find concurrently persisted entity with findById

I have the following setup:
A regular Symfony2 web request can create and persist a Job entity, which also creates a Gearman job; let's say this occurs in process 1. The Gearman job is executed by a Gearman worker, which is passed the Job entity's ID.
I also use Symfony to create a Gearman worker; this is run as a PHP CLI process, let's call this process 2.
For those not familiar with Gearman the worker code operates something like so:
for loop 5 times
    get job from gearman (blocking method call)
    get job entity from database
    do stuff
Essentially this code keeps a Symfony2 instance running to handle 5 Jobs before the worker dies.
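For concreteness, with the pecl gearman extension such a worker might look roughly like the sketch below (the function name, server address, and the way the EntityManager reaches the worker are assumptions for illustration, not details from the question):

$worker = new \GearmanWorker();
$worker->addServer(); // defaults to 127.0.0.1:4730
$worker->addFunction('acme_job', function (\GearmanJob $gearmanJob) use ($em) {
    // The workload carries the Job entity's ID.
    $job = $em->getRepository('AcmeJobBundle:Job')
              ->findOneById($gearmanJob->workload());
    // ... do stuff with $job ...
});

for ($i = 0; $i < 5; $i++) {
    $worker->work(); // blocks until a job arrives, then invokes the callback
}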
My issue is this: on the first job that the worker handles, Doctrine2 is able to retrieve the created Job from the database without issue using the following code:
$job = $this->doctrine
    ->getRepository('AcmeJobBundle:Job')
    ->findOneById($job->workload()); // workload is the job id
However, once this job completes and the for loop increments to wait for a second job (let's say this arrives from another Symfony2 web request in process 3, creating the Job with ID 2), the call to the Doctrine2 repository returns null even though the entity is definitely in the database.
Restarting the worker solves the issue, so when it carries out its first loop it can pick up Job 2.
Does anyone know why this happens? Does the first call of getRepository or findOneById do some sort of table caching from MySQL that doesn't allow it to see the subsequently added Job 2?
Does MySQL only show a snapshot of the DB to a given connection as long as it is held open?
I've also tried resetting the entityManager before making the second call to findOneBy to no avail.
Thanks for any advice in advance, this one is really stumping me.
Update:
I've created a single-process test case to rule out whether or not it was the concurrency causing the problem, and the test case executes as expected. It seems the only time the repository can't find Job 2 is when it is added to the DB by another process.
// Job 1 already exists
$job = $this->doctrine
    ->getRepository('AcmeJobBundle:Job')
    ->findOneById(1);
$job->getId(); // this is fine.

$em->persist(new Job()); // creates job 2
$em->flush();

$job = $this->doctrine
    ->getRepository('AcmeJobBundle:Job')
    ->findOneById(2);
$job->getId(); // this is fine too, no exception.
Perhaps one process tries to load the entity before it has been saved by the second process.
Doctrine caches loaded entities by their id, so when you make a second request for the same object it is loaded without another query to the database. You can read more about the Doctrine IdentityMap here.
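If a stale identity map or a long-lived MySQL connection snapshot is the culprit, a common mitigation in long-running workers is sketched below; this is an assumption about a possible fix, not something confirmed in the question or answers (the asker notes that resetting the EntityManager alone didn't help, which points more toward the connection snapshot). Clearing the EntityManager detaches every managed entity, and reopening the connection forces a fresh transaction snapshot before the next lookup:

// Inside the worker's job handler, before looking the Job up again.
// $em is the Doctrine EntityManager; all names are illustrative.
$em->clear();                   // empty the identity map / detach all entities
$em->getConnection()->close();  // optionally drop the old MySQL connection...
$em->getConnection()->connect(); // ...so the next query sees a fresh snapshot

$job = $em->getRepository('AcmeJobBundle:Job')
          ->findOneById($gearmanJob->workload());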
