Symfony2 background task

I would like to create a background task that loops continuously in Symfony2.
How can I do this?
protected function execute(InputInterface $input, OutputInterface $output)
{
    while (true) {
        sleep(60);
        // ------------ do some work
    }
}

As far as I know, PHP doesn't have threading, so you won't be able to run that in parallel with your Symfony application.
Instead, create a cron job which executes that code via CLI.
Here's a pretty straightforward intro to cron jobs.
If you're on a Windows server, set up a scheduled task. See How to run a PHP file in a scheduled task (Windows Task Scheduler).
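A minimal sketch of the cron approach for Symfony2 (all class and command names here are placeholders, and the actual work is left as a comment): wrap one iteration of the loop body in a console command, and let cron do the repeating instead of `while (true)`.

```php
<?php
// src/Acme/AppBundle/Command/DoWorkCommand.php -- hypothetical location and name

namespace Acme\AppBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class DoWorkCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this
            ->setName('app:do-work')
            ->setDescription('Runs one iteration of the background work');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        // ------------ do the work that was inside the while loop
    }
}
```

A crontab entry such as `* * * * * php /path/to/project/app/console app:do-work` then runs it once a minute, matching the sleep(60) in the original loop.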

Related

Job Sensors in Databricks Workflows

At the moment we schedule our Databricks notebooks using Airflow. Due to dependencies between projects, there are dependencies between DAGs. Some DAGs wait until a task in a previous DAG is finished before starting (by using sensors).
We are now looking to use Databricks DBX. It is still new for us, but it seems that dbx's main added value is when you use Databricks workflows. It would be possible to run a Python wheel in a job that was created by dbx. My question is now: is it possible to add dependencies between Databricks jobs? Can we create 2 different jobs using dbx, and make the second job wait until the first one is completed?
I am aware that I can have dependencies between tasks in one job, but in our case it is not possible to have only one job with all the tasks.
I was thinking about adding a notebook/python script before the wheel with ETL logic. This notebook would check then if the previous job is finished. Once this is the case, the task with the wheel will be executed. Does this make sense, or are there better ways? Is something like the ExternalTaskSensor in Airflow available within Databricks workflows?
Or is there a good way to use DBX without DB workflows?
Author of dbx here.
TL;DR - dbx is not opinionated in terms of the orchestrator choice.
"It is still new for us, but it seems that DBX' main added value is when you use Databricks workflows. It would be possible to run a Python wheel in a job that was created by DBX."
The short answer is yes, but it's done on the tasks level (read more here on the difference between workflow and task).
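As an illustration (the job, task, and package names below are made up), a multi-task job in the Databricks Jobs 2.1 API expresses this with depends_on at the task level:

```json
{
  "name": "my-workflow",
  "tasks": [
    {
      "task_key": "etl",
      "python_wheel_task": {"package_name": "my_package", "entry_point": "etl"}
    },
    {
      "task_key": "report",
      "depends_on": [{"task_key": "etl"}],
      "python_wheel_task": {"package_name": "my_package", "entry_point": "report"}
    }
  ]
}
```

So if the two pieces of work can live as two tasks in one workflow, the dependency comes for free.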
Another approach, if you still need (or want) to use Airflow, is the following:
Deploy and update your jobs from your CI/CD pipeline with dbx deploy commands.
In Airflow, use the Databricks Operator to launch the job (either by name or by id).
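If you instead chain two separate jobs (the notebook-sensor idea from the question), the check can be as simple as polling the run's life_cycle_state until it is terminal. Below is a minimal, dependency-free sketch of just the polling logic; the state-fetching callable is yours to supply, e.g. a small wrapper around the GET /api/2.1/jobs/runs/get endpoint:

```python
import time

# Life-cycle states the Databricks Jobs API reports as final
TERMINAL_STATES = {"TERMINATED", "SKIPPED", "INTERNAL_ERROR"}

def wait_for_run(get_life_cycle_state, poll_seconds=30, timeout_seconds=3600):
    """Poll a run's life_cycle_state until it reaches a terminal state.

    get_life_cycle_state: zero-argument callable returning the current
    state string (e.g. a wrapper around GET /api/2.1/jobs/runs/get).
    Returns the terminal state, or raises TimeoutError.
    """
    deadline = time.monotonic() + timeout_seconds
    while True:
        state = get_life_cycle_state()
        if state in TERMINAL_STATES:
            return state
        if time.monotonic() >= deadline:
            raise TimeoutError(
                "run did not finish within %s seconds" % timeout_seconds
            )
        time.sleep(poll_seconds)
```

The task holding the wheel would call this first and only proceed when the upstream run reports TERMINATED.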

Scheduling hangfire jobs in a different project to where they are executed

I have 2 .net core web projects.
One of them is called ScheduledJobs and it uses Hangfire with the dashboard to both schedule and process jobs.
The other is called ClientWebsite and it schedules the jobs only - I don't want them executing here!
ScheduledJobs works fine, if I schedule anything from there it picks them up and processes them.
But since I need to be able to schedule jobs from ClientWebsite too, I have to have the following settings in Startup:
services.AddHangfire(x => x.UseSqlServerStorage(Configuration.GetConnectionString("DefaultConnection")));
services.AddHangfireServer();
If I don't call services.AddHangfireServer it won't even let me schedule them.
But if I add it, then it processes them too, which I don't want!
Please help! Thanks
You shouldn't need to register the Hangfire server at all in the second project.
If you purely want to queue jobs from it, you can use the GlobalConfiguration to set up which database it should point at, similar to:
GlobalConfiguration.Configuration.UseSqlServerStorage(Configuration.GetConnectionString("DefaultConnection"));
Once you have done this you can register a BackgroundJobClient similar to this (this is taken from an Autofac example, so depending on your DI container it won't be exactly the same as the line below):
builder.RegisterType<BackgroundJobClient>().As<IBackgroundJobClient>();
What this then allows you to do is resolve and enqueue jobs using the IBackgroundJobClient in your application without setting up a hangfire server at all.
In your classes where you want to enqueue jobs, you can then simply resolve an instance of IBackgroundJobClient and make use of the Enqueue method, such as:
_myClient.Enqueue<MyJobClass>(x => x.APublicMethodOnMyJobClass());
Details can be found in the BackgroundJobClient documentation.
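Putting those pieces together, the client-only wiring in ClientWebsite could look roughly like this (a sketch; the DI registration will differ per container, and BackgroundJobClient's parameterless constructor picks up the storage configured via GlobalConfiguration):

```csharp
// ClientWebsite startup: configure storage only -- no AddHangfireServer()
GlobalConfiguration.Configuration
    .UseSqlServerStorage(Configuration.GetConnectionString("DefaultConnection"));

// Register the client so jobs can be enqueued here but never processed here
services.AddSingleton<IBackgroundJobClient, BackgroundJobClient>();
```

Jobs enqueued through IBackgroundJobClient in ClientWebsite are then picked up and processed by the server running in ScheduledJobs.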

Modify the task schedule in Airflow

I want to modify the schedule of a task I created in a dags/ folder through the Airflow UI. I can't find a way to modify the schedule through the UI. Can it be done there, or only by modifying the Python script?
The only way to change it is through the code. The schedule is part of the DAG definition (like tasks and dependencies), so it cannot be changed through the web interface.
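Concretely, the schedule lives in the DAG constructor in your Python file, so changing it means editing that argument and letting the scheduler re-parse the file. A sketch, with example values:

```python
from datetime import datetime

from airflow import DAG

dag = DAG(
    dag_id="my_dag",                 # example name
    schedule_interval="0 6 * * *",   # edit this cron expression to change the schedule
    start_date=datetime(2022, 1, 1),
    catchup=False,
)
```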

Doctrine Migrations Bundle - how to run migration on multiple servers at the same time?

This may not be possible at the moment but if anyone had the same problem, how did you handle it?
Is it possible to run the migrations on multiple servers at the same time without running the same scripts multiple times?
The problem I am having is that we are using multiple servers, and each of them runs the migrations every time we deploy a new version of our app. As a result, the same migration scripts are run several times (depending on how many servers are deployed to).
Is there a way to check whether the migration is in progress and if yes, skip it or this is something I would need to implement manually?
Many thanks.
This sounds like something you would need to implement manually.
I suggest having a script that executes once when you deploy your app, SSHes into one of your servers, and runs the migration there.
I would recommend writing an Ansible playbook to handle this, targeting the relevant hosts (inventories).
The end result would be something like (for example):
If you only want to run it on a single host (or a subset):
ansible-playbook --limit YOUR_INVENTORY_NAME run-migrations.yml
Or, for all of them as defined:
ansible-playbook run-migrations.yml
And your actual playbook within Ansible would look something like:
- hosts: all
  tasks:
    - name: Run Migrations
      # --no-interaction skips the confirmation prompt when run unattended
      command: php bin/console doctrine:migrations:migrate --no-interaction
      args:
        chdir: /path/to/symfony

How to create a cron task in Symfony2

I have a little question: how do I create a simple cron task that calls some service action in Symfony and runs automatically each night?
Symfony2 does not manage cron tasks, simply because this is system level. That being said, you can create a command and register it as a cron task.
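For example, once you have a console command wrapping your service call (here app:nightly-task, a made-up name), a crontab entry like this runs it every night:

```
# run at 02:00 every day; adjust the paths to your project
0 2 * * * php /path/to/project/app/console app:nightly-task >> /var/log/nightly-task.log 2>&1
```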
You can use CommandSchedulerBundle
You can also use TaskScheduleBundle to manage your cron jobs within Symfony.
