Is there any hook to run a task when a DAG is stopped from the Airflow UI?
Something different from the callbacks offered on a task? There are the on_success_callback, on_failure_callback, and on_retry_callback.
I followed the docs and created a Slack alert function:
It does work and I get notifications in the channel, but the task name and the log link belong to another task, not the one that actually failed.
It gets the context of the upstream failed task, but not the failed task itself:
I tried different operators and hooks, but got the same result.
If anyone could help, I would really appreciate it.
Thank you!
The goal of the on_failure_callback argument at the DAG level is to run the callback once when the DagRun fails, so we provide the context of the DagRun, which is identical between the task instances; because of that, we don't care which task instance's context we provide (I think we provide the context of the last task defined in the DAG, regardless of its state).
If you want to run the callback on each failed task instance, remove the on_failure_callback argument from the DAG and add it to the default args instead: default_args=dict(on_failure_callback=task_fail_slack_alert).
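For illustration, a minimal sketch of such a per-task callback wired through default_args (the connection id and message format are assumptions, the exact SlackWebhookOperator parameters vary by provider version, and this is not the poster's code):

```python
from airflow.providers.slack.operators.slack_webhook import SlackWebhookOperator

def task_fail_slack_alert(context):
    # In a per-task callback, the context belongs to the task instance
    # that actually failed.
    ti = context["task_instance"]
    alert = SlackWebhookOperator(
        task_id="slack_fail_alert",
        # assumed connection id; older provider versions use
        # http_conn_id/webhook_token instead of slack_webhook_conn_id
        slack_webhook_conn_id="slack_webhook",
        message=f":red_circle: Task {ti.task_id} failed.\nLog: {ti.log_url}",
    )
    return alert.execute(context=context)

# Every task inheriting these default args reports its own failure:
default_args = dict(on_failure_callback=task_fail_slack_alert)
```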
I need some code to be executed only if a task has been manually cleared and restarted.
Therefore my question: how can I find out whether the task is currently on a retry during its run? Does Airflow set any attributes on the task or DAG when I clear a task?
Thanks!
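A hedged sketch (not from this thread) of one way to check this: Airflow does not set a "manually cleared" flag, but the task instance's try_number in the context reflects which attempt is currently running. Note that it cannot distinguish an automatic retry from a manual clear-and-rerun:

```python
def my_callable(**context):
    ti = context["ti"]  # the current TaskInstance
    if ti.try_number > 1:
        # Not the first attempt: either an automatic retry or a manual
        # clear-and-rerun (Airflow does not distinguish the two here).
        print(f"Re-run detected: attempt {ti.try_number}")
    else:
        print("First attempt")

# Wire this up as the python_callable of a PythonOperator inside your DAG.
```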
I have a running DAG. Can I stop it using the REST API?
There is an endpoint to delete a DAG run:
https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/delete_dag_run
But is there no REST API call to stop a running DAG?
Thanks!
You can set the running tasks to failed with Set a state of task instances. Note that this will stop the task from running and set its status to failed, but it will invoke neither the on_failure_callback, if you set one on your tasks, nor the on_kill method of the operator (if there is one).
If you want the DAG to not schedule any more tasks, you can set it to paused with Update a DAG.
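A rough sketch of both calls against the stable REST API (the URL, credentials, and dag/task/run ids are assumptions; older Airflow versions take execution_date instead of dag_run_id in the first call):

```python
import requests

AIRFLOW = "http://localhost:8080/api/v1"  # assumed webserver URL
AUTH = ("admin", "admin")                 # assumed basic-auth credentials

# "Set a state of task instances": mark a running task instance as failed
requests.post(
    f"{AIRFLOW}/dags/my_dag/updateTaskInstancesState",
    auth=AUTH,
    json={
        "dry_run": False,
        "task_id": "my_task",
        "dag_run_id": "manual__2023-01-01T00:00:00+00:00",
        "include_upstream": False,
        "include_downstream": False,
        "include_future": False,
        "include_past": False,
        "new_state": "failed",
    },
)

# "Update a DAG": pause it so no further tasks are scheduled
requests.patch(f"{AIRFLOW}/dags/my_dag", auth=AUTH, json={"is_paused": True})
```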
I have a task in Airflow that uses the BashOperator and runs continuously.
Is there any way to mark the task instance as success before its next scheduled run? I know how to mark it as success via the API.
Thank you for your replies
If you just want to call the API automatically to mark the previous DAG run as success, you can use SimpleHttpOperator as the first task of your DAG. That operator can call the Airflow REST API to mark the previous DAG run as success.
https://airflow.apache.org/docs/apache-airflow-providers-http/stable/operators.html#simplehttpoperator
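A minimal sketch of such a first task (the connection id, DAG id, and previous run id are assumptions; the PATCH dagRuns endpoint that accepts a state requires a reasonably recent Airflow 2 release):

```python
from airflow.providers.http.operators.http import SimpleHttpOperator

mark_prev_success = SimpleHttpOperator(
    task_id="mark_prev_dagrun_success",
    http_conn_id="airflow_api",  # assumed connection pointing at the webserver
    method="PATCH",
    # <previous_run_id> is a placeholder: compute or template the run id of
    # the previous DAG run yourself.
    endpoint="api/v1/dags/my_dag/dagRuns/<previous_run_id>",
    data='{"state": "success"}',
    headers={"Content-Type": "application/json"},
)
```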
I have a task that listens for certain events and kicks off other functions.
This function (the listener) subscribes to a Kafka topic and runs forever, or at least until it gets a 'stop' event.
Wrapping this as an Airflow operator doesn't seem to work properly.
Meaning, if I send the stop event, it does not process it, or anything else for that matter.
Is it possible to run busy-loop functions in Airflow?
No, do not run infinite loops in an Airflow task.
Airflow is designed as a batch processor; long- or infinite-running tasks run counter to its entire scheduling and processing model, and while it might "work", it will lock up a task runner slot.