Airflow: pass Jinja template to DAG parameters

I am trying to pass a Jinja template to the DAG constructor, since Airflow best practices suggest not calling Variables outside the execute method.
Below is the code snippet.
dag = DAG(
    dag_id=dag_id,
    schedule_interval=schedule_interval,
    dagrun_timeout=timedelta(hours=max_dagrun),
    template_searchpath='{{var.value.sql_path}}')
But Airflow fails to parse this.
Any suggestions on how to pass this type of variable? These are passed to the DAG itself, not to Airflow operators.
Thanks in advance.
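One workaround, sketched under the assumption that a parse-time Variable read is acceptable here: DAG constructor arguments are not templated fields (Jinja is only rendered in an operator's template_fields), so the value has to be resolved when the file is parsed, for example via Variable.get with a default so parsing stays robust:

from airflow.models import Variable

# Hedged sketch: resolve the search path at parse time instead of via Jinja.
# The default path is an illustrative assumption.
sql_path = Variable.get("sql_path", default_var="/opt/airflow/sql")

dag = DAG(
    dag_id=dag_id,
    schedule_interval=schedule_interval,
    dagrun_timeout=timedelta(hours=max_dagrun),
    template_searchpath=sql_path,
)

Note that this does query the metadata database on every DagBag parse, which is exactly what the cited best practice warns about; default_var at least keeps parsing from failing when the Variable is missing.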

Related

Airflow - Notification on failure of the whole dag

I'm trying to implement a way to get notified when my DAG fails.
I tried email_on_failure and a webhook method (https://code.mendhak.com/Airflow-MS-Teams-Operator/).
But with both of them, I got a notification for every task that failed.
Is there a way to get notified only if the whole DAG fails?
I really appreciate any help you can provide.
You can set on_failure_callback at the operator level or at the DAG level.
On DAG - a function to be called when a DagRun of this DAG fails.
On Operator - a function to be called when a task instance of this task fails.
In your case, you need to set on_failure_callback on your DAG object:
dag = DAG(
    dag_id=dag_id,
    on_failure_callback=func_to_execute,
)
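For reference, a minimal sketch of what func_to_execute itself might look like; a DAG-level callback receives the task context dict, and the notification call below is a placeholder you would swap for your Teams webhook or email:

def func_to_execute(context):
    # Called once when the DagRun fails, not once per failed task.
    dag_run = context.get("dag_run")
    print(f"DAG {dag_run.dag_id} failed for run {dag_run.run_id}")
    # replace the print with your webhook/email notification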

Is there a way to pass a parameter to an airflow dag when triggering it manually

I have an Airflow DAG that is triggered externally via the CLI.
I have a requirement to change the order of execution of tasks based on a Boolean parameter which I would get from the CLI.
How do I achieve this?
I understand dag_run.conf can only be used in a template field of an operator.
Thanks in advance.
You cannot change task dependencies with a runtime parameter.
However, you can pass a runtime parameter (via dag_run.conf) whose value determines whether tasks are executed or skipped. For that, you need operators in your workflow that can handle this logic, for example ShortCircuitOperator or BranchPythonOperator.
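A minimal sketch, assuming a Boolean key named run_extra in dag_run.conf and two existing downstream tasks extra_task and normal_task (all names illustrative):

from airflow.operators.python import BranchPythonOperator

def choose_branch(**context):
    # Scheduled runs carry no conf, so default to the normal path.
    conf = context["dag_run"].conf or {}
    return "extra_task" if conf.get("run_extra") else "normal_task"

branch = BranchPythonOperator(
    task_id="branch_on_conf",
    python_callable=choose_branch,
    dag=dag,
)
branch >> [extra_task, normal_task]

Tasks on the branch that is not chosen are marked skipped rather than failed.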

How to run an Airflow DAG that is defined in a test?

I am trying to write a test case where I:
instantiate a collection of (Python)Operators (patching some of their dependencies with unittest.mock.patch)
arrange those Operators in a DAG
run that DAG
make assertions about the calls to various mocked downstreams
I see from here that running a DAG is not as simple as calling dag.run - I should instantiate a local_client and call trigger_dag on that. However, the resultant code constructs its own DagBag and does not accept any parameter that allows me to pass in my manually-constructed DAG - so I cannot see how to run this DAG with local_client.
I see a couple of options here:
I could declare my testing DAG in a separate folder, as specified by DagModel.get_current(dag_id).fileloc, so that my DAG will get picked up by trigger_dag and run - but this seems pretty indirect, and I also doubt that I'd be able to cleanly reference the injected mocks from my test code.
I could directly call api.common.experimental.trigger_dag._trigger_dag, which has a dag_bag argument. Both the experimental in the name and the underscore-prefixed method name suggest that this would be A Bad Idea.
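A further option, assuming Airflow 2.5 or newer (an assumption about your version): DAG objects expose a dag.test() method that runs the whole DAG in-process, with no DagBag or local_client plumbing. A rough sketch, where my_package.send_downstream stands in for a mocked dependency:

import datetime
from unittest import mock

from airflow import DAG
from airflow.operators.python import PythonOperator

import my_package  # hypothetical module owning the downstream call


def business_logic():
    my_package.send_downstream("payload")


with mock.patch("my_package.send_downstream") as mocked:
    with DAG(
        dag_id="under_test",
        start_date=datetime.datetime(2023, 1, 1),
        schedule=None,
    ) as dag:
        PythonOperator(task_id="step", python_callable=business_logic)
    dag.test()  # runs all tasks in-process, respecting dependencies
    mocked.assert_called_once_with("payload")

Because everything runs in the test's own process, the patches stay active while the tasks execute, so the mocks remain reachable for assertions.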

Airflow - disable DAG after X consecutive fails

I read the API reference and couldn't find anything on it. Is this possible?
Currently there is no such feature out of the box, but you can write custom code in your DAG to get around this. For example, use a PythonOperator (or a MySQL operator, if your metadata db is MySQL) to get the status of the last X runs of the DAG.
Then use a BranchPythonOperator to check whether the number of consecutive failures has reached X; if it has, use a BashOperator to run the airflow pause CLI command.
You can also make this a two-step task by folding the PythonOperator's logic into the BranchPythonOperator. This is just one idea; you can use different logic.
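A hedged sketch of that idea, using DagRun.find to read the metadata DB (the run-ordering logic and the Airflow 2 CLI and import paths below are assumptions; adjust to your version):

from airflow.models import DagRun
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator  # EmptyOperator on 2.3+
from airflow.operators.python import BranchPythonOperator

MAX_FAILS = 3  # the "X" from the answer

def check_recent_runs(**context):
    # Only count completed runs; the run executing this task is still "running".
    done = [r for r in DagRun.find(dag_id=context["dag"].dag_id)
            if r.state in ("success", "failed")]
    recent = sorted(done, key=lambda r: r.execution_date)[-MAX_FAILS:]
    if len(recent) == MAX_FAILS and all(r.state == "failed" for r in recent):
        return "pause_dag"
    return "keep_going"

check = BranchPythonOperator(
    task_id="check_recent_runs",
    python_callable=check_recent_runs,
    dag=dag,
)
pause = BashOperator(
    task_id="pause_dag",
    bash_command="airflow dags pause {{ dag.dag_id }}",  # `airflow pause <dag_id>` on 1.x
    dag=dag,
)
keep_going = DummyOperator(task_id="keep_going", dag=dag)
check >> [pause, keep_going]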

Passing parameters to Airflow's jobs through UI

Is it possible to pass parameters to Airflow's jobs through the UI?
AFAIK, the 'params' argument of a DAG is defined in Python code, so it can't be changed at runtime.
Depending on what you're trying to do, you might be able to leverage Airflow Variables. These can be defined or edited in the UI under the Admin tab. Your DAG code can then read the value of the variable and pass it to the DAG(s) it creates.
Note, however, that although Variables let you decouple values from code, all runs of a DAG will read the same value for the variable. If you want different runs to receive different values, your best bet is probably to use Airflow templating macros and differentiate them with the run_id macro or similar.
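For instance, a minimal sketch of reading a UI-managed Variable at parse time (the key env_name and its default are illustrative assumptions):

from airflow import DAG
from airflow.models import Variable

# "env_name" would be defined under Admin > Variables in the UI; every
# run of the DAG sees whatever value is current at parse time.
env = Variable.get("env_name", default_var="dev")

dag = DAG(dag_id=f"my_pipeline_{env}", schedule_interval="@daily")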
Two ways to change your DAG's behavior:
Use Airflow Variables, as mentioned by Bryan in his answer.
Use Airflow's JSON conf to pass JSON data to a single DAG run. The JSON can be passed from the UI (a manual trigger from the tree view, or Browse > DAG Runs > create new record) or from the CLI:
airflow trigger_dag 'MY_DAG' -r 'test-run-1' --conf '{"exec_date":"2021-09-14"}'
Within the DAG, this JSON can be accessed using Jinja templates or via the context param of an operator's callable function.
def do_some_task(**context):
    print(context['dag_run'].conf['exec_date'])

task1 = PythonOperator(
    task_id='task1_id',
    provide_context=True,
    python_callable=do_some_task,
    dag=dag,
)

# access in templates
task2 = BashOperator(
    task_id="task2_id",
    bash_command="{{ dag_run.conf['exec_date'] }}",
    dag=dag,
)
Note that the JSON conf will not be present during scheduled runs. The best use case for JSON conf is overriding default DAG behavior, so set meaningful defaults in the DAG code and scheduled runs will work without any conf.
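A minimal sketch of such a default, falling back to the built-in ds (execution date) macro when no conf was supplied (key name as in the example above):

def do_some_task(**context):
    # conf is empty or absent on scheduled runs; fall back to ds.
    conf = context["dag_run"].conf or {}
    exec_date = conf.get("exec_date", context["ds"])
    print(exec_date)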
