Airflow: Dynamic SubDag creation

I have a use case where I have a list of clients. Clients can be added to or removed from the list, and they can have different start dates and different initial parameters.
I want to use Airflow to backfill all data for each client based on their initial start date, and to rerun if something fails. I am thinking about creating a SubDag for each client. Will this address my problem?
How can I dynamically create SubDags based on the client_id?

You can definitely create DAG objects dynamically:

from airflow import DAG

def make_client_dag(parent_dag, client):
    return DAG(
        '%s.client_%s' % (parent_dag.dag_id, client.name),
        start_date=client.start_date
    )
You could then use that method in a SubDagOperator from your main dag:

for client in clients:
    SubDagOperator(
        task_id='client_%s' % client.name,
        dag=main_dag,
        subdag=make_client_dag(main_dag, client)
    )
This will create a subdag specific to each member of the collection clients, and each will run for the next invocation of the main dag. I'm not sure if you'll get the backfill behavior you want.
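For reference, here is a rough sketch of the surrounding pieces those snippets assume (the schedule, the Client class, and the dates are my own placeholders, not part of the original answer):

from datetime import datetime
from airflow import DAG
from airflow.operators.subdag_operator import SubDagOperator  # Airflow 1.x import path

# Hypothetical client records; in practice these might come from a config file or a database.
class Client:
    def __init__(self, name, start_date):
        self.name = name
        self.start_date = start_date

clients = [
    Client('acme', datetime(2017, 1, 1)),
    Client('globex', datetime(2017, 6, 1)),
]

main_dag = DAG(
    'clients_main',
    start_date=min(c.start_date for c in clients),
    schedule_interval='@daily',
)

# The loop shown above then attaches one SubDagOperator per client to main_dag.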

Related

How to trigger email for airflow task run

We have an ad-hoc Airflow DAG that anyone on a team of 50+ can trigger manually.
We can check the Airflow audit logs to see who triggered the DAG via its dag id, and we also get an email upon failure.
But we are more curious to know whether we can get an email when the DAG starts, or at the start of each task run; this would help us understand and track the activity, usage, and commands executed from this ad-hoc DAG.
@Khilesh Chauhan
There are a number of ways to achieve your intended outcome, in no particular order:
1. Task or Dag Level Callbacks
Official Callback Reference
We can take advantage of the on_success_callback callback, which can be harnessed in two distinct places.
# use a function inside a specific PythonOperator, for task-level control
task = PythonOperator(
    task_id='your_task',
    python_callable=your_python_callable,  # placeholder for your task's actual callable
    on_success_callback=send_mail,
)

# use it inside your DAG initiation (a DAG-level on_failure_callback works the same way)
dag = DAG(
    dag_id='your_dag',
    on_failure_callback=send_mail
)
We can write an example send_mail function, which leverages the send_email utility. Note that Airflow passes the context to callbacks as a single positional argument.

from airflow.utils.email import send_email

def send_mail(context):
    task = context['task_instance'].task
    subject = f'Airflow task has successfully completed {task.task_id}'
    body = f'Hi, this is an alert to let you know that your task {task.task_id} has completed successfully.'
    send_email(
        context['dag'].default_args['email'],
        subject,
        body
    )
2. Add an EmailOperator to your DAG
Official Email Operator Reference
You could add an EmailOperator task at the beginning of your DAG.

from airflow.operators.email_operator import EmailOperator

email = EmailOperator(
    task_id='alert_DAG_start',
    to='your@email.com',
    subject='DAG Initiated - start {{ ds }}',
    html_content=""" <h1>Some Content</h1> """
)
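If you go this route, the alert task would sit at the head of the DAG so it fires before any real work, e.g. email >> first_real_task (where first_real_task is a placeholder for whatever your DAG actually starts with).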
3. Use a PythonOperator that executes send_email
You might need more control, such as including logging info; in that case, wrap the call in your own function and run it with a PythonOperator.
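As a rough illustration of option 3, here is a minimal sketch (my own example under assumptions, not code from the original answer; the callable name, address, and task_id are placeholders):

import logging
from airflow.operators.python_operator import PythonOperator
from airflow.utils.email import send_email

def notify_start(**context):
    # Log whatever extra detail you need, then send the mail yourself.
    dag_id = context['dag'].dag_id
    execution_date = context['execution_date']
    logging.info('DAG %s started for %s', dag_id, execution_date)
    send_email(
        to='team@example.com',  # placeholder address
        subject=f'DAG {dag_id} started',
        html_content=f'Run for {execution_date} has started.'
    )

notify = PythonOperator(
    task_id='notify_DAG_start',
    python_callable=notify_start,
    provide_context=True,  # only needed on Airflow 1.10.x
    dag=dag
)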
I hope this helps you to resolve your problem.
Update
To answer your second question about getting the username, I have created a function for you to use. We import the create_session context manager, then use the .query method; for debugging purposes I loop through the result tuple, and you can see the username at index 3.
from airflow.models.log import Log
from airflow.utils.db import create_session

def return_user_name(**context):
    """
    Return the username for the executed task's DAG (from the 'trigger' audit-log event).
    """
    dag_id = context['task_instance'].dag_id
    with create_session() as session:
        result = session.query(Log.dttm, Log.dag_id, Log.execution_date, Log.owner, Log.extra).filter(Log.dag_id == dag_id, Log.event == 'trigger').first()
        for index, value in enumerate(result):
            print(index, value)
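If it helps, this is how that function could be wired into a task (the task_id here is my own placeholder):

log_user = PythonOperator(
    task_id='log_triggering_user',
    python_callable=return_user_name,
    provide_context=True,  # required on Airflow 1.10.x, deprecated but harmless on 2.x
    dag=dag
)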

Create dynamic workflows in Airflow with XCOM value

Now, I create multiple tasks using a variable like this and it works fine.
with DAG(....) as dag:
    body = Variable.get("config_table", deserialize_json=True)
    for i in range(len(body.keys())):
        simple_task = Operator(
            task_id='task_' + str(i),
            .....
But I need to use an XCom value for some reason instead of a Variable.
Is it possible to dynamically create tasks with an XCom pull value?
I tried to set the value like this and it doesn't work:
body = "{{ ti.xcom_pull(key='config_table', task_ids='get_config_table') }}"
It's possible to dynamically create tasks from XComs generated by a previous task; there are more extensive discussions on this topic, for example in this question. One of the suggested approaches follows this structure; here is a working example I made:
sample_file.json:
{
    "cities": [ "London", "Paris", "BA", "NY" ]
}
Get your data from an API or file or any source. Push it as XCom.
def _process_obtained_data(ti):
    list_of_cities = ti.xcom_pull(task_ids='get_data')
    Variable.set(key='list_of_cities',
                 value=list_of_cities['cities'], serialize_json=True)

def _read_file():
    with open('dags/sample_file.json') as f:
        data = json.load(f)
    # push to XCom using return
    return data

with DAG('dynamic_tasks_example', schedule_interval='@once',
         start_date=days_ago(2),
         catchup=False) as dag:

    get_data = PythonOperator(
        task_id='get_data',
        python_callable=_read_file)
Add a second task which will pull from XCom and set a Variable with the data you will use to iterate later on.

    preparation_task = PythonOperator(
        task_id='preparation_task',
        python_callable=_process_obtained_data)
Of course, you can merge both tasks into one if you want. I prefer not to, because usually I take a subset of the fetched data to create the Variable.
Read from that Variable and later iterate on it. It's critical to define default_var.
    end = DummyOperator(
        task_id='end',
        trigger_rule='none_failed')

    # Top-level code within the DAG block
    iterable_list = Variable.get('list_of_cities',
                                 default_var=['default_city'],
                                 deserialize_json=True)
Declare dynamic tasks and their dependencies within a loop. Make the task_ids unique. TaskGroup is optional, but it helps keep the UI organized.
    with TaskGroup('dynamic_tasks_group',
                   prefix_group_id=False,
                   ) as dynamic_tasks_group:
        if iterable_list:
            for index, city in enumerate(iterable_list):
                say_hello = PythonOperator(
                    task_id=f'say_hello_from_{city}',
                    python_callable=_print_greeting,
                    op_kwargs={'city_name': city, 'greeting': 'Hello'}
                )
                say_goodbye = PythonOperator(
                    task_id=f'say_goodbye_from_{city}',
                    python_callable=_print_greeting,
                    op_kwargs={'city_name': city, 'greeting': 'Goodbye'}
                )

                # TaskGroup level dependencies
                say_hello >> say_goodbye

    # DAG level dependencies
    get_data >> preparation_task >> dynamic_tasks_group >> end
DAG Graph View: (graph screenshot not reproduced here)
Imports:
import json
from airflow import DAG
from airflow.utils.dates import days_ago
from airflow.models import Variable
from airflow.operators.python_operator import PythonOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.task_group import TaskGroup
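Note that _print_greeting, referenced by the dynamic tasks above, is not part of the snippet; a minimal version might look like this (my assumption of what it does):

def _print_greeting(city_name, greeting):
    # The dynamic tasks pass city_name and greeting through op_kwargs.
    print(f'{greeting} from {city_name}!')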
Things to keep in mind:
If you have simultaneous dag_runs of this same DAG, all of them will use the same variable, so you may need to make it 'unique' by differentiating their names.
You must set a default value when reading the Variable; otherwise, the first execution may not be parseable by the Scheduler.
The Airflow Graph View UI may not refresh the changes immediately. This happens especially in the first run after adding or removing items from the iterable the dynamic task generation is based on.
If you need to read from many variables, it's important to remember that it's recommended to store them in one single JSON value to avoid constantly creating connections to the metadata database (example in this article).
Good luck!
Edit:
Another important point to take into consideration:
With this approach, the call to the Variable.get() method is top-level code, so it is executed by the scheduler every 30 seconds (the default of the min_file_process_interval setting). This means that a connection to the metadata DB will happen each time.
Edit:
Added an if clause to handle the empty iterable_list case.
This is not possible, and in general dynamic tasks are not recommended:
The Airflow scheduler works by reading the DAG file, loading the tasks into memory, and then checking which DAGs and which tasks it needs to schedule. XComs are runtime values tied to a specific DAG run, so the scheduler cannot rely on XCom values.
When using dynamic tasks you make debugging much harder for yourself, as the values you use to create the DAG can change and you'll lose access to logs without even understanding why.
What you can do instead is use a branch operator: keep those tasks always defined and just skip them based on the XCom value.
For example:
from airflow.operators.python import BranchPythonOperator
from airflow.operators.dummy import DummyOperator

def branch_func(**context):
    # replace 'your_key' with the XCom key you actually push
    return f"task_{context['ti'].xcom_pull(key='your_key')}"

branch = BranchPythonOperator(
    task_id="branch",
    python_callable=branch_func
)

# DummyOperator stands in for the real tasks here
tasks = [DummyOperator(task_id=f"task_{i}") for i in range(3)]

branch >> tasks
In some cases this method is also not a good fit (for example, when there are 100 possible tasks); in those cases I'd recommend writing your own operator or using a single PythonOperator, as sketched below.
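For the many-possible-tasks case mentioned above, a single PythonOperator that iterates over the XCom value at runtime is often the simpler alternative; here is a rough sketch using the names from the question (the task_id and fallback are mine):

def process_all(**context):
    # Pull the pushed config once and handle every item inside one task,
    # instead of generating one task per item at parse time.
    body = context['ti'].xcom_pull(task_ids='get_config_table', key='config_table') or {}
    for key, value in body.items():
        print('processing', key, value)

process = PythonOperator(
    task_id='process_all_items',
    python_callable=process_all,
    provide_context=True,
    dag=dag
)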

Airflow Access Variable From Previous Python Operator

I'm new to Airflow and I'm currently building a DAG that will execute a PythonOperator, a BashOperator, and then another PythonOperator structured like this:
def authenticate_user(**kwargs):
    ...
    list_prev = [...]

AUTHENTICATE_USER = PythonOperator(
    task_id='AUTHENTICATE_USER',
    python_callable=authenticate_user,
    provide_context=True,
    dag=dag)

CHANGE_ROLE = BashOperator(
    task_id='CHANGE_ROLE',
    bash_command='...',
    dag=dag)

def calculations(**kwargs):
    list_prev
    ...

CALCULATIONS = PythonOperator(
    task_id='CALCULATIONS',
    python_callable=calculations,
    provide_context=True,
    dag=dag)
My issue is that I create a list of variables in the first PythonOperator (AUTHENTICATE_USER) that I would like to use later in my second PythonOperator (CALCULATIONS), after executing the BashOperator (CHANGE_ROLE). Is there a way for me to carry that created list over to other PythonOperators in my current DAG?
Thank you
I can think of 3 possible ways (to avoid confusion with Airflow's concept of Variables, I'll call the data that you want to share between tasks values):
1. Airflow XComs: push your values from the AUTHENTICATE_USER task and pull them in your CALCULATIONS task. You can either publish and access each value separately or wrap them all in a Python dict or list (better, as it reduces DB reads and writes). A minimal sketch of this option follows after this list.
2. External system: persist your values from the first task into some external system such as a database, files, or S3 objects, and access them from downstream tasks when needed.
3. Airflow Variables: this is a specific case of point (2) above (as Variables are stored in Airflow's backend meta-db). You can programmatically create, modify, or delete Variables by exploiting the underlying SQLAlchemy model. See this for hints.
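A minimal sketch of option 1 above, using the task names from the question (the example values are placeholders; provide_context is only needed on Airflow 1.10.x):

def authenticate_user(**kwargs):
    # ... your authentication logic ...
    list_prev = ['value_1', 'value_2']  # whatever you build here
    return list_prev  # the returned value is pushed to XCom automatically

def calculations(**kwargs):
    # Pull the list that AUTHENTICATE_USER returned, even though CHANGE_ROLE ran in between.
    list_prev = kwargs['ti'].xcom_pull(task_ids='AUTHENTICATE_USER')
    # ... use list_prev ...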

Finding out what triggered a task run programmatically

Is there a way to programmatically determine what triggered the current task run of the PythonOperator from inside of the operator?
I want to differentiate between the task runs triggered on schedule, those catching up, and those triggered by the backfill CLI command.
The template context contains two variables, dag_run and run_id, that you can use to determine whether the run was scheduled, a backfill, or externally triggered.
from airflow import jobs
def python_target(**context):
    is_backfill = context["dag_run"].is_backfill
    is_external = context["dag_run"].external_trigger
    is_latest = context["execution_date"] == context["dag"].latest_execution_date
    # More code...
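For completeness, a short sketch of wiring that callable into a task (the task_id and dag object are assumptions):

check_trigger = PythonOperator(
    task_id='check_trigger_source',
    python_callable=python_target,
    provide_context=True,  # only needed on Airflow 1.10.x
    dag=dag
)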

Apache Airflow - How to retrieve dag_run data outside an operator in a flow triggered with TriggerDagRunOperator

I set up two DAGs; let's call the first one orchestrator and the second one worker. The orchestrator's job is to retrieve a list from an API and, for each element in this list, trigger the worker DAG with some parameters.
The reason why I separated the two workflows is that I want to be able to replay only the "worker" workflows that fail (if one fails, I don't want to replay all the worker instances).
I was able to make things work, but now I see how hard it is to monitor, as my task_ids are the same for all workers, so I decided to have dynamic task_ids based on a value retrieved from the API by the "orchestrator" workflow.
However, I am not able to retrieve the value from the dag_run object outside an operator. Basically, I would like this to work :
with models.DAG('specific_workflow', schedule_interval=None, default_args=default_dag_args) as dag:
    name = context['dag_run'].name
    hello_world = BashOperator(task_id='hello_{}'.format(name), bash_command="echo Hello {{ dag_run.conf.name }}", dag=dag)
    bye = BashOperator(task_id='bye_{}'.format(name), bash_command="echo Goodbye {{ dag_run.conf.name }}", dag=dag)
    hello_world >> bye
But I am not able to define this "context" object. However, I am able to access it from an operator (PythonOperator and BashOperator for instance).
Is it possible to retrieve the dag_run object outside an operator ?
Yup, it is possible.
What I tried, and what worked for me, is shown below. In the following code block I am trying to show all possible ways to use the configurations passed directly to different operators:
pyspark_task = DataprocSubmitJobOperator(
    task_id="task_0001",
    job=PYSPARK_JOB,
    location=f"{{{{dag_run.conf.get('dataproc_region', '{config_data['cluster_location']}')}}}}",
    project_id="{{dag_run.conf['dataproc_project_id']}}",
    gcp_conn_id="{{dag_run.conf.gcp_conn_id}}"
)
So you can use it either like
"{{dag_run.conf.field_name}}" or "{{dag_run.conf['field_name']}}"
Or, if you want to use a default value in case the configuration field is optional:
f"{{{{dag_run.conf.get('field_name', '{local_variable['field_name_0002']}')}}}}"
I don't think it's easily possible currently. For example, as part of the worker run process, the DAG is retrieved without any TaskInstance context provided besides where to find the DAG: https://github.com/apache/incubator-airflow/blob/f18e2550543e455c9701af0995bc393ee6a97b47/airflow/bin/cli.py#L353
The context is injected later: https://github.com/apache/incubator-airflow/blob/c5f1c6a31b20bbb80b4020b753e88cc283aaf197/airflow/models.py#L1479
The run_id of the DAG would be a good place to store this information.
