Airflow: DAG run with execution_date = trigger_date = fixed_schedule

In Airflow, I would like to run a DAG each Monday at 8am (the execution_date should of course be the current Monday at 8am). The relevant parameters for this workflow are:
start_date : "2018-03-19"
schedule_interval : "0 8 * * MON"
I expect to see a DAG run every Monday at 8am, the first one on 2018-03-19 at 8am with execution_date = 2018-03-19-08-00-00, and so on each Monday.
However, that's not what happens: the DAG is not started on 19/03/18 at 8am. The real behaviour is explained, for example, here: https://stackoverflow.com/a/39620901/1510109 or https://stackoverflow.com/a/48213964/1510109
The behaviour is: at the end of each interval (weekly in my case) the DAG is run with execution_date = beginning of the interval (i.e. the previous week). This behaviour is apparently motivated by an "ETL way of thinking" (see the links above). But it's absolutely not what I want.
How can I run my DAG each Monday at 08:00am with execution_date = trigger_date = now (= the current Monday 8am)?
Thanks

Take a quick look at my answer with start times and execution_date examples.
You want to run every Monday at 8am.
So this part is going to stay the same:
schedule_interval: '0 8 * * MON',
You want its first run to be on 2018-03-19. Since the first run occurs at the end of the first full schedule period after the start date, you should change your start date to:
start_date: datetime(2018, 3, 12),
You will have to live with the fact that Airflow will name your DagRuns with the start of each period and pass in macros based on the execution_date set to the start of the interval period. Adjust your logic accordingly.
Your first run will start after 2018-03-19T08:00:00.0Z, and the execution_date, every other macro that depends on it, and the name of the DagRun will be 2018-03-12T08:00:00.0Z.
So long as you understand what to expect from the execution_date and you don't try to base your time off of datetime.now(), your DAGs will be idempotent in operation. Feel free to make a new variable like my_execution_date = execution_date + datetime.timedelta(7) within any PythonOperator or custom operator (you get execution_date from the task context), use template statements like {{ (execution_date + macros.timedelta(7)).strftime('%Y%m%d') }} or {{ macros.ds_add(ds, 7) }}, or use next_execution_date.
You can even add DAG-level user_defined_macros like {'dt': lambda d: d + datetime.timedelta(days=7)} to enable {{ dt(execution_date) }}. More recently, user_defined_filters were added, e.g. {'dt': lambda d: d + datetime.timedelta(days=7)}, enabling {{ execution_date | dt }}. The next_ds and next_execution_date would be easier for your purposes.
While thinking about templating, you may as well read up on the built-in stuff out there: http://jinja.pocoo.org/docs/2.10/templates/#builtin-filters
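For concreteness, here is a minimal sketch of the user_defined_macros approach, assuming the schedule from the question; the DAG id, task id and the fixed 7-day shift are illustrative, not part of the original answer:

import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Shift any datetime forward by one week so templated values line up with
# the Monday on which the run actually fires.
dag = DAG(
    dag_id='weekly_shifted_example',          # illustrative name
    start_date=datetime.datetime(2018, 3, 12),
    schedule_interval='0 8 * * MON',
    user_defined_macros={
        'dt': lambda d: d + datetime.timedelta(days=7),
    },
)

# {{ dt(execution_date) }} renders as the current Monday, not the previous one.
print_date = BashOperator(
    task_id='print_shifted_date',
    bash_command='echo "running for {{ dt(execution_date) }}"',
    dag=dag,
)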

That is how Airflow behaves: it always runs once the interval is completed. The detailed behaviour is described in the Airflow docs and FAQ.
But in order to make it run for the current week, what we can do is manipulate the execution_date inside the DAG. That may take the form of adding 7 days to the datetime object (for a weekly schedule), or using the {{ next_execution_date }} macro.
Admittedly, this only helps if dates are actually used in your DAG or if downstream dependencies are driven by them.
Just to be clear: the DAG still runs according to its normal behaviour. The only thing we are doing is manipulating the date inside the program/DAG.
args = {
    ...,
    'start_date': datetime.datetime(2018, 3, 18),
}

dag = DAG(
    ...,
    schedule_interval='@weekly',
)

# The DAG run for the week of 18 March would fire on 3/25/2018.
# But let's say we manipulate the date here, either with the
# {{ next_execution_date }} macro or by adding 7 days.
# So we are effectively working with the date 3/25/2018 instead of
# 3/18/2018 for the week of 18 March.
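A small sketch of how that manipulation could look in a templated field, assuming an Airflow version that provides next_ds and the dag object sketched above (the task name is illustrative):

from airflow.operators.bash_operator import BashOperator

# For the run covering the week of 3/18/2018, next_ds renders as 2018-03-25,
# i.e. the date on which the run actually fires.
weekly_report = BashOperator(
    task_id='weekly_report',
    bash_command='echo "processing week ending {{ next_ds }}"',
    dag=dag,
)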

I solved it this way:
{{ ds if dag_run.external_trigger or dag_run.is_backfill else macros.ds_add(ds, 1) }}
If the DAG was run by an external trigger, we shouldn't change ds.
If the DAG was run by backfilling, we shouldn't change ds.
If the DAG was scheduled, we use macros to increment it by one day.
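A sketch of how this template might be wired into an operator's templated field (the dag object, task id and command are assumptions for illustration):

from airflow.operators.bash_operator import BashOperator

# ds is shifted forward by one day only for scheduled runs; manual triggers
# and backfills keep the original ds.
adjusted_ds = (
    "{{ ds if dag_run.external_trigger or dag_run.is_backfill "
    "else macros.ds_add(ds, 1) }}"
)

load_file = BashOperator(
    task_id='load_daily_file',
    bash_command=f'echo "loading file for {adjusted_ds}"',
    dag=dag,
)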

Related

schedule a monthly DAG to run on the next weekday

I have to schedule a DAG that should run on the 15th of every month. However, if the 15th falls on a Saturday or Sunday, the DAG should skip the weekend and run on the coming Monday.
For example, May 15 2021 falls on a Saturday. So, instead of running on 15th of May, the DAG should run on 17th, which is Monday.
Can you please help to schedule it in airflow?
Thanks in advance!
The scheduling logic is limited to what you can express with a single cron expression, so if you can't say it in a cron expression you can't get such scheduling in Airflow. For that reason there is an open Airflow improvement proposal, AIP-39 Richer scheduler_interval, to add more scheduling capabilities.
That said, you can still get the desired functionality by writing some code.
You can set your DAG to run on the 15th of each month and then place a sensor that verifies that the day is Mon-Fri (if not, it will wait):
from airflow.sensors.weekday import DayOfWeekSensor

dag = DAG(
    dag_id='work',
    schedule_interval='0 0 15 * *',
    default_args=default_args,
    description='Schedule a Job on 15 of each month',
)

weekend_check = DayOfWeekSensor(
    task_id='weekday_check_task',
    week_day={'Monday', 'Tuesday', 'Wednesday', 'Thursday', 'Friday'},
    mode='reschedule',
    dag=dag,
)

op_1 = YourOperator(task_id='op1_task', dag=dag)

weekend_check >> op_1
Note: If you are running airflow<2.0.0 you will need to change the import to:
from airflow.contrib.sensors.weekday_sensor import DayOfWeekSensor
The answer posted by Elad works pretty well. I came up with another solution that also works.
I scheduled the job to run on the 15th, 16th and 17th of the month, and added a condition so that the downstream tasks run on the 15th only if it's a weekday, and on the 16th or 17th only if it's a Monday.
To achieve that, I added a BranchPythonOperator:
from datetime import datetime

from airflow.operators.python_operator import BranchPythonOperator

def _conditinal_task_initiator(**kwargs):
    execution_date = kwargs['execution_date']
    if int(datetime.strftime(execution_date, '%d')) == 15 and execution_date.weekday() < 5:
        return 'dummy_task_run_cmo_longit'
    elif int(datetime.strftime(execution_date, '%d')) == 16 and execution_date.weekday() == 0:
        return 'dummy_task_run_cmo_longit'
    elif int(datetime.strftime(execution_date, '%d')) == 17 and execution_date.weekday() == 0:
        return 'dummy_task_run_cmo_longit'
    else:
        return 'dummy_task_skip_cmo_longit'
with DAG(dag_id='NXS_FM_LOAD_CMO_CHOICE_LONGIT', default_args=default_args,
         schedule_interval='0 8 15-17 * *', catchup=False) as dag:
    conditinal_task_initiator = BranchPythonOperator(
        task_id='cond_task_check_day',
        provide_context=True,
        python_callable=_conditinal_task_initiator,
        do_xcom_push=False)
    dummy_task_run_cmo_longit = DummyOperator(
        task_id='dummy_task_run_cmo_longit')
    dummy_task_skip_cmo_longit = DummyOperator(
        task_id='dummy_task_skip_cmo_longit')

    conditinal_task_initiator >> [dummy_task_run_cmo_longit, dummy_task_skip_cmo_longit]
    dummy_task_run_cmo_longit >> <main tasks for execution>
With this approach, the DAG is triggered on the 15th, 16th and 17th of every month, but the functional tasks run only once per month.

Airflow execution interval - is it standard to define the time slice using execution_date and next_execution_date?

I am new to Airflow and have been reading around to try and code my DAGs to fit the standards for the tool. Thanks to plenty of warnings I got the gist around execution_date being at the start of a time slice. Where I have been less sure is in how to handle the end of the time slice.
If I'm running a daily task to process records based on a timestamp, and especially if I want this to be idempotent, then I will need to bound the time slice at both ends. The clearest way to do this is to use execution_date and next_execution_date variables, as in the example below:
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

dag = DAG(
    dag_id='time_slice_example',
    start_date=datetime(year=2021, month=2, day=1),
    schedule_interval='0 0 * * *'
)

copy_data = PostgresOperator(
    owner='airflow',
    task_id='copy_time_slice_data',
    sql='''
        INSERT INTO pipeline_tbl (id, text, other)
        SELECT id, text, other FROM daily_tbl
        WHERE data_ts >= '{{ execution_date }}'
          AND data_ts < '{{ next_execution_date }}'
    ''',
    postgres_conn_id='my_db_conn',
    dag=dag
)
(I've used a postgres query to illustrate the example but the same variables and principle would apply to any time slice operation)
So my question is whether this is normal? For all of the references to Airflow time slices, I have seen almost no examples of this approach. I can appreciate that it is arguably outside of the scope of Airflow itself, but I wanted to check that this is a standard approach and indeed that I'm not missing something more appropriate.
Yes, using the interval bounded by [execution_date,next_execution_date) is exactly the right behaviour.
In Airflow 2.1 or 2.2 we are investigating making this clearer, possibly by making these parameters be something like data_interval_start and data_interval_end
A bit more detail is happening on https://lists.apache.org/thread.html/rb4e004e68574e5fb77ee5b51f4fd5bfb4b3392d884c178bc767681bf%40%3Cdev.airflow.apache.org%3E
(Source: I am an Airflow core developer.)
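As a hedged sketch: on newer Airflow releases where the data_interval_start / data_interval_end template variables mentioned above are available, the same bounded query would look like this (reusing the asker's illustrative table and connection names):

copy_data = PostgresOperator(
    task_id='copy_time_slice_data',
    sql='''
        INSERT INTO pipeline_tbl (id, text, other)
        SELECT id, text, other FROM daily_tbl
        WHERE data_ts >= '{{ data_interval_start }}'
          AND data_ts < '{{ data_interval_end }}'
    ''',
    postgres_conn_id='my_db_conn',
    dag=dag,
)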

How to get airflow time zone information from macros?

Background
I am trying to run a DAG at 10pm America/New_York once every day from Monday to Friday. The script which the DAG runs takes as input the day it runs on, in its local time zone (10pm Mon-Fri). When I run this script as an Airflow DAG, the date is derived from the macro {{ ds_nodash }}.
The problem
When Airflow runs, by the time it's 10pm NY time, it's already the next day on UTC time. Since Airflow uses UTC, the execution date is one day ahead, so when my DAG uses the macro {{ ds_nodash }}, it is one day ahead.
Question:
Is there a way to get the time-zone adjusted date as a macro on airflow or is the only solution to my problem to adjust the macro myself?
According to the airflow documentation, the default variables (such as {{ ds_nodash }}) are in UTC. So, we need to convert them ourselves. It can go something like this:
# ...
local_ds_nodash = '{{ dag.timezone.convert(execution_date).strftime("%Y%m%d") }}'
# ...
create_file = BashOperator(
    task_id='create_file',
    bash_command=f'touch {local_ds_nodash}.txt'
)
I suspect you may be mixing up two different concepts in Airflow.
Actually, ds is not the date on which the tasks run; it is the start of the previous period. For example, if ds is 3/25/2019, the tasks would actually run on 3/26/2019 rather than 3/25. So if you want your tasks to run exactly on Mon-Fri, you need to set the schedule_interval as '0 22 * * 1-5'; the weekday setting should be '1-5' instead of '2-6'.
For the timezone, kaxil's answer has explained it very well. But if for some reason you cannot change the Airflow server configuration, what you need to do is adjust the schedule_interval to '0 2 * * 2-6'. Then the tasks will run as you expected.
Timezone feature is now available in Airflow. Have a look at https://airflow.readthedocs.io/en/1.10.2/timezone.html and adjust your config in airflow.cfg accordingly.
By default it is
[core]
default_timezone = utc
adjust it to
[core]
default_timezone = America/New_York
The execution_date will then contain TZ info as well which you would be able to extract. Try it in a test environment before you roll out to your production environment.
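The timezone support referenced above also allows timezone-aware DAGs, which avoids changing the server-wide default; a minimal sketch, with an illustrative DAG id and the asker's 10pm Mon-Fri schedule assumed:

from datetime import datetime

import pendulum
from airflow import DAG

local_tz = pendulum.timezone("America/New_York")

# A timezone-aware start_date makes the scheduler interpret the cron
# expression in New York time rather than UTC.
dag = DAG(
    dag_id='ny_evening_job',
    schedule_interval='0 22 * * 1-5',   # 10pm Mon-Fri, New York time
    start_date=datetime(2019, 1, 1, tzinfo=local_tz),
)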

What's the eloquent way to use the run date for a weekly Airflow job?

The problem: Airflow's execution_date is defined as the beginning of the period between runs. For example, a DAG on a weekly schedule would run on 2018-01-08T11:00:00, but the execution_date would be 2018-01-01T11:00:00.
The objective: I receive a file once a week, with the file date in the file's name. To identify the file, I'd like to use Airflow's execution_date. But I cannot seem to find a way to use the date of the run, versus using the earliest possible execution_date for a period.
Possible solutions:
Modify the execution_date on the fly. Something like: context['execution_date'] + timedelta(days=7). This seems hacky.
Run the DAG daily, insert a ShortCircuitOperator at the beginning of the DAG execution graph, exit if execution_date is not the expected date.
All suggestions or recommendations are welcome. It's a nuanced problem, but it's causing some issues with my ETL pipeline.
Another possible solution?
Have the DAG run once a week, just after you "think" the file will arrive. Parse the names of the files in the landing area, which will give you a bunch of dates. Check which of these dates falls between execution_date and execution_date + schedule_interval (or next_execution_date if you're using an Airflow version >= 1.8), then ingest the file(s) which match.
I think using execution_date + timedelta(days=7) is a bit hacky; instead use execution_date + schedule_interval, so that if the interval changes there shouldn't be any issues (I do this for one of my DAGs). If you're using a newer Airflow version then you can use next_execution_date, which is better.
I'm using a macro for this issue.
This function (registered as a macro) can handle manual triggers, too.
import pendulum

def weekly_today(execution_date, run_id, years=0, months=0, days=0, fmt="%Y%m%d"):
    d = pendulum.instance(execution_date)
    # Only shift scheduled runs; manual triggers already carry the "real" date.
    if run_id.startswith('scheduled_'):
        d = d.add(days=7)
    return d.add(years=years, months=months, days=days).strftime(fmt)
This function should be added to DAG as user_defined_macros
from datetime import timedelta

from airflow import DAG
from airflow.utils import timezone

dag = DAG(
    dag_id='test',
    start_date=timezone.datetime(2019, 6, 24, 6),
    schedule_interval=timedelta(days=7),
    user_defined_macros={
        'weekly_today': weekly_today,
    },
)
And I needed to set the data range from 1 year ago to today.
Here is a sample macro usage:
from_macro = '{{ weekly_today(execution_date, run_id, years=-1) }}'
to_macro = '{{ weekly_today(execution_date, run_id) }}'
bad naming.. but works.

How can I retrieve the 'scheduled time' for catchup jobs in Airflow?

When building an Airflow dag, I typically specify a simple schedule to run periodically - I expect this is the most common use.
dag = DAG('my_dag',
          description='this is what it does',
          schedule_interval='0 12 * * *',
          start_date=datetime(2017, 10, 1),
          catchup=False)
I then need to use the 'date' as a parameter in my actual process, so I just check the current date.
date = datetime.date.today()
# do some date-sensitive stuff
operator = MyOperator(..., params=[date, ...])
My understanding is that setting catchup=True will have Airflow schedule my dag for every schedule interval between start_date and now (or end_date); e.g. every day.
How do I get the scheduled_date for use within my dag instance?
I think you mean the execution date here. You can use macros inside your operators; more detail can be found here: https://airflow.apache.org/code.html#macros. Airflow will fill the date in for you, so you don't need to generate it dynamically.
Inside an operator's templated field you can reference {{ ds }} in a string directly.
Outside of a templated field, for example in a PythonOperator, you first need provide_context=True and **kwargs as the last argument of your callable; then you can read kwargs['ds'].
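A minimal sketch of both usages, assuming the dag from the question and hypothetical task ids and callable:

from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator

# 1) Templated field: {{ ds }} renders as the scheduled date of each run,
#    including catchup/backfill runs.
templated_task = BashOperator(
    task_id='use_ds_in_template',
    bash_command='echo "scheduled date is {{ ds }}"',
    dag=dag,
)

# 2) Python callable: with provide_context=True the same value arrives in kwargs.
def process_for_date(**kwargs):
    scheduled_date = kwargs['ds']   # e.g. '2017-10-01'
    print(f'processing data for {scheduled_date}')

python_task = PythonOperator(
    task_id='use_ds_in_callable',
    python_callable=process_for_date,
    provide_context=True,
    dag=dag,
)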
