Background
I am trying to run a DAG at 10pm America/New_York once every day from Monday to Friday. The script the DAG runs takes as input the day it runs on, in its local time zone (10pm Mon-Fri). When I run this script as an Airflow DAG, the date is derived from the macro {{ ds_nodash }}.
The problem
When Airflow runs, by the time it's 10pm in New York it's already the next day in UTC. Since Airflow uses UTC, the execution date is one day ahead, so when my DAG uses the macro {{ ds_nodash }}, it is one day ahead.
Question:
Is there a way to get the time-zone adjusted date as a macro on airflow or is the only solution to my problem to adjust the macro myself?
According to the Airflow documentation, the default variables (such as {{ ds_nodash }}) are in UTC, so we need to convert them ourselves. It can go something like this:
# ...
local_ds_nodash = '{{ dag.timezone.convert(execution_date).strftime("%Y%m%d") }}'
# ...
create_file = BashOperator(
    task_id='create_file',
    bash_command=f'touch {local_ds_nodash}.txt',
)
I suspect you may be mixing up two different concepts in Airflow.
Actually, ds is not the date on which the tasks run; it is the start of the previous schedule period. For example, a run with a ds of 3/25/2019 would actually execute on 3/26/2019, not 3/25. So if you want your tasks to run exactly on Mon-Fri, you need to set schedule_interval to '0 22 * * 1-5'; the weekday field should be '1-5', not '2-6'.
For the timezone, kaxil's answer has explained it very well. But if for some reason you cannot change the Airflow server configuration, what you can do instead is adjust schedule_interval to '0 2 * * 2-6' (2am UTC Tue-Sat, which is 10pm Mon-Fri in New York while daylight saving time is in effect). The tasks will then run as you expect.
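For example, here is a minimal sketch of the first option, assuming Airflow 1.10+ with timezone support (the DAG id is a placeholder of mine, not from the question):

import pendulum
from datetime import datetime
from airflow import DAG

local_tz = pendulum.timezone('America/New_York')

dag = DAG(
    dag_id='nightly_ny',               # hypothetical name
    start_date=datetime(2019, 1, 1, tzinfo=local_tz),
    schedule_interval='0 22 * * 1-5',  # 10pm Mon-Fri, New York time
    catchup=False,
)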
Timezone support is now available in Airflow. Have a look at https://airflow.readthedocs.io/en/1.10.2/timezone.html and adjust your config in airflow.cfg accordingly.
By default it is
[core]
default_timezone = utc
adjust it to
[core]
default_timezone = America/New_York
The execution_date will then contain TZ info as well which you would be able to extract. Try it in a test environment before you roll out to your production environment.
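As a quick sketch of what extracting that looks like (my example, not from the original answer), a timezone-aware execution_date can be converted with pendulum:

# execution_date arrives as a timezone-aware pendulum datetime
def print_local_date(execution_date, **kwargs):
    # convert to the zone you care about, then format it
    local_dt = execution_date.in_timezone('America/New_York')
    print(local_dt.strftime('%Y%m%d'), local_dt.tzname())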
Related
Apache Airflow 1.10+ introduced native support for DST aware timezones.
This leads me to think (perhaps incorrectly) it should be possible to create 2 DAGs on the same Airflow scheduler that are scheduled like so:
Starts every day at 06:00 Pacific/Auckland time
Starts every day at 21:00 America/New_York time
Without the need to introduce tasks that "sleep" until the required start time. The documentation explicitly rules out the cron scheduler for DST aware scheduling but only explains how to set the DAGs to run every day in that timezone, which by default is midnight.
Previous questions on this topic have considered only using the cron scheduler or are based on pre-1.10 airflow which did not have the introduced native support for DST aware timezones.
In the "airflow.cfg" I updated the default_timezone to the system timezone. And then I tried to schedule the DAGs like so:
DAG('NZ_SOD',
    description='New Zealand Start of Day',
    start_date=datetime(2018, 12, 11, 06, 00, tzinfo=pendulum.timezone('Pacific/Auckland')),
    catchup=False)
And:
DAG('NAM_EOD',
    description='North Americas End of Day',
    start_date=datetime(2018, 12, 11, 21, 00, tzinfo=pendulum.timezone('America/New_York')),
    catchup=False)
But it seems that the "Time" part of the datetime object that is passed to start_date is not explicitly considered in Apache Airflow and creates unexpected behavior.
Does Airflow have any in built option to produce desired behavior or am I trying to use the wrong tool for the job?
The answer is yes, the cron schedule supports having DAGs run in DST aware timezones.
But there are a number of caveats so I have to assume the maintainers of Airflow do not have this as a supported use case. Firstly the documentation, as of the time of writing, is explicitly wrong when it states:
Cron schedules
In case you set a cron schedule, Airflow assumes you will always want to run at the exact same time. It will then ignore day light savings time. Thus, if you have a schedule that says run at end of interval every day at 08:00 GMT+1 it will always run end of interval 08:00 GMT+1, regardless if day light savings time is in place.
I've written this somewhat hacky code which lets you see how a schedule will work without needing a running Airflow instance (be careful to have Pendulum 1.x installed, and to use the matching documentation, if you run or edit this code):
import pendulum
from airflow import DAG
from datetime import timedelta

# Set up DAG
test_dag = DAG(
    dag_id='foo',
    start_date=pendulum.datetime(year=2019, month=4, day=4, tz='Pacific/Auckland'),
    schedule_interval='00 03 * * *',
    catchup=False
)

# Check initial schedule
execution_date = test_dag.start_date
for _ in range(7):
    next_execution_date = test_dag.following_schedule(execution_date)
    if next_execution_date <= execution_date:
        execution_date = test_dag.following_schedule(execution_date + timedelta(hours=2))
    else:
        execution_date = next_execution_date
    print('Execution Date:', execution_date)
This gives us a 7 day period over which New Zealand experiences DST:
Execution Date: 2019-04-03 14:00:00+00:00
Execution Date: 2019-04-04 14:00:00+00:00
Execution Date: 2019-04-05 14:00:00+00:00
Execution Date: 2019-04-06 14:00:00+00:00
Execution Date: 2019-04-07 15:00:00+00:00
Execution Date: 2019-04-08 15:00:00+00:00
Execution Date: 2019-04-09 15:00:00+00:00
As we can see, DST is observed using the cron schedule; furthermore, if you edit my code to remove the cron schedule you can see that DST is not observed.
But be warned: even with the cron schedule observing DST, you may still have an off-by-one-day error on the day of the DST change, because Airflow provides the previous date and not the current date (e.g. Sunday on the calendar, but in Airflow the execution date is Saturday). It doesn't look to me like this is accounted for in the following_schedule logic.
Finally, as @dlamblin points out, the variables that Airflow provides to the jobs, either via templated strings or via provide_context=True for Python callables, will be wrong if the local execution date for the DAG is not the same as the UTC execution date. This can be observed in TaskInstance.get_template_context, which uses self.execution_date without modifying it to be in local time. And we can see in TaskInstance.__init__ that self.execution_date is converted to UTC.
The way I handle this is to derive a variable I call local_cal_date by doing what @dlamblin suggests and using the convert method from Pendulum. Edit this code to fit your specific needs (I actually use it in a wrapper around all my Python callables so that they all receive local_cal_date):
import datetime

def foo(*args, dag, execution_date, **kwargs):
    # Derive the local execution datetime from the dag and execution_date
    # that Airflow passes to Python callables where provide_context is True
    airflow_timezone = dag.timezone
    local_execution_datetime = airflow_timezone.convert(execution_date)
    # I then add 1 day to make it the calendar day
    # and not the execution date which Airflow provides
    local_cal_datetime = local_execution_datetime + datetime.timedelta(days=1)
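As a usage sketch (the task id is a placeholder of mine), the callable receives dag and execution_date from the context when provide_context=True:

from airflow.operators.python_operator import PythonOperator

task = PythonOperator(
    task_id='derive_local_cal_date',  # hypothetical id
    python_callable=foo,
    provide_context=True,  # Airflow 1.x: injects dag, execution_date, etc.
    dag=dag,               # assumes a DAG object named dag
)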
Update: For templated strings I found the best approach was to create custom operators that inject the custom variables into the context before the template is rendered. The problem I found with using custom macros is that they don't expand other macros automatically, which means you have to do a bunch of extra work to render them in a useful way. So in a custom operators module I have something similar to this code:
# Standard Library
import datetime

# Third Party Libraries
import airflow.operators.email_operator
import airflow.operators.python_operator
import airflow.operators.bash_operator


class CustomTemplateVarsMixin:
    def render_template(self, attr, content, context):
        # Do calculations
        airflow_execution_datetime = context['execution_date']
        airflow_timezone = context['dag'].timezone
        local_execution_datetime = airflow_timezone.convert(airflow_execution_datetime)
        local_cal_datetime = local_execution_datetime + datetime.timedelta(days=1)
        # Add to the context
        context['local_cal_datetime'] = local_cal_datetime
        # Run the normal method
        return super().render_template(attr, content, context)


class BashOperator(CustomTemplateVarsMixin, airflow.operators.bash_operator.BashOperator):
    pass


class EmailOperator(CustomTemplateVarsMixin, airflow.operators.email_operator.EmailOperator):
    pass


class PythonOperator(CustomTemplateVarsMixin, airflow.operators.python_operator.PythonOperator):
    pass


class BranchPythonOperator(CustomTemplateVarsMixin, airflow.operators.python_operator.BranchPythonOperator):
    pass
First a few nits:
Don't specify datetimes with a leading 0 like 06 am, because if you edit it to 9am in a rush you're going to find that 09 is not a valid octal number and the whole DAG file will stop parsing.
You might as well use the pendulum notation: start_date=pendulum.datetime(2018, 12, 11, 6, 0, tz='Pacific/Auckland'),
Yeah, timezones in Airflow got a little confusing. The docs say that a cron schedule is always in that timezone's offset. This isn't as clear as it should be, because offsets vary. Let's assume you set the default config timezone like this:
[core]
default_timezone = America/New_York
With a start_date like:
start_date = datetime(2018, 12, 11, 6, 0),
you get an offset from UTC of -18000, or -5h. With:
start_date = datetime(2018, 4, 11, 6, 0),
you get an offset from UTC of -14400, or -4h.
Whereas the one in the second bullet point gives an offset of 46800, or 13h, while in April in Auckland it is 43200, or 12h. These get applied to the schedule_interval for the DAG, if I recall correctly.
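You can verify those offsets with pendulum (a quick sketch of mine, using the same constructor style as the code above; .offset is the UTC offset in seconds):

import pendulum

# winter vs. summer offsets, in seconds from UTC
print(pendulum.datetime(2018, 12, 11, 6, 0, tz='America/New_York').offset)  # -18000 (-5h)
print(pendulum.datetime(2018, 4, 11, 6, 0, tz='America/New_York').offset)   # -14400 (-4h)
print(pendulum.datetime(2018, 12, 11, 6, 0, tz='Pacific/Auckland').offset)  #  46800 (+13h)
print(pendulum.datetime(2018, 4, 11, 6, 0, tz='Pacific/Auckland').offset)   #  43200 (+12h)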
What the docs seem to say is that your schedule_interval crontab string will be interpreted forever in that same offset. So, a 0 5 * * * is going to run at 5 or 6 am if you started in December in NYC, or at 5 or 4 am if you started in April in NYC. Uh. I think that's right. I am also confused by this.
This isn't avoided by leaving the default at utc. No, not if you use the start_date as you've shown and picked zones with varying offsets to utc.
Now… the second issue: time of day. The start date is used as the earliest valid start of an interval. A time of day being in there is great, but the schedule defaults to timedelta(days=1). I thought it was @daily, which also means 0 0 * * *, and gives you fun results like: starting from a start date of 6am on the 11th of December, your first full midnight-to-midnight interval will close at midnight on the 13th of December, so the first run gets passed an execution_date of midnight on the 12th of December. But I would expect that with a timedelta being applied to the start_date it would instead start at 6am on the 12th of December, with the same time the day before passed in as the execution_date. However, I've not seen it work out that way, which does make me think that it might be using only the date part of the datetime for start_date somewhere.
As documented, this passed-in execution_date (and all macro dates) are going to be in UTC (so midnight or 6am in your start_date timezone offset, converted to UTC). At least they have the tz attached, so you can use convert on them if you must.
In Airflow, I'd like a job to run at specific time each day in a non-UTC timezone. How can I go about scheduling this?
The problem is that once daylight savings time is triggered, my job will either be running an hour too soon or an hour too late. In the Airflow docs, it seems like this is a known issue:
In case you set a cron schedule, Airflow assumes you will always want to run at the exact same time. It will then ignore day light savings time. Thus, if you have a schedule that says run at end of interval every day at 08:00 GMT+1 it will always run end of interval 08:00 GMT+1, regardless if day light savings time is in place.
Has anyone else run into this issue? Is there a work around? Surely the best practice cannot be to alter all the scheduled times after Daylight Savings Time occurs?
Thanks.
Starting with Airflow 1.10, time-zone aware DAGs can be defined using time-zone aware datetime objects to specify start_date. For Airflow to schedule DAG runs always at the same time (regardless of a possible daylight-saving-time switch), use cron expressions to specify schedule_interval. To make Airflow schedule DAG runs with fixed intervals (regardless of a possible daylight-saving-time switch), use datetime.timedelta() to specify schedule_interval.
For example, consider the following code that, first, uses a cron expression to schedule two consecutive DAG runs, and then uses a fixed interval to do the same.
import pendulum
from airflow import DAG
from datetime import datetime, timedelta

START_DATE = datetime(
    year=2019,
    month=10,
    day=25,
    hour=8,
    minute=0,
    tzinfo=pendulum.timezone('Europe/Kiev'),
)


def gen_execution_dates(start_date, schedule_interval):
    dag = DAG(
        dag_id='id', start_date=start_date, schedule_interval=schedule_interval
    )
    execution_date = dag.start_date
    for i in range(1, 3):
        execution_date = dag.following_schedule(execution_date)
        print(
            f'[Run {i}: Execution Date for "{schedule_interval}"]:',
            dag.timezone.convert(execution_date),
        )


gen_execution_dates(START_DATE, '0 8 * * *')
gen_execution_dates(START_DATE, timedelta(days=1))
Running the code produces the following output:
[Run 1: Execution Date for "0 8 * * *"]: 2019-10-26 08:00:00+03:00
[Run 2: Execution Date for "0 8 * * *"]: 2019-10-27 08:00:00+02:00
[Run 1: Execution Date for "1 day, 0:00:00"]: 2019-10-26 08:00:00+03:00
[Run 2: Execution Date for "1 day, 0:00:00"]: 2019-10-27 07:00:00+02:00
For the zone [Europe/Kiev], the daylight saving time of 2019 ends on 2019-10-27 at 03:00:00+03:00. That is, between Run 1 and Run 2 in our example.
The first two output lines show that for the DAG runs scheduled with a cron expression the first run and second run are both scheduled for 08:00 (although, in different timezones: Eastern European Summer Time (EEST) and Eastern European Time (EET) respectively).
The last two output lines show that for the DAG runs scheduled with a fixed interval the first run is scheduled for 08:00 (EEST), and the second run is scheduled exactly 1 day (24 hours) later, which is at 07:00 (EET) due to the daylight-saving-time switch.
The problem: Airflow's execution_date is defined as the beginning of the period between runs. For example, a DAG on a weekly schedule would run on 2018-01-08T11:00:00, but the execution_date would be 2018-01-01T11:00:00.
The objective: I receive a file once a week, with the file date in the file's name. To identify the file, I'd like to use Airflow's execution_date. But I cannot seem to find a way to use the date of the run, versus using the earliest possible execution_date for a period.
Possible solutions:
Modify the execution_date on the fly. Something like: context['execution_date'] + timedelta(days=7). This seems hacky.
Run the DAG daily, insert a ShortCircuitOperator at the beginning of the DAG execution graph, exit if execution_date is not the expected date.
All suggestions or recommendations are welcome. It's a nuanced problem, but it's causing some issues with my ETL pipeline.
Another possible solution?
Have the DAG run once a week, just after you "think" the file will arrive. Parse the names of the files in the landing area, which will give you a bunch of dates. Check which of these dates falls between the execution_date and execution_date + schedule_interval (or use next_execution_date if you're on an Airflow version >= 1.8). Then ingest the file(s) that match.
I think using execution_date + timedelta(days=7) is a bit hacky; instead use execution_date + schedule_interval, that way if the interval changes there shouldn't be any issues (I do this for one of my DAGs). If you're using a newer Airflow version you can use next_execution_date, which is better.
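A hedged sketch of that date check (the helper name is mine, not from the answer):

# keep only the file dates that fall inside the current schedule period
def files_for_period(file_dates, execution_date, next_execution_date):
    return [d for d in file_dates
            if execution_date <= d < next_execution_date]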
I'm using a macro for this issue.
This function (used as a macro) can handle manual triggers, too.
import pendulum

def weekly_today(execution_date, run_id, years=0, months=0, days=0, fmt="%Y%m%d"):
    d = pendulum.instance(execution_date)
    if run_id.startswith('scheduled_'):
        d = d.add(days=7)
    return d.add(years=years, months=months, days=days).strftime(fmt)
This function should be added to the DAG via user_defined_macros:
from datetime import timedelta
from airflow import DAG
from airflow.utils import timezone

dag = DAG(
    dag_id='test',
    start_date=timezone.datetime(2019, 6, 24, 6),
    schedule_interval=timedelta(days=7),
    user_defined_macros={
        'weekly_today': weekly_today,
    },
)
And I needed to set a date range from 1 year ago to today.
Here is a sample of the macro usage:
from_macro = '{{ weekly_today(execution_date, run_id, years=-1) }}'
to_macro = '{{ weekly_today(execution_date, run_id) }}'
Bad naming, but it works.
In Airflow, I would like to run a DAG each Monday at 8am (the execution_date should of course be "the current Monday, 8am"). The relevant parameters to set up for this workflow are:
start_date : "2018-03-19"
schedule_interval : "0 8 * * MON"
I expect to see a DAG run every Monday at 8am, the first one running on 19-03-2018 at 8am with execution_date = 2018-03-19-08-00-00, and so on each Monday.
However, that's not what happens: the DAG is not started on 19/03/18 at 8am. The real behaviour is explained here, for example: https://stackoverflow.com/a/39620901/1510109 or https://stackoverflow.com/a/48213964/1510109
The behaviour is: at each end of the interval (weekly in my case) the DAG is run with execution_date = the beginning of the interval (i.e. the previous week). This behaviour is apparently motivated by an "ETL way of thinking" (see the links above). But it's absolutely not what I want.
How can I get my DAG to run each Monday at 08:00 with execution_date = trigger_date = now (= the current Monday at 8am)?
Thanks
Take a quick look at my answer with start times and execution_date examples.
You want to run every Monday at 8am.
So this part is going to stay the same:
schedule_interval: '0 8 * * MON',
You want its first run to happen on 2018-03-19. Since the first run occurs at the end of the first full schedule period after the start date, you should change your start date to:
start_date: datetime(2018, 3, 12),
You will have to live with the fact that Airflow will name your DagRuns with the start of each period and pass in macros based on the execution_date set to the start of the interval period. Adjust your logic accordingly.
Your first run will start after 2018-03-19T08:00:00.0Z and the execution_date, every other macro that depends on it, and name of the DagRun will be 2018-03-12T08:00:00.0Z
So long as you understand what to expect from the execution_date, and you don't try to base your time off of datetime.now(), your DAGs will be idempotent in operation. Feel free to make a new variable like my_execution_date = execution_date + datetime.timedelta(7) within any PythonOperator or custom operator (you get execution_date from the task's context), use template statements like {{ (execution_date + macros.timedelta(7)).strftime('%Y%m%d') }} or {{ macros.ds_add(ds, 7) }}, or use next_execution_date.
You can even add dag-level user_defined_macros like {'dt': lambda d: d + datetime.timedelta(days=7)} to enable {{ dt(execution_date) }}. And recently user_defined_filters were added, like {'dt': lambda d: d + datetime.timedelta(days=7)}, enabling {{ execution_date | dt }}. The next_ds and next_execution_date would be easier for your purposes. A registration sketch follows.
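A sketch of registering both on a DAG (the DAG id is a placeholder; the macro/filter bodies are as in the answer):

import datetime
from airflow import DAG

dag = DAG(
    dag_id='weekly_example',  # hypothetical id
    start_date=datetime.datetime(2018, 3, 12),
    schedule_interval='0 8 * * MON',
    user_defined_macros={'dt': lambda d: d + datetime.timedelta(days=7)},
    user_defined_filters={'dt': lambda d: d + datetime.timedelta(days=7)},
)
# templates can then use {{ dt(execution_date) }} or {{ execution_date | dt }}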
While thinking about templating, you may as well read up on the built-in stuff out there: http://jinja.pocoo.org/docs/2.10/templates/#builtin-filters
That is how Airflow behaves: it always runs once the interval has completed. Detailed behaviour here and in the Airflow FAQ.
But in order to make it run for the current week, what we can do is manipulate the execution_date of the DAG. That may take the form of adding 7 days to a datetime object (for a weekly schedule), or of using the {{ next_execution_date }} macro.
Agreed, this is only possible if dates are somehow used in your DAG or in the dependencies it triggers.
Just to be clear again: the DAG still runs as per its normal behaviour. The only thing we are trying to do is manipulate the date in the program/DAG.
args = {
    # ...
    'start_date': datetime.datetime(2018, 3, 18),
}

dag = DAG(
    # ...
    schedule_interval='@weekly',
)

# The DAG would run on 3/25/2018 for the week of 18th March,
# but let's say we manipulate the date here, using the
# {{ next_execution_date }} macro or by adding 7 days.
# So effectively we run with date 3/25/2018 instead of
# 3/18/2018 for the week of 18th March.
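For instance, a hedged sketch of a templated field that uses the next schedule boundary instead of ds (the task id and command are placeholders of mine):

from airflow.operators.bash_operator import BashOperator

report = BashOperator(
    task_id='weekly_report',  # hypothetical id
    # next_execution_date is the start of the *current* week
    # for a weekly schedule
    bash_command='echo {{ next_execution_date }}',
    dag=dag,
)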
For me, I solved it this way:
{{ ds if dag_run.external_trigger or dag_run.is_backfill else macros.ds_add(ds, 1) }}
If DAG was run by external trigger we shouldn't change ds.
If DAG was run by backfilling we shouldn't change ds.
If DAG was scheduled we use macros to increment it by one day.
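As a usage sketch (the script name is hypothetical), the expression drops into any templated field:

from airflow.operators.bash_operator import BashOperator

process = BashOperator(
    task_id='process_day',
    bash_command=(
        'process_data.sh --date '  # hypothetical script
        '{{ ds if dag_run.external_trigger or dag_run.is_backfill '
        'else macros.ds_add(ds, 1) }}'
    ),
    dag=dag,
)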
When building an Airflow dag, I typically specify a simple schedule to run periodically - I expect this is the most common use.
dag = DAG('my_dag',
          description='this is what it does',
          schedule_interval='0 12 * * *',
          start_date=datetime(2017, 10, 1),
          catchup=False)
I then need to use the 'date' as a parameter in my actual process, so I just check the current date.
date = datetime.date.today()
# do some date-sensitive stuff
operator = MyOperator(..., params=[date, ...])
My understanding is that setting catchup=True will have Airflow schedule my dag for every schedule interval between start_date and now (or end_date); e.g. every day.
How do I get the scheduled_date for use within my dag instance?
I think you mean the execution date here. You can use macros inside your operators; more detail can be found here: https://airflow.apache.org/code.html#macros. Airflow will pass the scheduled date in for you, so you don't need to generate your date dynamically.
Inside an operator, you can reference {{ ds }} directly in a templated string.
Outside of a templated string, for example in a PythonOperator, you will need provide_context=True first, then accept **kwargs as the last argument to your function, and then you can read kwargs['ds'].
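A minimal sketch combining both (task ids are placeholders of mine):

from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator

# templated string: {{ ds }} renders to the execution date, e.g. 2017-10-01
bash_task = BashOperator(
    task_id='print_ds',
    bash_command='echo {{ ds }}',
    dag=dag,
)

def my_callable(**kwargs):
    # with provide_context=True, Airflow passes 'ds' in the keyword arguments
    print('execution date:', kwargs['ds'])

python_task = PythonOperator(
    task_id='print_ds_py',
    python_callable=my_callable,
    provide_context=True,
    dag=dag,
)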