Airflow - how to get all the future run dates

I am working on scheduling an Airflow job. However, to verify that I have scheduled the job correctly, I need to see when it will run in the future.
Airflow has the following command, which gives me the next run. However, that's not sufficient for some use cases. For example, I have scheduled a job to run every other Friday. How do I verify that?
airflow next_execution <dag_id>
Is there a way I can get all the future dates when this DAG will run, or at least a couple of them?

While most processes use croniter, if you have access to your installation it's always best to get the information from the "source" via existing interfaces:
from airflow import models
from datetime import datetime, timedelta

# load the DAGs from your Airflow installation
dag_bag = models.DagBag()

dag_id = "dag_name"
dag = dag_bag.get_dag(dag_id)

# all scheduled run dates within the next three weeks
now = datetime.now()
until = now + timedelta(days=21)
runs = dag.get_run_dates(start_date=now, end_date=until)
print(runs)

Under the hood, Airflow uses croniter. Following the example in the croniter documentation, this could work as follows (as an example, consider that the DAG runs at 12 pm every Friday and that our base date is the 20th of August).
from croniter import croniter
from datetime import datetime
# Specify current date
base = datetime(2020, 8, 20, 0, 0)
# Set croniter
iter = croniter('0 12 * * 5', base)
# Get next execution
iter.get_next(datetime)
# returns: datetime.datetime(2020, 8, 21, 12, 0)
where you can specify base as the latest execution date of your DAG (dag.latest_execution_date). You can get its following executions by calling iter.get_next(datetime) n times.
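For example, a short sketch that lists the next few run dates by calling get_next repeatedly (same cron expression and base date as above):

from croniter import croniter
from datetime import datetime

base = datetime(2020, 8, 20, 0, 0)        # e.g. dag.latest_execution_date
cron = croniter('0 12 * * 5', base)

# each call to get_next advances the iterator by one schedule step
next_runs = [cron.get_next(datetime) for _ in range(5)]
for run in next_runs:
    print(run)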

Related

modify dag runs when triggered

I was curious if there's a way to customise the dag runs.
So I'm currently checking for updates to another table, which gets updated manually by someone, and once that's been updated I would run my DAG for the month.
At the moment I have created a branch operator that compares the dates of the two tables, but is there a way to run the DAG (compare the two dates) every day until there is a change, and then not run it for the remainder of the month?
For example,
Table A (that is updated manually) has YYYYMM as 202209 and Table B also has YYYYMM as 202209.
Atm, my branch operator compares the two YYYYMM values and points to a dummy end operator when they are the same. However, when Table A has been updated to 202210, there's a difference in the two YYYYMM values, so another task runs and overwrites Table B.
It all works, but this runs the DAG every day even though Table A only gets updated once a month, at a random point within the month. So is there a way to stop the DAG from being triggered for the remaining days of the month after the task has run?
Hope this is clear.
If you were using data stored on S3 there would be an easy solution starting from version 2.4: data-aware scheduling.
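A minimal sketch of that data-aware scheduling approach, assuming Airflow 2.4+ (the S3 URI and dag/task ids here are illustrative): the consumer DAG is scheduled only when a producer task that declares the Dataset as an outlet succeeds.

from datetime import datetime

from airflow import DAG, Dataset
from airflow.operators.empty import EmptyOperator

table_a = Dataset("s3://my-bucket/table_a/")  # hypothetical dataset URI

with DAG(dag_id="table_a_producer",
         start_date=datetime(2022, 10, 1),
         schedule="@daily",
         catchup=False):
    # any task that updates the data declares the Dataset as an outlet
    EmptyOperator(task_id="update_table_a", outlets=[table_a])

with DAG(dag_id="table_b_consumer",
         start_date=datetime(2022, 10, 1),
         schedule=[table_a],  # runs whenever table_a is updated
         catchup=False):
    EmptyOperator(task_id="rebuild_table_b")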
But you're probably not, so there is another option.
A DAG in Airflow is a DAG object assigned to the global scope. This allows for dynamic creation of DAGs, and it implies that each DAG file is re-parsed on a certain interval. A very good description with examples is here.
The second thing you need to use is Airflow Variables.
So the concept is as follows:
1. Create a variable in Airflow named dag_run that will hold the month in which the DAG last ran successfully.
2. Create a Python file with a function that creates a DAG object based on input parameters.
3. In the same file, use conditional statements that set the 'schedule' param differently depending on whether the DAG has already run for the current month.
4. In your DAG, in the branch that executes when the data has changed, set the dag_run variable to the current month's value like so: Variable.set(key='dag_run', value=datetime.now().month)
Python code for steps 2 and 3:
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime, timedelta
from airflow.models import Variable


# function that creates a dag based on the input parameters
def create_dag(dag_id, schedule, default_args):

    def hello_world_py(*args):
        print('Hello World')
        print('This is DAG: {}'.format(str(dag_id)))

    dag = DAG(dag_id,
              schedule_interval=schedule,
              default_args=default_args)

    with dag:
        t1 = PythonOperator(
            task_id='hello_world',
            python_callable=hello_world_py)

    return dag


# run some checks
current_month = datetime.now().month
dag_run_month = int(Variable.get('dag_run'))

if current_month == dag_run_month:
    # the dag already ran this month: keep the schedule off
    schedule = None
    dag_id = "Database_insync"
elif current_month != dag_run_month:
    # not yet run this month: keep the schedule on
    schedule = "30 * * * *"
    dag_id = "Database_notsynced"

# watch out for start_date: if you leave it in the past,
# airflow will execute the past missed schedules
default_args = {'owner': 'airflow',
                'start_date': datetime.now() - timedelta(minutes=15)}

globals()[dag_id] = create_dag(dag_id, schedule, default_args)
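For step 4, a minimal sketch of the task you would put in the "data has changed" branch of the generated DAG (the task name is illustrative); it records the current month in the dag_run Variable so the checks above switch the schedule off on the next parse:

from datetime import datetime

from airflow.models import Variable
from airflow.operators.python import PythonOperator


def mark_month_as_synced():
    # remember the month of the successful sync; the conditional checks
    # above read this Variable the next time the DAG file is parsed
    Variable.set(key='dag_run', value=datetime.now().month)


# add this task inside the `with dag:` block of the factory above,
# downstream of the branch that runs when Table A has changed
mark_synced = PythonOperator(
    task_id='mark_month_as_synced',
    python_callable=mark_month_as_synced)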

Airflow - How to properly define the time my DAG will execute every day?

I have two DAGs. The first one extracts data from one database to another. I want it to run every day at 4 AM, so that's how I defined my params:
Note: The code has 7am instead of 4 because Airflow is in UTC time and my time is GMT-3.
start_date=datetime(2022, 9, 6),
schedule_interval="0 7 * * *",
catchup=False
) as dag:
But, when I check Airflow's UI, the DAG time is shown like this:
(screenshot of the Airflow UI showing the DAG's next run and its Data Interval)
I have never seen this Data Interval before and have no idea what it means. Why does it start today at 21:00 (9 PM), and why is this the next run for this DAG?
How do I set my DAG to run the next day (Sep 6, as I'm posting on the 5th) at 4 AM?
Thank you!

Does Apache Airflow 1.10+ scheduler support running 2 DAGs in different DST aware time-zones at specific times?

Apache Airflow 1.10+ introduced native support for DST aware timezones.
This leads me to think (perhaps incorrectly) it should be possible to create 2 DAGs on the same Airflow scheduler that are scheduled like so:
Starts every day at 06:00 Pacific/Auckland time
Starts every day at 21:00 America/New_York time
Without the need to introduce tasks that "sleep" until the required start time. The documentation explicitly rules out the cron scheduler for DST aware scheduling but only explains how to set the DAGs to run every day in that timezone, which by default is midnight.
Previous questions on this topic have considered only using the cron scheduler or are based on pre-1.10 airflow which did not have the introduced native support for DST aware timezones.
In the "airflow.cfg" I updated the default_timezone to the system timezone. And then I tried to schedule the DAGs like so:
DAG('NZ_SOD',
description='New Zealand Start of Day',
start_date=datetime(2018, 12, 11, 06, 00, tzinfo=pendulum.timezone('Pacific/Auckland')),
catchup=False)
And:
DAG('NAM_EOD',
description='North Americas End of Day',
start_date=datetime(2018, 12, 11, 21, 00, tzinfo=pendulum.timezone('America/New_York')),
catchup=False)
But it seems that the "Time" part of the datetime object that is passed to start_date is not explicitly considered in Apache Airflow and creates unexpected behavior.
Does Airflow have any in built option to produce desired behavior or am I trying to use the wrong tool for the job?
The answer is yes, the cron schedule supports having DAGs run in DST aware timezones.
But there are a number of caveats so I have to assume the maintainers of Airflow do not have this as a supported use case. Firstly the documentation, as of the time of writing, is explicitly wrong when it states:
Cron schedules
In case you set a cron schedule, Airflow assumes you will always want to run at the exact same time. It will then ignore day light savings time. Thus, if you have a schedule that says run at end of interval every day at 08:00 GMT+1 it will always run end of interval 08:00 GMT+1, regardless if day light savings time is in place.
I've written this somewhat hacky code which lets you see how a schedule will work without the need for a running Airflow instance (be careful that you have Pendulum 1.x installed and are using the correct documentation if you run or edit this code):
import pendulum
from airflow import DAG
from datetime import timedelta

# Set up DAG
test_dag = DAG(
    dag_id='foo',
    start_date=pendulum.datetime(year=2019, month=4, day=4, tz='Pacific/Auckland'),
    schedule_interval='00 03 * * *',
    catchup=False
)

# Check initial schedule
execution_date = test_dag.start_date
for _ in range(7):
    next_execution_date = test_dag.following_schedule(execution_date)
    if next_execution_date <= execution_date:
        execution_date = test_dag.following_schedule(execution_date + timedelta(hours=2))
    else:
        execution_date = next_execution_date
    print('Execution Date:', execution_date)
This gives us a 7 day period over which New Zealand experiences DST:
Execution Date: 2019-04-03 14:00:00+00:00
Execution Date: 2019-04-04 14:00:00+00:00
Execution Date: 2019-04-05 14:00:00+00:00
Execution Date: 2019-04-06 14:00:00+00:00
Execution Date: 2019-04-07 15:00:00+00:00
Execution Date: 2019-04-08 15:00:00+00:00
Execution Date: 2019-04-09 15:00:00+00:00
As we can see, DST is observed using the cron schedule; furthermore, if you edit my code to remove the cron schedule you can see that DST is not observed.
But be warned: even with the cron schedule observing DST you may still have an off-by-one-day error on the day of the DST change, because Airflow is providing the previous date and not the current date (e.g. Sunday on the calendar, but in Airflow the execution date is Saturday). It doesn't look to me like this is accounted for in the following_schedule logic.
Finally, as @dlamblin points out, the variables that Airflow provides to the jobs, either via templated strings or provide_context=True for Python callables, will be wrong if the local execution date for the DAG is not the same as the UTC execution date. This can be observed in TaskInstance.get_template_context, which uses self.execution_date without modifying it to be in local time. And we can see in TaskInstance.__init__ that self.execution_date is converted to UTC.
The way I handle this is to derive a variable I call local_cal_date by doing what @dlamblin suggests and using the convert method from Pendulum. Edit this code to fit your specific needs (I actually use it in a wrapper around all my Python callables so that they all receive local_cal_date):
import datetime


def foo(*args, dag, execution_date, **kwargs):
    # Derive local execution datetime from the dag and execution_date that
    # airflow passes to python callables where provide_context is set to True
    airflow_timezone = dag.timezone
    local_execution_datetime = airflow_timezone.convert(execution_date)
    # I then add 1 day to make it the calendar day
    # and not the execution date which Airflow provides
    local_cal_datetime = local_execution_datetime + datetime.timedelta(days=1)
Update: For templated strings I found the best approach was to create custom operators that inject the custom variables into the context before the template is rendered. The problem I found with using custom macros is that they don't expand other macros automatically, which means you have to do a bunch of extra work to render them in a useful way. So in a custom operators module I have something similar to this code:
# Standard Library
import datetime

# Third Party Libraries
import airflow.operators.bash_operator
import airflow.operators.email_operator
import airflow.operators.python_operator


class CustomTemplateVarsMixin:
    def render_template(self, attr, content, context):
        # Do calculations
        airflow_execution_datetime = context['execution_date']
        airflow_timezone = context['dag'].timezone
        local_execution_datetime = airflow_timezone.convert(airflow_execution_datetime)
        local_cal_datetime = local_execution_datetime + datetime.timedelta(days=1)
        # Add to the context
        context['local_cal_datetime'] = local_cal_datetime
        # Run the normal method
        return super().render_template(attr, content, context)


class BashOperator(CustomTemplateVarsMixin, airflow.operators.bash_operator.BashOperator):
    pass


class EmailOperator(CustomTemplateVarsMixin, airflow.operators.email_operator.EmailOperator):
    pass


class PythonOperator(CustomTemplateVarsMixin, airflow.operators.python_operator.PythonOperator):
    pass


class BranchPythonOperator(CustomTemplateVarsMixin, airflow.operators.python_operator.BranchPythonOperator):
    pass
First a few nits:
Don't specify datetimes with a leading 0 like 06 am because if you edit it to 9am in a rush, you're going to find out that that's not a valid octal number and the whole DAG file will stop parsing.
You might as well use the pendulum notation: start_date=pendulum.datetime(2018, 12, 11, 6, 0, tz='Pacific/Auckland'),
Yeah, timezones in Airflow got a little confusing. The docs say that a cron schedule is always in that timezone's offset. This isn't as clear as it should be because offsets vary. Let's assume you set the default config timezone like this:
[core]
default_timezone = America/New_York
With a start_date like:
start_date = datetime(2018, 12, 11, 6, 0),
you get an offset from UTC of -18000 seconds, or -5h. With a start_date like:
start_date = datetime(2018, 4, 11, 6, 0),
you get an offset from UTC of -14400 seconds, or -4h.
Whereas the one in the second bullet point gives an offset of 46800 seconds or 13h, while in April in Auckland it is 43200 seconds or 12h. These get applied to the schedule_interval for the DAG, if I recall correctly.
What the docs seem to say is your schedule_interval crontab string will be interpreted forever in that same offset. So, a 0 5 * * * is going to run at 5 or 6 am if you started in December in NYC OR 5 or 4 am if you started in April in NYC. Uh. I think that's right. I am also confused by this.
This isn't avoided by leaving the default at utc. No, not if you use the start_date as you've shown and picked zones with varying offsets to utc.
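For reference, a quick sketch of where those offsets come from, using pendulum directly (this assumes pendulum 2.x and its DateTime.offset property, which reports the UTC offset in seconds):

import pendulum

print(pendulum.datetime(2018, 12, 11, 6, 0, tz='America/New_York').offset)   # -18000 (-5h, EST)
print(pendulum.datetime(2018, 4, 11, 6, 0, tz='America/New_York').offset)    # -14400 (-4h, EDT)
print(pendulum.datetime(2018, 12, 11, 6, 0, tz='Pacific/Auckland').offset)   # 46800 (+13h, NZDT)
print(pendulum.datetime(2018, 4, 11, 6, 0, tz='Pacific/Auckland').offset)    # 43200 (+12h, NZST)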
Now… the second issue: time of day. The start date is used as the earliest valid start of an interval. Having a time of day in there is great, but the schedule defaults to timedelta(days=1). I thought it was @daily, which also means 0 0 * * *, and gives you fun results like: starting from a start date of 6am on the 11th of December, your first full midnight-to-midnight interval will close at midnight on the 13th of December, thus the first run gets passed midnight of the 12th of December as the execution_date. But I would expect that with a timedelta being applied to the start_date it would instead start at 6am on the 12th of December, with the same time yesterday passed in as the execution_date. However, I've not seen it work out that way, which does make me think that it might be using only the date part of the datetime for start_date somewhere.
As documented, this passed-in execution_date (and all macro dates) are going to be in UTC (so midnight or 6am in your start_date timezone offset, converted to UTC). At least they have the tz attached, so you can use convert on them if you must.

Airflow: re execute the jobs of a DAG for the past n days on a daily basis

I have scheduled the execution of a DAG to run daily.
It works perfectly for one day.
However each day I would like to re-execute not only for the current day {{ ds }} but also for the previous n days (let's say n = 7).
For example, in the next execution scheduled to run on "2018-01-30" I would like Airflow not only to run the DAG using as execution date "2018-01-30", but also to re-run the DAGs for all the previous days from "2018-01-23" to "2018-01-30".
Is there an easy way to "invalidate" the previous execution so that a backfill is run automatically?
You can dynamically generate tasks in a loop and pass the offset to your operator.
Here is an example with the PythonOperator.
import airflow
from airflow.operators.python_operator import PythonOperator
from airflow.models import DAG
from datetime import timedelta

args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(2),
}

# dag_id is illustrative; schedule_interval belongs on the DAG, not in default_args
dag = DAG(dag_id='rerun_last_7_days',
          default_args=args,
          schedule_interval='0 10 * * *')


def check_trigger(execution_date, day_offset, **kwargs):
    # the date this task instance should actually process
    target_date = execution_date - timedelta(days=day_offset)
    # use target_date ...


for day_offset in range(1, 8):
    PythonOperator(
        task_id='task_offset_' + str(day_offset),
        python_callable=check_trigger,
        provide_context=True,
        dag=dag,
        op_kwargs={'day_offset': day_offset}
    )
Have you considered having the dag that runs once a day just run your task for the last 7 days? I imagine you’ll just have 7 tasks that each spawn a SubDAG with a different day offset from your execution date.
I think that will make debugging easier and history cleaner. I believe trying to backfill already executed tasks will involve deleting task instances or setting their states all to NONE. Then you’ll still have to trigger a backfill on those dag runs. It’ll be harder to track when things fail and just seems a bit messier.

How can I retrieve the 'scheduled time' for catchup jobs in Airflow?

When building an Airflow dag, I typically specify a simple schedule to run periodically - I expect this is the most common use.
dag = DAG('my_dag',
          description='this is what it does',
          schedule_interval='0 12 * * *',
          start_date=datetime(2017, 10, 1),
          catchup=False)
I then need to use the 'date' as a parameter in my actual process, so I just check the current date.
date = datetime.date.today()
# do some date-sensitive stuff
operator = MyOperator(..., params=[date, ...])
My understanding is that setting catchup=True will have Airflow schedule my dag for every schedule interval between start_date and now (or end_date); e.g. every day.
How do I get the scheduled_date for use within my dag instance?
I think you mean the execution date here. You can use macros inside your operators; more detail can be found here: https://airflow.apache.org/code.html#macros. Airflow will respect it, so you don't need to generate your date dynamically.
Inside an operator, you can use {{ ds }} in a templated string directly.
Outside of templated strings, for example in a PythonOperator callable, you need provide_context=True first, then accept **kwargs in your function, and then you can read kwargs['ds'].
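A minimal sketch of both approaches, written in the older 1.x style the question uses (task ids are illustrative):

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator


def process_for_date(**kwargs):
    # 'ds' is the scheduled execution date as YYYY-MM-DD,
    # which is also correct for catchup runs
    print('processing data for', kwargs['ds'])


with DAG('my_dag',
         schedule_interval='0 12 * * *',
         start_date=datetime(2017, 10, 1),
         catchup=True) as dag:

    # inside a templated field, {{ ds }} is rendered per run
    templated = BashOperator(
        task_id='echo_ds',
        bash_command='echo "running for {{ ds }}"')

    # in a Python callable, provide_context=True puts 'ds' into kwargs
    python_task = PythonOperator(
        task_id='process_for_date',
        python_callable=process_for_date,
        provide_context=True)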
