What start_date should I use for a manually triggered DAG? - airflow

Many of the Airflow example DAGs that have schedule_interval=None set a dynamic start date like airflow.utils.dates.days_ago(2) or datetime.utcnow(). However, the docs recommend against a dynamic start date:
We recommend against using dynamic values as start_date, especially datetime.now() as it can be quite confusing. The task is triggered once the period closes, and in theory an @hourly DAG would never get to an hour after now as now() moves along.
Is start date irrelevant for manually triggered dags? What is the best practice here?

I always try to set the start_date for manually triggered DAGs to the day I first ran it, so that I know, for future reference, when the DAG was first run.

If you have schedule_interval=None, I believe the start_date is irrelevant, as Airflow will not attempt to perform any backfilling. Just set it to anything; even a dynamic one shouldn't cause any hassle.

I ended up just setting start_date to 1970, Jan 1st (absurdly far in the past) so that Airflow never complains that the execution date is before the start date.

Related

How to get dag start and end time and dag run duration in airflow?

I am trying to get the DAG start time and end time to calculate the duration/elapsed time and show it in the Airflow UI.
I tried with Python datetime, but it looks like Airflow already records these things. I want to know if there is any way to leverage that.
I don't want to pull the details from the database, because that would complicate things. I want to keep it simple.

run 2 scripts in same DAG with different schedule

Let's say you have 2 scripts: Daily_summary.py and Weekly_summary.py.
You could create 2 separate DAGs with daily and weekly schedules, but is it possible to solve this with 1 DAG?
I've tried a daily schedule, and simply putting this at the bottom (simplified):
if datetime.today().strftime('%A') == 'Sunday':
    SSHOperator(run weekly_summary.py)
But the problem is that if the task is still running on Sunday at midnight, Airflow will terminate it, since the operator no longer exists in the DAG on Monday.
If I could somehow get the execution day's day-of-the-week, that would solve it, but the Jinja template '{{ ds }}' is not actually a 'yyyy-mm-dd' string at DAG-definition time, so it cannot be converted to a date with the datetime package; it is only rendered into a date string after the task executes.
You should dynamically generate two DAGs, but you can reuse the same code for that. This is the power of Airflow: it is Python code, so you can easily use the same code to generate the same DAG "structure" with two different dag_ids and two different schedules.
See this nice article from Astronomer with some examples: https://www.astronomer.io/guides/dynamically-generating-dags

Airflow 1.10.10 Schedule change for DAG

I am using Airflow 1.10.10 and wanted to know how to change an Airflow DAG's schedule. In most of the advice I found online, it is suggested that to change a DAG's schedule you should create a new DAG with a new dag_id (or change the dag_id of the existing DAG) and give it the new schedule_interval; changing the schedule of an existing DAG in place supposedly will not work in a straightforward manner and will throw errors or create scheduling problems.
However, I tried to reproduce such an erroneous case by changing only the schedule_interval in the DAG file. I tried the sequence of schedule changes below, and everything worked as expected: the schedule changed properly and no erroneous case appeared.
Started with @daily
Changed to 10 min
Changed to 17 min
Changed to 15 min
Changed to 5 min
Can someone please clarify what kind of problems may arise if we change the schedule_interval of a DAG without changing its ID?
I do see this recommendation on the old Airflow Confluence page on Common Pitfalls.
When needing to change your start_date and schedule interval, change the name of the dag (a.k.a. dag_id) - I follow the convention : my_dag_v1, my_dag_v2, my_dag_v3, my_dag_v4, etc...
Changing schedule interval always requires changing the dag_id, because previously run TaskInstances will not align with the new schedule interval
Changing start_date without changing schedule_interval is safe, but changing to an earlier start_date will not create any new DagRuns for the time between the new start_date and the old one, so tasks will not automatically backfill to the new dates. If you manually create DagRuns, tasks will be scheduled, as long as the DagRun date is after both the task start_date and the dag start_date.
I don't know the author's intent, but I imagine changing the schedule_interval can cause confusion for users. When they revisit these tasks, they will wonder why the current schedule_interval does not match past task executions, because that information is not stored at the task level.
Changing the schedule_interval does not impact past dagruns or tasks. The change will affect when new dagruns are created, which impacts the tasks within those dagruns.
I personally do not modify the dag_id when I update a DAG's schedule_interval, for two reasons.
If I keep the previous DAG, I am unnecessarily inducing more stress on the scheduler for processing a DAG that will not be turned on.
If I do not keep the previous DAG, I essentially lose all the history of the dagrun where it had a different schedule_interval.
Edit: It looks like there is a GitHub issue created to move the Common Pitfalls page, but it is stale.

Will airflow pick up dynamically generated schedule interval?

I've been running airflow 1.9.0 and using dynamically generated schedule intervals.
Simply put, I take a US/Eastern timestamp from a config file, get the current system timezone (which can be either EDT or EST), convert the timestamp to UTC, and then turn it into a cron expression.
For example, if I launch the dag today (2018-07-23, EDT) and my input is 6AM US/Eastern, it will result in a dag whose schedule interval is 10AM UTC or 0 10 * * 1-5.
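That conversion can be sketched in plain Python. eastern_to_utc_cron is a hypothetical helper mirroring the description, not the asker's actual code; it freezes the US/Eastern-to-UTC offset in effect on the given date.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

def eastern_to_utc_cron(hour, minute=0, dow="1-5", on=None):
    # Take the US/Eastern -> UTC offset as of `on` (default: today)
    # and emit a cron expression in UTC. The result differs by one
    # hour depending on whether `on` falls in EDT or EST.
    on = on or datetime.now(ZoneInfo("US/Eastern"))
    local = on.replace(hour=hour, minute=minute, second=0, microsecond=0)
    utc = local.astimezone(ZoneInfo("UTC"))
    return f"{utc.minute} {utc.hour} * * {dow}"

# 2018-07-23 is in EDT (UTC-4), so 6 AM Eastern -> 10 AM UTC
print(eastern_to_utc_cron(6, on=datetime(2018, 7, 23, tzinfo=ZoneInfo("US/Eastern"))))
```

Note that the emitted cron string depends on when the DAG file is parsed, which is exactly what the question below is about.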
My question is:
if I leave the dag running on a daily basis, will its schedule automatically update to 0 11 * * 1-5 in November, when daylight savings ends?
I specifically want to avoid using tz-aware datetimes in scheduling these dags, that's why I came up with this hacky way of timestamp conversion.
What library or code are you using to do the conversion between your Eastern timestamp and generate the cron expression? I think answering this part of your question is dependent on that information.
Anyway, this idea sounds like a code smell to me. While it would technically work, assuming your library supports that use case correctly and the timezone database is kept up to date, I believe you're better off going with the standard route: determine the crontab schedule you want upfront and use it consistently.
It's also a best practice to not use local time zone, for example, in the case where you move your server from Eastern to Pacific or operate multiple servers in different timezones — using UTC everywhere keeps it simple as you scale up.
Since UTC does not have daylight saving time, this will help you avoid things like DST bugs that you'd otherwise have to address if not using UTC.
Additionally, the official Airflow docs recommend against using naive datetimes:
Because Airflow uses time-zone-aware datetime objects. If your code creates datetime objects they need to be aware too.
...
Although Airflow operates fully time zone aware, it still accepts naive date time objects for start_dates and end_dates in your DAG definitions. This is mostly in order to preserve backwards compatibility.
...
Unfortunately, during DST transitions, some datetimes don’t exist or are ambiguous. In such situations, pendulum raises an exception. That’s why you should always create aware datetime objects when time zone support is enabled.
https://github.com/apache/incubator-airflow/blob/master/docs/timezone.rst
Can you elaborate on your use case for using naive datetimes vs timezone aware datetimes? I'd be happy to add more specific advice about that.

Why is it recommended against using a dynamic start_date in Airflow?

I've read Airflow's FAQ about "What's the deal with start_date?", but it still isn't clear to me why it is recommended against using dynamic start_date.
To my understanding, a DAG's execution_date is determined by the minimum start_date between all of the DAG's tasks, and subsequent DAG Runs are run at the latest execution_date + schedule_interval.
If I set my DAG's default_args start_date to be for, say, yesterday at 20:00:00, with a schedule_interval of 1 day, how would that break or confuse the scheduler, if at all? If I understand correctly, the scheduler would trigger the DAG with an execution_date of yesterday at 20:00:00, and the next DAG Run would be scheduled for today at 20:00:00.
Is there some concept that I'm missing?
The first run would be at start_date + schedule_interval; a DAG never runs at start_date itself, it always runs at start_date + schedule_interval.
As the docs mention, if you give a dynamic start_date such as datetime.now() together with a schedule_interval of, say, one hour, that run will never execute: now() moves along with time, so the moment datetime.now() + 1 hour is always in the future.
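The moving-target effect can be seen without Airflow at all. The scheduler creates the first run once start_date + schedule_interval has passed; with a dynamic start_date, that moment is recomputed on every parse of the DAG file. A plain-Python sketch:

```python
from datetime import datetime, timedelta

def first_run_due(start_date, interval):
    # the scheduler triggers the first run once this moment has passed
    return start_date + interval

interval = timedelta(hours=1)

# Static start_date: the due time is fixed, so it will eventually pass.
static_due = first_run_due(datetime(2019, 1, 1), interval)

# Dynamic start_date: re-evaluated on every parse of the DAG file, so
# the due time is always about one hour in the future and never passes.
parse_1 = first_run_due(datetime.now(), interval)
# ...the scheduler re-parses the file a little later...
parse_2 = first_run_due(datetime.now(), interval)
assert parse_2 >= parse_1  # the target keeps receding
```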
The scheduler expects to see a constant start date and interval. If you change them, the scheduler might not notice until it reloads the DagBag, and if the new start date doesn't line up with your old schedule it might break depends_on_past behavior.
If you don't need depends_on_past, the simplest option might be to stop using the scheduler: set the start date to some arbitrary old date and trigger the DAG externally however you like, using a crontab or similar.
