Dynamically created DAGs get deleted automatically from Airflow UI

I am quite new to Apache Airflow, so I am not sure I know all of its features yet.
My requirement is to create dynamic DAGs for entries that are newly added to the database. The criterion for deciding whether an entry is new is whether a dag_id has already been recorded against it. If not, we need to create a new DAG for it and write the dag_id back to that entry.
New entry in the database, with Dag_id as Null:
Id: d001, Dag_id: Null
Once the main DAG (which is built to create the dynamic DAGs) runs and fetches this entry, a DAG is created and the dag_id is written back to the database:
Id: d001, Dag_id: dagid-001
The challenge I am facing: once the main DAG runs again to fetch newly added entries (this time d001 will not be retrieved), the dynamic DAGs (dagid-001) that were previously created get deleted from the Airflow UI and from airflow list_dags as well.
Code from the main DAG that creates a dynamic DAG:
dag_id = 'hello_world_{}'.format(str(new_entry[0]))
default_args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 1, 1),
}
schedule = ## frequency
dag_number = new_entry

globals()[dag_id] = create_dag(dag_id,
                               schedule,
                               dag_number,
                               default_args)
Is this the expected behavior of dynamic DAGs, or am I missing something while creating them?

For Airflow, a DAG exists only if the corresponding DAG object is found while scanning the Python files in the DAGs directory.
So your main DAG has to generate DAG objects not only for new DB entries, but for all previous entries as well.
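For illustration, a minimal sketch of such a generator file. fetch_all_entries() is a hypothetical placeholder for the real database query, create_dag() mirrors the factory pattern from the question, and the import paths follow the older Airflow 1.x layout the question appears to use:

# dynamic_dag_generator.py -- a minimal sketch, not the asker's exact code.
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator


def fetch_all_entries():
    # Hypothetical placeholder: replace with the real query that returns
    # *every* row, not just the rows whose dag_id is still NULL.
    return [('d001',), ('d002',)]


def create_dag(dag_id, schedule, dag_number, default_args):
    dag = DAG(dag_id, schedule_interval=schedule, default_args=default_args)
    with dag:
        DummyOperator(task_id='hello_world')
    return dag


default_args = {
    'owner': 'airflow',
    'start_date': datetime(2018, 1, 1),
}

# Register a DAG object for every entry on each parse of this file, so the
# previously created DAGs stay in the DagBag instead of disappearing.
for entry in fetch_all_entries():
    dag_id = 'hello_world_{}'.format(entry[0])
    globals()[dag_id] = create_dag(dag_id, '@daily', entry, default_args)

The key point is the loop over all entries: the file is re-parsed on every scheduler cycle, and only the DAG objects present in globals() at that moment remain in the DagBag.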

Related

In Airflow, can I create the pool inside the DAG if it does not exist?

I have a DAG that triggers an external DAG using TriggerDagRunOperator. The trigger DAG queries a database and, based on a type ID, triggers the associated external DAG along with the parameters needed for it. I would like to pass a pool name as part of these parameters, and I am wondering whether I can create the pool in the (external) DAG if it does not exist.
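One possible approach is to create the pool record yourself before any task that references it runs. The sketch below assumes roughly Airflow 2.0-2.6 (later releases add extra Pool columns such as include_deferred), and ensure_pool() is a hypothetical helper, not an official Airflow API:

# A minimal sketch; ensure_pool() is a hypothetical helper.
# Assumes roughly Airflow 2.0-2.6.
from airflow.models import Pool
from airflow.utils.session import create_session


def ensure_pool(name, slots, description=""):
    """Create the pool only if it does not exist yet."""
    with create_session() as session:
        existing = session.query(Pool).filter(Pool.pool == name).first()
        if existing is None:
            session.add(Pool(pool=name, slots=slots, description=description))


# Call this from a PythonOperator placed upstream of any task that
# references the pool, e.g. ensure_pool("export_pool", slots=4).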

Airflow dags lifecycle events

I am trying to manage Airflow DAGs (create, execute, etc.) via a Java backend. Currently, after creating a DAG and placing it in Airflow's dags folder, my backend keeps trying to run it. But it cannot run until it is picked up by the Airflow scheduler, which can take quite some time if there are many DAGs. I am wondering if there are any events that Airflow emits which I can tap into to check for new DAGs processed by the scheduler, and then trigger the execute command from my backend. Or is there a way or configuration where Airflow will automatically start a DAG once it processes it, rather than us triggering it?
is there a way or configuration where Airflow will automatically start a DAG once it processes it, rather than us triggering it?
Yes, one of the parameters that you can define is is_paused_upon_creation.
If you set your DAG as:
DAG(
    dag_id='tutorial',
    default_args=default_args,
    description='A simple tutorial DAG',
    schedule_interval='@daily',
    start_date=datetime(2020, 12, 28),
    is_paused_upon_creation=False,
)
The DAG will start as soon as it is picked up by the scheduler (assuming the conditions to run it are met).
I am wondering if there are any events that Airflow emits which I can tap into to check for new DAGs processed by the scheduler
In Airflow >= 2.0.0 you can use the REST API's list DAGs endpoint (GET /api/v1/dags) to get all DAGs that are in the DagBag.
In any Airflow version you can use this code to list the dag_ids (note that dag_ids is a property, not a method):

from airflow.models import DagBag

print(DagBag().dag_ids)
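As an example of polling the list DAGs endpoint from outside Airflow, here is a minimal sketch, assuming Airflow >= 2.0 with the stable REST API and the basic-auth backend enabled; the URL and credentials are placeholders:

# A minimal sketch; http://localhost:8080 and admin/admin are placeholders.
import requests

resp = requests.get(
    "http://localhost:8080/api/v1/dags",
    auth=("admin", "admin"),  # requires the basic_auth API backend
)
resp.raise_for_status()

# The payload contains a "dags" list; a DAG shows up here once the
# scheduler has parsed it into the DagBag.
dag_ids = [d["dag_id"] for d in resp.json()["dags"]]
print(dag_ids)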

Airflow scheduler won't queue some tasks that have been cleared in the UI

I have a bunch of DAG runs in the UI. If I clear some tasks across all the DAG runs, some of the tasks are correctly triggered, whereas others are stuck in a cleared state.
At the moment I am simply using the Airflow CLI to backfill these tasks. This works, but it is unfortunate that I need an unbroken CLI session to complete a clearing/reprocessing scenario.
The reason for this is the naming (and thus the type) of your DAG runs.
If you go into your Airflow metadata DB and open the dag_run table, you will see the run_id column. The scheduler identifies the runs it creates with "scheduled__" followed by a datetime. If you backfill, the run_id will be named "backfill_" followed by a datetime.
The scheduler will only check and queue tasks for run_ids that start with "scheduled__", denoting a DagRunType of "scheduled".
If you rename the run_id from backfill_ to scheduled__, the scheduler will identify the DAG runs and schedule the cleared tasks underneath them.
This SQL query will change backfill_ to scheduled__:
UPDATE dag_run
SET run_id = REPLACE(run_id, 'backfill_', 'scheduled__')
WHERE id IN (
    SELECT id FROM dag_run WHERE ("run_id"::TEXT LIKE '%backfill_%')
);
-- Note that backfill_ ends in a single underscore, while scheduled__ ends in two.
-- This is not a mistake in my case, but please review the values in your table.

Airflow on-demand DAG with multiple instances running at the same time

I am trying to see if Airflow is a good fit for this scenario. At present I have a DAG that looks for a trigger file on S3, creates an EMR cluster, submits a Spark job, and then deletes the EMR cluster.
My requirement is to convert this into an on-demand run. There will be many users running the export from the application. For each export run, I will have to call this DAG, which means there will be more than one instance of the same DAG running at the same time.
I know we can make an API call to trigger a DAG, but I am not sure whether we can run more than one instance of a DAG at the same time. Has anyone had a similar use case?
I am handling this with max_active_runs:
dag = DAG(
    'dev_clickstream_v1',
    max_active_runs=5,
    default_args=DEFAULT_ARGS,
    dagrun_timeout=timedelta(hours=2),
    params=PARAMS,
)
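To trigger the on-demand runs themselves, one option is the stable REST API's trigger-DAG-run endpoint, sketched below. This assumes Airflow >= 2.0 with the basic-auth backend enabled; the URL, credentials, and conf payload are placeholders:

# A minimal sketch; URL, credentials and conf are placeholders.
import uuid

import requests


def trigger_export(conf):
    """Kick off one more run of the DAG; each call creates a separate DagRun."""
    resp = requests.post(
        "http://localhost:8080/api/v1/dags/dev_clickstream_v1/dagRuns",
        auth=("admin", "admin"),                      # placeholder credentials
        json={
            "dag_run_id": "manual__{}".format(uuid.uuid4()),  # unique id per export
            "conf": conf,                                     # parameters for this run
        },
    )
    resp.raise_for_status()
    return resp.json()

# Up to max_active_runs (5 above) of these runs will execute concurrently;
# further triggered runs are queued until a slot frees up.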

Deleting a SubDag from Airflow's database

I created 4 SubDAGs within the main DAG, which will run on different schedule_intervals. I removed the operation of one SubDAG, but it still appears in Airflow's database. Will that entry in the database execute? Is there a way to delete it from Airflow's database?
The record will persist in the database; however, if the DAG isn't actually present on the scheduler (and workers, depending on your executor), it can't be added to the DagBag and won't be run.
Having a look at this simplified snippet of what the scheduler does:
def _do_dags(self, dagbag, dags, tis_out):
    """
    Iterates over the dags and schedules and processes them
    """
    for dag in dags:
        self.logger.debug("Scheduling {}".format(dag.dag_id))
        dag = dagbag.get_dag(dag.dag_id)
        if not dag:
            continue
        try:
            self.schedule_dag(dag)
            self.process_dag(dag, tis_out)
            self.manage_slas(dag)
        except Exception as e:
            self.logger.exception(e)
The scheduler will check if the dag is contained in the DagBag before it does any processing on it. Entries for DAGs are kept in the database to maintain the historical record of what dates have been processed should you re-add it in the future. But for all intents and purposes, you can treat a missing DAG as a paused DAG.
