Email alerts received even after deletion of Airflow DAG - airflow

I had created a scheduled DAG in Airflow which used to send an email alert whenever it exceeded a particular time threshold. I have now deleted the DAG's .py file and also deleted it from the Airflow UI, as I no longer want it. Still, I am getting email alerts every day. Can anyone help with this?

Related

Airflow 2.2.4 manually triggered DAG stuck in 'queued' status

I have two DAGs in my Airflow scheduler, which were working in the past. After needing to rebuild the Docker containers running Airflow, they are now stuck in 'queued'. The DAGs in my case are triggered via the REST API, so no actual scheduling is involved.
Since there are quite a few similar posts, I ran through the checklist of this answer from a similar question:
Do you have the airflow scheduler running?
Yes!
Do you have the airflow webserver running?
Yes!
Have you checked that all DAGs you want to run are set to On in the web ui?
Yes, both DAGs are shown in the WebUI and no errors are displayed.
Do all the DAGs you want to run have a start date which is in the past?
Yes, the constructor of both DAGs looks as follows:
dag = DAG(
    dag_id='image_object_detection_dag',
    default_args=args,
    schedule_interval=None,
    start_date=days_ago(2),
    tags=['helloworld'],
)
Do all the DAGs you want to run have a proper schedule which is shown in the web ui?
No, I trigger my DAGs manually via the REST API.
If nothing else works, you can use the web ui to click on the dag, then on Graph View. Now select the first task and click on Task Instance. In the paragraph Task Instance Details you will see why a DAG is waiting or not running.
Here is the output of what this paragraph is showing me:
What is the best way to find the reason why the tasks won't exit the queued state and run?
EDIT:
Out of curiosity I tried to trigger the DAG from within the WebUI, and now both runs executed (the one triggered from the WebUI failed, but that was expected, since no config was set).

Send an alert when a dag did not run google cloud

I have a DAG in Airflow where the run is not scheduled, but triggered by an event. I would like to send an alert when the DAG did not run in the last 24 hours. My problem is I am not really sure which tool is the best for the task.
I tried to solve it with the Logs Explorer. I was able to write quite a good query filtering by textPayload, but it seems that tool is designed to send an alert when a specific log is present, not when it is missing. (Maybe I missed something?)
I also checked Monitoring, where I could set up an alert when logs are missing; however, in this case I was not able to write any query that filters logs by textPayload.
Thank you in advance if you can help me!
You could set up a separate alert DAG that notifies you if other DAGs haven't run within a specified amount of time. To get the last run time of a DAG, use something like this:
from airflow.models import DagRun
dag_runs = DagRun.find(dag_id=dag_id)
dag_runs.sort(key=lambda x: x.execution_date, reverse=True)
Then you can use dag_runs[0] and compare with the current server time. If the date difference is greater than 24h, raise an alert.
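A minimal sketch of how that check could look as a task in the separate alert DAG; the monitored dag_id, the 24-hour threshold, and raising an exception to trigger the alert DAG's own failure notifications are placeholders for whatever you actually use:
from datetime import datetime, timedelta, timezone
from airflow.models import DagRun

MONITORED_DAG_ID = "my_event_triggered_dag"   # placeholder: the DAG you want to watch

def check_last_run():
    # Find all runs of the monitored DAG and sort them newest-first
    dag_runs = DagRun.find(dag_id=MONITORED_DAG_ID)
    dag_runs.sort(key=lambda run: run.execution_date, reverse=True)

    if not dag_runs:
        raise RuntimeError(f"{MONITORED_DAG_ID} has never run")

    last_run = dag_runs[0].execution_date
    # execution_date is timezone-aware (UTC), so compare against an aware "now"
    age = datetime.now(timezone.utc) - last_run

    if age > timedelta(hours=24):
        # Failing the task lets the alert DAG's email_on_failure / on_failure_callback
        # settings do the actual notification
        raise RuntimeError(f"{MONITORED_DAG_ID} last ran {age} ago ({last_run})")
Run this from a PythonOperator in the alert DAG and give that DAG its own regular schedule (e.g. hourly or daily) so the comparison happens continuously.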
I was able to do it in Monitoring. I did not need the filtering query I had used in the Logs Explorer. I created an Alerting Policy filtered by workflow_name, task_name and location. In the configure-trigger section I was able to choose "Metric absence" with a one-day absence time, which replaced my old query.
Of course it could also be solved by setting up a new DAG, but setting up an Alerting Policy seems easier.

Airflow DAG exists in dag_list, but cannot be triggered and does not show up in UI

I have the directory for my dag in the airflow/dags directory, and when calling airflow dags list while logged into the webserver, the dag's ID shows up in the list. However, calling airflow dags list while logged into the scheduler returns the following error:
Killed
command terminated with exit code 137
The DAG also does not show up in the list on the webserver UI. When I manually enter the dag_id in the URL, it shows up with every task in the right place, but triggering a manual run via the Trigger DAG button results in a pop-up stating Cannot find dag <dag_id>. Has anyone run into this issue before? Is this a memory problem?
My DAG code is written in Python, and the resulting DAG object has a large number of tasks (>80).
Running on Airflow 1.10.15 with the Kubernetes executor.

Datadog airflow alert when a dag is paused

We recently encountered a scenario where someone mistakenly turned off a production DAG, and we want to get an alert through Datadog whenever a DAG is paused.
I have checked https://docs.datadoghq.com/integrations/airflow/?tab=host
But I have not found any metric for a DAG that indicates whether it is paused or not.
I can also run a custom script in Datadog.
One option is to exec into the Postgres pod and get the list of paused DAGs:
select * from dag where is_paused=true;
Or is there any other way I can get the list of unpaused DAGs, and what is the best way to handle it when a new DAG is added?
I want an alert whenever an unpaused DAG gets paused.
If you are on Airflow 2 you can use the REST API to query for the state of the DAG.
https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/get_dag
There is an "is_paused" field.
And if you are not on Airflow 2, you should be. Airflow 1.10 is end-of-life and will not receive any fixes (including critical security fixes), so you should upgrade as soon as you can.

How to export lastest airflow dag and task status in tabular format for specific owners

Airflow provides REST API functionality to extract DAG/task status.
https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#section/Trying-the-API
But I am wondering if there is a way to get the latest DAG/task status of all DAGs for a given DAG owner, without specifying each dag_id manually.
This would help us create a workflow dashboard for business users.
You can make use of the dag, dag_run and task_instance tables in the Airflow metadata database. It's fairly straightforward.
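A rough sketch of that approach using Airflow's ORM models rather than raw SQL; the owner name is a placeholder, the owners column is a comma-separated string (hence the LIKE filter), and a similar query on task_instance would add per-task status:
from airflow import settings
from airflow.models import DagModel, DagRun

OWNER = "my_team"  # placeholder: the DAG owner you care about

session = settings.Session()

# dag table: pick out the DAG ids belonging to the owner
owned_dag_ids = [
    d.dag_id
    for d in session.query(DagModel).filter(DagModel.owners.like(f"%{OWNER}%"))
]

# dag_run table: latest run (and its state) per owned DAG
for dag_id in owned_dag_ids:
    last_run = (
        session.query(DagRun)
        .filter(DagRun.dag_id == dag_id)
        .order_by(DagRun.execution_date.desc())
        .first()
    )
    if last_run is not None:
        print(dag_id, last_run.execution_date, last_run.state)
Dumping those rows into a CSV or a small reporting table gives you the tabular export for the dashboard.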

Resources