When I trigger the DAG manually, sometimes it does not start at all; every task just stays in the 'no status' state. My current workaround is to mark each task as 'success' by hand and then restart the Airflow scheduler and webserver.
Can you please suggest how I can solve this?
Thanks,
Samanth
I have created a DAG on AWS Managed Airflow (MWAA). My tasks get stuck in the queued state and never run; I need to delete and redeploy the DAG to make it run, and if I trigger it again after a successful run I face the same issue.
Please help me with this issue, and let me know how I can check the Airflow scheduler in the AWS console.
I have the directory for my dag in the airflow/dags directory, and when calling airflow dags list while logged into the webserver, the dag's ID shows up in the list. However, calling airflow dags list while logged into the scheduler returns the following error:
Killed
command terminated with exit code 137
The DAG also does not show up in the list in the webserver UI. When I enter the dag_id manually in the URL, the DAG appears with every task in the right place, but triggering a manual run via the Trigger DAG button results in a pop-up stating Cannot find dag <dag_id>. Has anyone run into this issue before? Is this a memory problem?
My DAG code is written in Python, and the resulting DAG object has a large number of tasks (>80).
Running on Airflow 1.10.15 with the Kubernetes executor.
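For scale, the parsed DAG ends up with 80+ task objects; a simplified placeholder (not my actual code, and the dag_id, task names, and commands here are made up) would look something like this:

# Placeholder only -- illustrates a DAG that fans out into ~80 similar tasks.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # 1.10.x import path

with DAG(
    dag_id="example_large_dag",            # made-up dag_id
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    for i in range(80):
        BashOperator(
            task_id="process_item_{}".format(i),
            bash_command="echo processing item {}".format(i),
        )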
I need to restart Airflow. I want to make sure I do it when it's idle, so that I don't interrupt a job by restarting the worker component of Airflow.
How do I see what DAGs are running?
I don't see anything in the UI that would list currently running DAGs.
I don't see any command in the airflow CLI to list currently running DAGs.
I found airflow shell that lets me connect to the DB, but I don't know enough about Airflow internals to know where to look to see what's running.
You can also query the database to get all running tasks at once:
select * from task_instance where state='running'
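If you'd rather go through Airflow's own ORM than raw SQL, something along these lines should also work (a sketch; it assumes you run it on a host where Airflow is installed and configured so the metadata database is reachable):

# Sketch: list running task instances via Airflow's ORM instead of raw SQL.
from airflow.models import TaskInstance
from airflow.settings import Session
from airflow.utils.state import State

session = Session()
for ti in session.query(TaskInstance).filter(TaskInstance.state == State.RUNNING):
    print(ti.dag_id, ti.task_id, ti.execution_date)
session.close()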
You can also use the CLI command airflow jobs check, which returns "No alive jobs found." in the event no jobs are running.
I found it: it's in the UI, on the DAGs page, in the second circle under "Recent Tasks".
I have created a DAG, it is available in the Airflow UI, and I turned it on to run it. After running the DAG, its status showed "up for retry". I then went to the server and ran the command "airflow scheduler", and after that the DAG run succeeded.
The scheduler was already up and running before I triggered the DAG, and I am not sure why this happened.
Do we need to run the airflow scheduler whenever we create a new DAG?
I want to know how the scheduler works.
Thanks
You can look at the airflow scheduler as an infinite loop that checks tasks' states on each iteration and triggers tasks whose dependencies have been met.
The entire process generates a lot of data that piles up on each round, and at some point it can render the scheduler useless as its performance degrades over time. This depends on your Airflow version: it seems to be solved in the newest version (2.0), but for older ones (< 2.0) the recommendation was to restart the scheduler every run_duration seconds, with some people recommending once an hour or once a day. So, unless you're working on Airflow 2.0, I think this is what you're experiencing.
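Conceptually (and this is only a rough sketch with made-up helper names, not Airflow's actual implementation), a pre-2.0 scheduler behaves roughly like the loop below, which is where the run_duration setting comes in:

# Conceptual sketch only -- not Airflow's real code.
import time

def parse_dag_folder():
    """Stand-in for re-parsing the DAG files on every iteration."""
    return []  # would return task instances that are candidates to run

def scheduler_loop(run_duration=3600, heartbeat=1):
    start = time.time()
    while True:
        for ti in parse_dag_folder():
            # In the real scheduler: check upstream state, pool slots, concurrency,
            # then hand runnable task instances to the executor.
            pass
        # Airflow < 2.0: exit after run_duration seconds so a supervisor
        # (systemd, a container restart policy, ...) starts a fresh scheduler process.
        if 0 < run_duration < time.time() - start:
            break
        time.sleep(heartbeat)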
You can find references to this scheduler-restarting issue in posts made by Astronomer here and here.
I am using Google Cloud Composer for Apache Airflow. When I try to mark one of the failed tasks as success, I get an "Exception: Ooops" error.
When I try it from the DAG page in the UI, it does not list any task instance that needs to be marked as success. But when I look under Browse >> Task Instances, I can see the failed tasks.
No tasks are listed when I try to mark the failed job as success.
And when I go to Browse >> Task Instances, select the failed instances, and try With Selected >> Set state to success, it gives the Oops exception.
How can I mark the failed job as success in Airflow, so that the downstream tasks can be triggered?
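For reference, what I am effectively trying to achieve is the equivalent of the following metadata-database update (a sketch only; the dag_id and task_id are placeholders, and I would prefer to do this through the UI):

# Sketch: set a failed task instance's state to success directly in the metadata DB.
from airflow.models import TaskInstance
from airflow.settings import Session
from airflow.utils.state import State

session = Session()
ti = (
    session.query(TaskInstance)
    .filter(
        TaskInstance.dag_id == "my_dag",        # placeholder
        TaskInstance.task_id == "failed_task",  # placeholder
        TaskInstance.state == State.FAILED,
    )
    .one_or_none()
)
if ti is not None:
    ti.state = State.SUCCESS
    session.commit()
session.close()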