I am very new to Airflow and have set everything up according to what is stated on their website. However, I find it very confusing to figure out my dag folder location. NOTE: I configured dags_folder in airflow.cfg to point to /airflow/dags, and this folder has two files:
/airflow/dags
---dag1.py
---dag2.py
But when I run airflow list_dags, it still shows the dags inside the example_dags folder at
/usr/local/lib/python2.7/dist-packages/airflow/example_dags
How can I see which path airflow list_dags is using, and how can I change it? Any help is appreciated.
There is an airflow.cfg value under the [core] section called load_examples that you can set to false to exclude the example DAGs. I think that should clean up the output you’re seeing from list_dags.
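For reference, the relevant airflow.cfg settings would look like this (the dags_folder value here is just the path from the question above; adjust it to your own setup):

```
[core]
# where Airflow looks for your DAG files
dags_folder = /airflow/dags
# set to False so the bundled example DAGs are not loaded
load_examples = False
```

Restart the webserver and scheduler after changing these so the new values are picked up.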
Related
I am running Airflow 2.0, setting up an Airflow DAG for the first time, and following the quick start tutorial.
After creating and running the .py file, I don't see the DAG created; it does not show up in the list for me.
setting:
airflow.cfg:dags_folder = /Users/vik/src/airflow/dags
my python file is in this folder. There are no errors here.
I am able to see the example DAGs from example_dags.
I did airflow db init
airflow webserver
airflow scheduler
then tried to list the dags
I think I am missing something
I don't know exactly how you installed everything, but I highly recommend the Astronomer CLI for simplicity and a quick setup. With that you'll be able to set up a first DAG pretty quickly. There is also a video tutorial that helps you understand how to install and set everything up.
A few things to try:
Make sure the scheduler is running (run airflow scheduler), or try restarting it.
Using the Airflow CLI, run airflow config list and make sure that the loaded config is in fact what you are expecting, check the value of dags_folder.
Try running airflow dags list from the CLI, and check the filepath if your DAG is shown in the results.
If there was an error parsing your DAG, and it therefore could not be loaded by the scheduler, you can find the logs in ${AIRFLOW_HOME}/logs/scheduler/${date}/your_dag_id.log
I placed a dag file in the dags folder based on a tutorial, with slight modifications, but it doesn't show up in the GUI or when I run airflow dags list.
Answering my own question: check the Python file for exceptions by running it directly. It turns out that an exception in the dag's Python script, caused by a missing import, made the dag not show up in the list. I note this in case another new user comes across the same thing. The moral of the story is that dag files should be checked by running them directly with python whenever they are modified, because otherwise there won't be an obvious error; they may just disappear from the list.
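The simplest version of this check is just `python /path/to/dag_file.py`, which prints a traceback for any ImportError or similar instead of the file silently vanishing. As a sketch, the syntax-error part of the check can also be automated over a whole folder with nothing but the standard library (the folder path is whatever your dags_folder points at):

```python
import pathlib
import py_compile

def check_dag_folder(dags_folder):
    """Byte-compile every .py file in the folder and return the ones that
    fail, i.e. files the scheduler would silently drop for syntax errors."""
    broken = []
    for path in pathlib.Path(dags_folder).glob("*.py"):
        try:
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError as exc:
            broken.append((path.name, str(exc)))
    return broken
```

Note this only catches syntax errors; a missing import, like the one in the answer above, only surfaces when the module actually runs, so running the file directly with python remains the most reliable check.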
I am currently using Airflow version 1.10.10.
After opening the airflow/logs folder, there are many folders named after your DAGs, but there is also a folder named scheduler which contains folders named in date format (e.g. 2020/07/08), going back to the date when I first started using Airflow. After searching through multiple forums, I'm still not sure what these logs are for.
Anyway, the problem is that I keep wondering whether it is okay to delete the contents of the scheduler folder, since it takes up so much space, unlike the rest of the folders that are named after DAGs (I'm assuming that's where the logs of each DAG run are stored). Will deleting the contents of the scheduler folder cause any error or loss of DAG logs?
This might be a silly question, but I want to make sure, since Airflow is on a production server. I've tried creating an Airflow instance locally and deleting the scheduler folder contents, and no errors seem to have occurred. Any feedback and shared experience on handling this issue is welcome.
Thanks in Advance
It contains the logs of the Airflow scheduler, afaik. I have used it only once, for a problem with SLAs.
I've been deleting old files in it for over a year and have never encountered a problem. This is my command to delete old scheduler log files:
find /etc/airflow/logs/scheduler -type f -mtime +45 -delete
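The same cleanup as a small Python sketch, for anyone who prefers scripting it; the 45-day cutoff mirrors the find command above, and the directory you pass in should be your own scheduler log path:

```python
import time
import pathlib

def delete_old_logs(log_dir, max_age_days=45):
    """Delete regular files under log_dir whose modification time is older
    than max_age_days, like `find <dir> -type f -mtime +45 -delete`."""
    cutoff = time.time() - max_age_days * 86400
    deleted = []
    for path in pathlib.Path(log_dir).rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            deleted.append(path)
    return deleted
```

As with the find command, this only removes files, so the dated directory structure is left in place.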
Is there a setting in Cloud Composer / Airflow that can disable new DAGs in the DAGs folder by default, without the need to specify this in the DAG files themselves?
I want to be able to load these DAGs in to a development environment where users should just run these DAGs manually rather than them being scheduled.
I had a look here, https://github.com/apache/airflow/blob/master/airflow/config_templates/default_airflow.cfg
but I couldn't find anything obvious.
Yes, there is one: it's called dags_are_paused_at_creation
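It sits under the [core] section of airflow.cfg (in Cloud Composer you can set it through the Airflow configuration overrides):

```
[core]
# new DAGs start paused; they only run when a user unpauses or triggers them
dags_are_paused_at_creation = True
```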
I'm having trouble updating a dag file. Airflow still has an old version of my dag file: I added a task, but it seems not to be updated when I check the log and the UI (DAG → Code).
I have very simple tasks.
I of course checked the dag directory path in airflow.cfg and restarted airflow webserver/scheduler.
I have no issue running it (but with the old dag file).
Looks like a bug in Airflow. A temporary solution is to delete the task instances from the Airflow db with
delete from task_instance where dag_id='<dag_name>' and task_id='<deleted_task_name>';
This should be simpler and less impactful than the resetdb route, which would delete everything, including variables and connections set before.
Use the terminal and run the command below soon after changing the dag file.
airflow initdb
This worked for me.
You can try removing the old .pyc file for that dag in the dags folder; it will be regenerated automatically.
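A small sketch of that cleanup using only the standard library (pass in whatever your dags_folder points at); it removes compiled .pyc files, including those inside __pycache__, and Python recreates them the next time the DAG file is imported:

```python
import pathlib

def remove_pyc_files(dags_folder):
    """Delete stale .pyc files under the DAG folder, leaving the
    .py source files untouched."""
    removed = []
    for path in pathlib.Path(dags_folder).rglob("*.pyc"):
        path.unlink()
        removed.append(path)
    return removed
```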
The UI is sometimes not up to date for me, but the code is actually there in the DagBag. You can try to:
Use the refresh button to see if the code refreshed
Use a higher version (1.8+); this happened to me when I used 1.7.x, but with 1.8+ it seems much better after you refresh the dag in the UI
You can also use "airflow test" to see if the code is in place, and try the advice from #Him as well.
Same thing happened to me.
In the end the best thing was to resetdb, add the connections and import the variables again, then run airflow initdb and set the scheduler going again.
I don't know why this happens. Does anybody know? It seems not so easy to add tasks or change names once compiled. Removing *.pyc files or the logs folder did not work for me.
On the DAG page of the Airflow webserver, delete the DAG. It will delete the record in the database. After a while the DAG will appear again on the page, but the old task_id will be removed.