I am new to Airflow, and I want to hide the Trigger DAG button in the web UI so that everyone has to trigger DAGs manually via the CLI. Please help me with what changes are needed in the airflow.cfg file.
I think the easiest approach for you would be to enable RBAC (role-based access control) and then restrict the right to run jobs. This way, people wouldn't be able to run anything from the UI:
https://airflow.readthedocs.io/en/latest/howto/add-new-role.html
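For Airflow 1.10.x, switching on the role-based UI is a one-line change in airflow.cfg — a minimal sketch (the webserver needs a restart afterwards, and per-role permissions then decide who may trigger DAGs from the web interface):

```ini
[webserver]
# Serve the FAB-based, role-aware UI instead of the legacy admin UI.
rbac = True
```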
Related
We want to prevent users from manually editing or adding variables/connections through the Airflow GUI in production. Currently, we use JSON files that load all the connections and variables into Airflow.
Can experts guide me on how to achieve this?
Sergiy's right: you will need to start with Airflow's role-based access control (RBAC) documentation and implement from there. Here's a place to start: https://airflow.readthedocs.io/en/latest/howto/add-new-role.html
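Since you already keep connections and variables in JSON, the 1.10-era CLI can load them without anyone touching the GUI — a sketch with placeholder file paths and connection ids (exact flags vary between 1.10 minor versions, so check `airflow variables --help` and `airflow connections --help` on your install):

```shell
# Bulk-import variables from a JSON file (a flat key -> value object).
airflow variables -i /path/to/variables.json

# Add a connection from the command line instead of the Admin UI.
airflow connections -a --conn_id my_postgres \
    --conn_uri 'postgresql://user:pass@host:5432/mydb'
```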
Airflow version: 1.10.2
Ubuntu: 18.04 (bionic)
Python: 3.6.5
Issue: I am not sure how, but the connections are not visible when I click Admin in the menu. Has anyone faced this issue before?
When I edit the URL and go to localhost:8080/admin/connections, I see the response below. This was working fine before.
But when I list the connections from the Airflow CLI, it works. I am not sure why they are not visible in the UI yet accessible from the CLI. How should I give the UI user access to Connections?
This is due to a change in 1.10.2. Prior to 1.10.2, there was a hardcoded superuser flag for users.
To give an existing user superuser permissions, so that they can manage connections, variables, etc., you need to toggle the superuser flag in the users table in Airflow's metadata database.
They document how to make a user a superuser using code in the UPDATING.md file, see https://github.com/apache/airflow/blob/master/UPDATING.md#user-model-changes
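The relevant snippet from UPDATING.md looks roughly like this — a sketch to run in a Python shell against your configured Airflow installation (the username 'admin' is a placeholder for your own user):

```python
# Sketch based on the "User model changes" note in Airflow's UPDATING.md:
# flip the superuser flag on an existing (non-RBAC) user record.
from airflow import models, settings

session = settings.Session()
user = session.query(models.User).filter(models.User.username == 'admin').first()
user.superuser = True
session.merge(user)
session.commit()
```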
It worked! I set rbac = True in airflow.cfg and then ran airflow initdb.
I am not sure why or how this issue occurred, but the above steps made the Connections UI visible again.
Is there a setting in Cloud Composer / Airflow that can disable new DAGs in the DAGs folder by default, without the need to specify this in the DAG files themselves?
I want to be able to load these DAGs into a development environment where users should just run them manually rather than having them scheduled.
I had a look here, https://github.com/apache/airflow/blob/master/airflow/config_templates/default_airflow.cfg
but I couldn't find anything obvious.
Yes, there is one: it's called dags_are_paused_at_creation.
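The setting lives in the [core] section of airflow.cfg (in Cloud Composer you would apply it as an Airflow configuration override, where permitted). With it enabled, every newly discovered DAG starts paused until someone unpauses or manually triggers it:

```ini
[core]
# New DAGs are created in the paused state and are never scheduled
# until a user unpauses them.
dags_are_paused_at_creation = True
```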
I want to modify the schedule of a DAG I created in the dags/ folder through the Airflow UI, but I can't find a way to do it there. Can it be done, or can it only be changed by modifying the Python script?
The only way to change it is through the code. Because the schedule is part of the DAG definition (like the tasks and dependencies), it cannot be changed through the web interface.
Is there any way to reload the DAGs without having to restart the server?
In your airflow.cfg, these two settings (both in seconds) control this behavior:
# after how much time new DAGs should be picked up from the filesystem
min_file_process_interval = 0
dag_dir_list_interval = 60
You might have to restart the webserver, scheduler, and workers for the new configuration to take effect.
I had the same question, and didn't see this answer yet. I was able to do it from the command line with the following:
python -c "from airflow.models import DagBag; d = DagBag();"
When the webserver is running, it refreshes dags every 30 seconds or so by default, but this will refresh them in between if necessary.
DAGs should be reloaded when you update the associated Python file.
If they are not, first try to refresh them manually in the UI by clicking the button that looks like a recycle symbol:
If that doesn't work, delete all the .pyc files in the dags folder.
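Clearing the stale bytecode can be scripted — a sketch assuming the default ~/airflow/dags layout (adjust DAGS_FOLDER to your setup):

```shell
# Remove compiled Python files so Airflow re-parses the source .py files.
DAGS_FOLDER="${AIRFLOW_HOME:-$HOME/airflow}/dags"   # assumption: default layout
find "$DAGS_FOLDER" -name '*.pyc' -delete 2>/dev/null || true
```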
Usually though, when I save the python file the dag gets updated within a few moments.
I'm pretty new to Airflow. I had initially used sample code, which got picked up right away, and then edited it to call my own code.
The edited code was ultimately raising an error, but I only found this out once I had deleted the DAG containing the example code via the trash button in the Airflow webserver's UI.
Once it was deleted, the UI showed me the error that had been preventing the new DAG from loading.