I'm running a distributed Airflow 2.4.0 setup using the official Docker image. All the containers use the same .env file and the same version of the Airflow image. When I log into one of the Airflow containers I get this warning:
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:545: DeprecationWarning: The sql_alchemy_conn option in [core] has been moved to the sql_alchemy_conn option in [database] - the old setting has been used, but please update your config.
option = self._get_environment_variables(deprecated_key, deprecated_section, key, section)
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:545: DeprecationWarning: The auth_backend option in [api] has been renamed to auth_backends - the old setting has been used, but please update your config.
option = self._get_environment_variables(deprecated_key, deprecated_section, key, section)
/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:367: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
FutureWarning,
I checked the airflow.cfg inside the container and it has the up-to-date variables. Why do I still get the warning messages?
You are seeing these warnings because of the section. airflow.cfg is a configuration file with sections, and settings are expected to be in the proper section.
In your case your airflow.cfg has sql_alchemy_conn where you override the default value. Prior to 2.3.0 this setting was in the core section, and in 2.3.0 it was moved to the database section (see PR).
What you need to do is simply open airflow.cfg and move the setting to the proper section. For example:
[core]
sql_alchemy_conn = sqlite:///{AIRFLOW_HOME}/airflow.db
to:
[database]
sql_alchemy_conn = sqlite:///{AIRFLOW_HOME}/airflow.db
The reason it's like that is also explained in the docs. Airflow references settings by environment variables of the format AIRFLOW__{SECTION}__{KEY}, so in this case it will be:
AIRFLOW__DATABASE__SQL_ALCHEMY_CONN, which is why the section is important for finding the variable.
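Since all your containers share the same .env file, the old names may also be set there as environment variables rather than in airflow.cfg, and environment variables take precedence over the config file. A minimal sketch of what the renamed variables could look like in that .env file (the connection string and backend list below are placeholders, not your actual values):
# .env (placeholder values)
AIRFLOW__DATABASE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
AIRFLOW__API__AUTH_BACKENDS=airflow.api.auth.backend.basic_auth,airflow.api.auth.backend.session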
Related
I would like cloud-init to not use 50-cloud-init.yaml. I have prepared my own file.
Do you know how to do this?
You can add
network:
  config: disabled
to /etc/cloud/cloud.cfg or a file in /etc/cloud/cloud.cfg.d.
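For example, a drop-in file could look like this (the filename is only an illustration):
# /etc/cloud/cloud.cfg.d/99-disable-network-config.cfg
network:
  config: disabled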
Another option is to add
network-config=disabled
to the kernel command line.
While the network config yaml technically works as userdata, the network configuration will have already been written out before userdata is read.
One other option is to write your netplan configuration into /etc/netplan/99-some-name.yaml. If you have configuration there that overlaps with what is in 50-cloud-init.yaml, your configuration will override what is in the default configuration.
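A minimal netplan override might look like this (the interface name and DHCP setting are only examples):
# /etc/netplan/99-some-name.yaml
network:
  version: 2
  ethernets:
    eth0:
      dhcp4: true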
See https://cloudinit.readthedocs.io/en/latest/topics/network-config.html#disabling-network-configuration .
Is there a way to encrypt the sql_alchemy_conn string in the Airflow config file? The password shown in the example is plaintext. What options are there to secure it? Also, if the password has special characters, how must they be escaped in the config file?
I'm trying to install Airflow using an airflow role.
# See: https://www.sqlalchemy.org/
sql_alchemy_conn:
  value: "postgresql+psycopg2://pgclusteradm%40servername:PLAINTEXTPASSWORD@server.postgres.database.azure.com/airflow2"
Is there a way to encrypt the password? I couldn't find how to do this.
You can provide the database URI through environment variables instead of the config file. This doesn't encrypt it or necessarily make it more secure, but it at least isn't plainly sitting in a permanent file.
In your airflow.cfg you can put a placeholder:
[core]
...
sql_alchemy_conn = override_me
...
Then set AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://... in an environment variable when you bring up Airflow components. This way of setting and overriding configuration options through environment variables is detailed in the docs, but the basic format is AIRFLOW__{SECTION}__{KEY}=<value>.
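For example, you might export it right before starting a component (the host, credentials and database name below are placeholders; note that special characters in the password have to be URL-encoded, e.g. @ becomes %40):
export AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql+psycopg2://airflow:p%40ssw0rd@db-host:5432/airflow"
airflow webserver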
There are 2 ways of securing this, as mentioned in the docs:
1) Environment Variable:
You can override the setting in airflow.cfg by setting the following environment variable:
AIRFLOW__CORE__SQL_ALCHEMY_CONN=my_conn_string
This way you can keep the setting in airflow.cfg empty so no one can view the password.
2) Get string by running command:
You can also derive the connection string at run time by appending _cmd to the key like this:
[core]
sql_alchemy_conn_cmd = bash_command_to_run
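As a sketch of how the _cmd variant could be used (the secrets file path here is just an example), you might keep the connection string in a file readable only by the airflow user and have Airflow read it at startup:
[core]
sql_alchemy_conn_cmd = cat /run/secrets/sql_alchemy_conn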
Airflow creates a unittest.cfg file in the path given by the AIRFLOW_HOME environment variable.
My question is: how can I point to unittest.cfg in the same way that I point to the airflow.cfg via the environment variable AIRFLOW_CONFIG?
The reason why I want to do this is because I don't want to have any config files in the AIRFLOW_HOME directory.
Also, if anyone knows, could you please explain what unittest.cfg is for, as there is no documentation I could find on it.
The unittest.cfg test configuration file is the default configuration file used when Airflow is running in test mode.
Test mode can be activated by setting the unit_test_mode option in airflow.cfg, or the AIRFLOW__CORE__UNIT_TEST_MODE environment variable, to True.
The configuration values in the test configuration file override those in airflow.cfg at runtime when test mode is activated.
# Source: https://github.com/apache/airflow/blob/1.10.5/airflow/configuration.py#L558,L561
def get_airflow_test_config(airflow_home):
    if 'AIRFLOW_TEST_CONFIG' not in os.environ:
        return os.path.join(airflow_home, 'unittests.cfg')
    return expand_env_var(os.environ['AIRFLOW_TEST_CONFIG'])
The AIRFLOW_TEST_CONFIG environment variable can be set to the path of your test configuration file.
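For example (the path below is just an illustration), you could keep the test configuration outside of AIRFLOW_HOME like this:
export AIRFLOW_TEST_CONFIG=/etc/airflow/unittests.cfg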
I'm trying to configure Airbnb Airflow to use the CeleryExecutor like this:
I changed the executor in the airflow.cfg from SequentialExecutor to CeleryExecutor:
# The executor class that airflow should use. Choices include
# SequentialExecutor, LocalExecutor, CeleryExecutor
executor = CeleryExecutor
But I get the following error:
airflow.configuration.AirflowConfigException: error: cannot use sqlite with the CeleryExecutor
Note that the sql_alchemy_conn is configured like this:
sql_alchemy_conn = sqlite:////root/airflow/airflow.db
I looked at Airflow's GIT (https://github.com/airbnb/airflow/blob/master/airflow/configuration.py)
and found that the following code throws this exception:
def _validate(self):
    if (
            self.get("core", "executor") != 'SequentialExecutor' and
            "sqlite" in self.get('core', 'sql_alchemy_conn')):
        raise AirflowConfigException("error: cannot use sqlite with the {}".
                                     format(self.get('core', 'executor')))
It seems from this validate method that the sql_alchemy_conn cannot contain sqlite.
Do you have any idea how to configure the CeleryExecutor without SQLite? Please note that I downloaded RabbitMQ for working with the CeleryExecutor, as required.
Airflow states that the CeleryExecutor requires a backend other than the default SQLite database. You have to use MySQL or PostgreSQL, for example.
The sql_alchemy_conn in airflow.cfg must be changed to follow the SQLAlchemy connection string structure (see the SQLAlchemy documentation).
For example,
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@127.0.0.1:5432/airflow
To configure Airflow for MySQL:
First install MySQL (this might help, or just google it).
Go to the Airflow installation directory, usually /home/<username>/airflow.
Edit airflow.cfg and locate
sql_alchemy_conn = sqlite:////home/vipul/airflow/airflow.db
If you have the default SQLite setting, add a # in front of it so it looks like
#sql_alchemy_conn = sqlite:////home/vipul/airflow/airflow.db
Add this line below it:
sql_alchemy_conn = mysql://<user>:<password>@localhost:3306/<database>
Save the file, run the command
airflow initdb
and done!
As other answers have stated, you need to use a different database besides SQLite. Additionally, you need to install RabbitMQ, configure it appropriately, and change each of your airflow.cfg files to have the correct RabbitMQ information. For an excellent tutorial on this, see A Guide On How To Build An Airflow Server/Cluster.
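As an illustration of where that RabbitMQ information usually ends up (the values below are placeholders, and the exact key names vary by Airflow version, e.g. older releases use celery_result_backend instead of result_backend), the broker and result backend are set in the [celery] section of airflow.cfg:
[celery]
# point the broker at your RabbitMQ instance and the result backend at your metadata DB
broker_url = amqp://guest:guest@rabbitmq-host:5672//
result_backend = db+postgresql://airflow:airflow@db-host:5432/airflow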
If you run it on a Kubernetes cluster, use the following config:
airflow:
  config:
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://postgres:airflow@airflow-postgresql:5432/airflow
I would like to set the configuration of my Symfony2 project using environment variables.
In the server I have defined:
SYMFONY__DATABASE__USER
SYMFONY__DATABASE__PASSWORD
SYMFONY__DATABASE__NAME
SYMFONY__DATABASE__HOST
SYMFONY__DATABASE__DRIVER
My parameters.yml.dist looks like this:
#app/config/parameters.yml.dist
parameters:
    database_host: "%database.host%"
    database_port: ~
    database_name: "%database.name%"
    database_user: "%database.user%"
    database_password: "%database.password%"
    database_driver: "%database.driver%"
When I run composer I get an exception:
composer install --dev --no-interaction --prefer-source
[Symfony\Component\DependencyInjection\Exception\ParameterNotFoundException]
You have requested a non-existent parameter "database.driver". Did you mean one of these: "database_user", "database_driver"?
These variables are defined on the server, so I could modify parameters.yml.dist to define these values. But this does not seem like the right way, because what I really want to use are the environment variables.
Note: I want to read these environment variables in Travis, Heroku and my Vagrant machine. I only want to have the Vagrant machine variables in the repository.
What is the proper way to do this?
How should my parameters.yml.dist look?
It looks like you are doing everything okay.
Here is the complete documentation for Setting Environment Variables which I believe you already read.
What is important to note is this:
Also, in order for your console to work (which does not use Apache),
you must export these as shell variables. On a Unix system, you can
run the following:
$ export SYMFONY__DATABASE__USER=user
$ export SYMFONY__DATABASE__PASSWORD=secret
I remember I once had a similar issue: I was setting everything in Apache, but when running commands it wasn't working because I forgot to export the variables on the system.
Be aware that using export is a temporary solution; if you restart your server those values will be lost, and you will need to set them up in a permanent way according to your OS.
I think you solved this a long time ago, but the problem is actually that you have two underscores between DATABASE and USER, and the parser for this has a string replace function that replaces every __ with a dot (.).
For your example to work, you should have written it like this:
SYMFONY__DATABASE_USER -> database_user
SYMFONY__DATABASE__USER -> database.user
You can try this bundle if your Symfony version is >= 2.6.2:
This bundle provides a way to read parameters from environment
variables at runtime. The value defined in the container parameter is
used as fallback when the environment variable is not available.