I would like to set the configuration of my symfony2 project using environment variables.
In the server I have defined:
SYMFONY__DATABASE__USER
SYMFONY__DATABASE__PASSWORD
SYMFONY__DATABASE__NAME
SYMFONY__DATABASE__HOST
SYMFONY__DATABASE__DRIVER
My parameters.yml.dist looks like this:
# app/config/parameters.yml.dist
parameters:
    database_host: "%database.host%"
    database_port: ~
    database_name: "%database.name%"
    database_user: "%database.user%"
    database_password: "%database.password%"
    database_driver: "%database.driver%"
When I run Composer, I get an exception:
composer install --dev --no-interaction --prefer-source
[Symfony\Component\DependencyInjection\Exception\ParameterNotFoundException]
You have requested a non-existent parameter "database.driver". Did you mean one of these: "database_user", "database_driver"?
These variables are defined on the server, so I could modify parameters.yml.dist to define these values. But this does not seem like the right way, because what I really want to use are the environment variables.
Note: I want to read these environment variables in Travis, Heroku, and my Vagrant machine. I only want the Vagrant machine variables in the repository.
What is the proper way to do this?
How should my parameters.yml.dist look?
It looks like you are doing everything okay.
Here is the complete documentation for Setting Environment Variables, which I believe you have already read.
What is important to note is this:
Also, in order for your console to work (which does not use Apache),
you must export these as shell variables. On a Unix system, you can
run the following:
$ export SYMFONY__DATABASE__USER=user
$ export SYMFONY__DATABASE__PASSWORD=secret
I remember I once had a similar issue: I had set everything in Apache, but running console commands didn't work because I had forgotten to export the variables on the system.
Be aware that using export is a temporary solution; if you restart your server those values will be lost, and you will need to set them up permanently according to your OS.
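For Apache, for example, a permanent setup could use SetEnv in the VirtualHost configuration (a sketch; the server name and values are placeholders):
<VirtualHost *:80>
    ServerName mysite.example
    # make the variables visible to PHP requests served by Apache
    SetEnv SYMFONY__DATABASE__USER user
    SetEnv SYMFONY__DATABASE__PASSWORD secret
</VirtualHost>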
I think you solved this a long time ago, but the problem is actually that you have 2 underscores between DATABASE and USER, and the parser for this uses a string replace that turns every __ into a dot (.).
For your example to work, you should have written it like this:
SYMFONY__DATABASE_USER -> database_user
SYMFONY__DATABASE__USER -> database.user
You can try this bundle if your Symfony version is >= 2.6.2:
This bundle provides a way to read parameters from environment
variables at runtime. The value defined in the container parameter is
used as fallback when the environment variable is not available.
I'm using dagster 0.11.3 (the latest as of this writing).
I've created a Dagster pipeline (saved as pipeline.py) that looks like this:
@solid
def return_a(context):
    return 12.34

@pipeline(
    mode_defs=[
        ModeDefinition(
            executor_defs=[dask_executor]  # Note: dask only!
        )
    ]
)
def the_pipeline():
    return_a()
I have the DAGSTER_HOME environment variable set to a directory that contains a file named dagster.yaml, which is an empty file. This should be ok because the defaults are reasonable based on these docs: https://docs.dagster.io/deployment/dagster-instance.
I have an existing Dask cluster running at "scheduler:8786". Based on these docs: https://docs.dagster.io/deployment/custom-infra/dask, I created a run config named config.yaml that looks like this:
execution:
  dask:
    config:
      cluster:
        existing:
          address: "scheduler:8786"
I have SUCCESSFULLY used this run config with Dagster like so:
$ dagster pipeline execute -f pipeline.py -c config.yaml
(I checked the Dask logs and made sure that it did indeed run on my Dask cluster)
My question is: How can I get Dagit to use this Dask cluster?
The only thing I have found that seems related is this:
https://docs.dagster.io/_apidocs/execution#executors
...but it doesn't even mention Dask as an option (it has dagster.in_process_executor and dagster.multiprocess_executor, which don't seem at all related to dask).
Probably I need to configure dagster-dask, which is documented here: https://docs.dagster.io/_apidocs/libraries/dagster-dask#dask-dagster-dask
...but where do I put that run config when using Dagit? There's no way to feed config.yaml to Dagit, for example.
Some options:
you can manually plug in the values that are in config.yaml in to the dagit playground
you can bind the config directly to the executor if you do not need to change it ever https://docs.dagster.io/concepts/configuration/configured#configured-api
you can create a preset from that config yaml https://docs.dagster.io/tutorial/advanced-tutorial/pipelines#pipeline-config-presets
Given the context, I would recommend the configured API; a minimal sketch follows below.
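A sketch of that approach, assuming dagster 0.11.x and the same Dask address as in the question (the executor name dask_on_cluster is just an illustrative label; check the exact configured signature against your version):
from dagster import ModeDefinition, pipeline, solid
from dagster_dask import dask_executor

# Bind the run config from config.yaml directly to the executor,
# so Dagit no longer needs it supplied at launch time.
dask_on_cluster = dask_executor.configured(
    {"cluster": {"existing": {"address": "scheduler:8786"}}},
    name="dask_on_cluster",
)

@solid
def return_a(context):
    return 12.34

@pipeline(mode_defs=[ModeDefinition(executor_defs=[dask_on_cluster])])
def the_pipeline():
    return_a()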
I'm using hydra to log hyperparameters of experiments.
@hydra.main(config_name="config", config_path="../conf")
def evaluate_experiment(cfg: DictConfig) -> None:
    print(OmegaConf.to_yaml(cfg))
    ...
Sometimes I want to do a dry run to check something. For this I don't need any saved parameters, so I'm wondering how I can disable saving to the filesystem completely in this case?
The answer from Omry Yadan works well if you want to solve this using the CLI. However, you can also add these flags to your config file such that you don't have to type them every time you run your script. If you want to go this route, make sure you add the following items in your root config file:
defaults:
  - _self_
  - override hydra/hydra_logging: disabled
  - override hydra/job_logging: disabled

hydra:
  output_subdir: null
  run:
    dir: .
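With this in place, a run should stay in the current directory and create neither a .hydra directory nor any log files. A quick way to verify (a sketch; the script name evaluate_experiment.py is hypothetical):
$ python evaluate_experiment.py
$ ls .hydra   # should fail with "No such file or directory"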
There is an enhancement request aimed at Hydra 1.1 to support disabling working directory management.
Working directory management does several things:
Creating a working directory for the run
Changing the working directory to the created dir.
There are other related features:
Saving log files
Saving files like config.yaml and hydra.yaml into .hydra in the working directory.
Different features have different ways to disable them:
To prevent the creation of a working directory, you can override hydra.run.dir to . (the current directory).
To prevent saving the files into .hydra, override hydra.output_subdir to null.
To prevent the creation of logging files, you can disable logging output of hydra/hydra_logging and hydra/job_logging, see this.
A complete example might look like:
$ python foo.py hydra.run.dir=. hydra.output_subdir=null hydra/job_logging=disabled hydra/hydra_logging=disabled
Note that as always you can also override those config values through your config file.
My .env file has the following entries, but FOO is not listed when I run bin/console debug:container --env-vars. Note however that $_ENV['FOO'] exists when I dump the variable.
FOO=1
AUTH0_CLIENT_ID=clientid
AUTH0_CLIENT_SECRET=secret
AUTH0_DOMAIN=myapp.us.auth0.com
What determines if an env var defined in .env will be available in the container?
Not really sure if this is worthy of an answer, but I suppose it might help.
The Symfony .env files are really just one possible source of $_ENV variables. There are lots of other env variables floating around, and of course in production you might not use .env at all.
So rather than saving access to all env variables, the Symfony configuration system only saves those that are actually used. So in this case:
# config/services.yaml
parameters:
    foo: '%env(resolve:FOO)%'
Will result in:
bin/console debug:container --env-vars
APP_SECRET n/a "84dc6de50e6f2f7af3db3f78f886840f"
DATABASE_URL n/a "mysql://db_user:db_password@127.0.0.1:3306/db_name?serverVersion=5.7"
FOO n/a "1"
MAILER_DSN n/a n/a
VAR_DUMPER_SERVER "127.0.0.1:9912" n/a
For a fresh 5.1 project.
Off-topic, but vaguely interesting to me at least: the above command also generates a warning:
[WARNING] The following variables are missing:
* MAILER_DSN
MAILER_DSN is commented out in the default .env file. So I guess it is possible to use env values during configuration even if none are defined at compile time. A good way to check for spelling errors, I guess.
Is there a way to encrypt the sql_alchemy_conn string in the Airflow config file? The password shown in the example is plaintext. What options are there to secure it? Also, if the password has special characters, how must it be escaped in the config file?
Trying to install Airflow using the airflow role.
# See: https://www.sqlalchemy.org/
sql_alchemy_conn:
  value: "postgresql+psycopg2://pgclusteradm@servername:PLAINTEXTPASSWORD@server.postgres.database.azure.com/airflow2"
I couldn't find a way to encrypt this password.
You can provide the database URI through environment variables instead of the config file. This doesn't encrypt it or necessarily make it more secure, but it at least isn't plainly sitting in a permanent file.
In your airflow.cfg you can put a placeholder:
[core]
...
sql_alchemy_conn = override_me
...
Then set AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://... in an environment variable when you bring up Airflow components. This way of setting and overriding configuration options through environment variables is detailed in the docs, but the basic format is AIRFLOW__{SECTION}__{KEY}=<value>.
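For example, for a single machine it could look like this (a sketch; the URI values are placeholders):
# set before starting each Airflow component
$ export AIRFLOW__CORE__SQL_ALCHEMY_CONN="postgresql+psycopg2://user:password@localhost:5432/airflow"
$ airflow webserver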
There are 2 ways of securing this, as mentioned in the docs:
1) Environment Variable:
You can override the setting in airflow.cfg by setting the following environment variable:
AIRFLOW__CORE__SQL_ALCHEMY_CONN=my_conn_string
This way you can keep the setting in airflow.cfg empty so no one can view the password.
2) Get string by running command:
You can also derive the connection string at run time by appending _cmd to the key like this:
[core]
sql_alchemy_conn_cmd = bash_command_to_run
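For example, the command could read the URI from a file with restricted permissions (a sketch; the path /run/secrets/sql_alchemy_conn is hypothetical):
[core]
# run a command at startup instead of storing the URI in the config file
sql_alchemy_conn_cmd = cat /run/secrets/sql_alchemy_conn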
I'm using Symfony 2 and have this row in my parameters.ini:
database_driver = pdo_pgsql
When I was creating the database structure with Doctrine everything was fine. But when I want to add a Doctrine object to my database (insert a row), I catch an exception:
What do I have to do about this?
Are you sure you're using pdo_pgsql? Are you running on localhost? It may well be that you are using the pdo_mysql driver instead.
However, you have to check the following:
php.ini
extension=pdo.so
extension=pdo_mysql.so
or in your case
extension=pdo.so
extension=pdo_pgsql.so
You can check phpinfo() to find out the configured database driver.
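You can also check from a quick PHP script; PDO::getAvailableDrivers() returns the names of the PDO drivers that are currently loaded (a minimal sketch):
<?php
// prints the loaded PDO drivers, e.g. Array ( [0] => mysql [1] => pgsql )
print_r(PDO::getAvailableDrivers());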
In your symfony project you have to check the parameters.ini file in config folder. E.g.
[parameters]
database_driver="pdo_mysql"
database_host="localhost"
Also, to avoid this error:
'stty' is not recognized as an internal or external command,
operable program or batch file.
https://github.com/symfony/symfony/issues/4974
First of all, verify your php.ini file: the extensions php_pdo_pgsql and php_pdo must be enabled. Make sure you apply these changes to the php.ini file that your Symfony project is using; check this at localhost/path_to_your_project/web/config.php. You can find out whether these extensions are enabled by executing the function phpinfo().
This command is also helpful: php -m. It lists on the console all the PHP modules that are loaded.
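To narrow the output down to PDO on a Unix shell (a sketch; the output shown assumes pdo_pgsql is loaded):
$ php -m | grep -i pdo
PDO
pdo_pgsql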
Tip: check your Apache error log; there could be something wrong with the loading of your extensions. This file is located according to your server configuration.