I'm trying to use Ansible Galaxy collections and can't find a way to install the dependencies of roles inside a collection, which I have specified in a role's meta/main.yml dependencies list.
As I understand it, a collection's galaxy.yml can only list other collections as dependencies, not roles.
What is the correct way to install a role's dependencies along with the installation of a collection?
Of course, it is possible to list all the dependencies in the collection's readme file and add them to a requirements.yml, but that is not very convenient.
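For example, the requirements.yml would have to look something like this (the collection and role names are only placeholders), and be installed with ansible-galaxy collection install -r requirements.yml plus ansible-galaxy role install -r requirements.yml:

# requirements.yml -- placeholder names
collections:
  - name: my_namespace.my_collection
    version: ">=1.0.0"
roles:
  # external roles that roles inside the collection depend on
  - name: author.external_role
  - src: https://github.com/example/some-role.git
    name: example.some_role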
I am not sure this can be done.
I ran into the same issue today. I am converting a group of roles into a collection, and some of those roles call on public roles, which is easy to declare for a standalone role. But it seems a collection can only pull in other collections, the entries in meta/main.yml only seem to resolve against roles already inside the collection, and I found no way to add an external role to a collection.
I ran ansible-galaxy collection install .... -vvvvv and it appears to call Galaxy's v2 API. Looking a little into the API server responses, the v2 API only seems to know about collections; the v1 API knows about roles too, but that doesn't help here.
So yes, I'm interested in pulling external roles into my collection, and it does not seem to be possible, for now at least.
I haven't found a solution yet. For now, you have to list all the dependencies in the playbook's requirements file.
In one Symfony bundle I define a compiler pass to preprocess some configuration. Part of that config is based on Doctrine entities, so I need to get the full metadata information for all application entities.
The compiler pass is executed very late (PassConfig::TYPE_BEFORE_REMOVING). I'm using $container->get('doctrine') like this to get the entity metadata:
$em = $container->get('doctrine')->getManagerForClass($entityClass);
$entityMetadata = $em->getMetadataFactory()->getMetadataFor($entityClass);
However, this is causing random failures for some users because of the use of the doctrine service during the Symfony container compilation.
I'd recommend changing how you address your entities. Mainly: define interfaces for your models and have the entities implement them.
Using resolve_target_entities, Doctrine will resolve those interfaces to the concrete classes.
An example code is here: https://github.com/Sylius/SyliusResourceBundle/blob/master/DependencyInjection/Compiler/DoctrineTargetEntitiesResolverPass.php
Just make sure your bundle is registered before DoctrineBundle is registered.
Then, throughout your app, instead of the AppBundle:Entity addressing, use the FQCN of the interface that was bound to the entity earlier.
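As a rough sketch of what such a compiler pass does (the Acme interface and entity names here are invented), it registers the interface-to-entity mapping on DoctrineBundle's resolve-target-entity listener:

<?php
// Sketch only: the Acme class names are placeholders.
namespace Acme\AppBundle\DependencyInjection\Compiler;

use Symfony\Component\DependencyInjection\Compiler\CompilerPassInterface;
use Symfony\Component\DependencyInjection\ContainerBuilder;

class ResolveTargetEntitiesPass implements CompilerPassInterface
{
    public function process(ContainerBuilder $container)
    {
        if (!$container->hasDefinition('doctrine.orm.listeners.resolve_target_entity')) {
            return;
        }

        // Tell Doctrine to resolve the interface to the concrete entity class.
        $container->getDefinition('doctrine.orm.listeners.resolve_target_entity')
            ->addMethodCall('addResolveTargetEntity', array(
                'Acme\AppBundle\Model\CountryInterface', // interface used in your code
                'Acme\AppBundle\Entity\Country',         // concrete entity it resolves to
                array(),
            ));
    }
}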
I've experimented a bit with compiler passes and services, and relying on cross-bundle services during container compilation is a very bad idea. Why? It's not reliable: sometimes it works as you want, sometimes it fails the way you described.
Thank you all for your comments and ideas. I post an answer to explain how I solved this problem.
Trying to use the Doctrine service in a compiler pass was creating more and more problems for our users (and it was creating other minor issues with other services such as Twig). So this was definitely a bad solution for our needs.
So in the end I decided to change everything. We no longer use a compiler pass to process the configuration; instead we use a regular PHP class called at runtime.
In the dev environment, the configuration is processed for each request. It's a bit slower than before, but we prevent any caching issue. In the prod environment we use Doctrine Cache to process the configuration once. Besides, we've created a cache warmer to build the cached configuration before the first request hits the application.
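For illustration, a minimal sketch of such a cache warmer (the class name and the configuration-processor service are made up; it is registered with the kernel.cache_warmer tag):

<?php
// Sketch: processes the bundle configuration once and caches it before the first request.
namespace Acme\MyBundle\CacheWarmer;

use Symfony\Component\HttpKernel\CacheWarmer\CacheWarmerInterface;

class ConfigurationCacheWarmer implements CacheWarmerInterface
{
    private $processor; // hypothetical service that builds the configuration and stores it in Doctrine Cache

    public function __construct($processor)
    {
        $this->processor = $processor;
    }

    public function warmUp($cacheDir)
    {
        // Build the configuration now so the first request finds it in the cache.
        $this->processor->process();
    }

    public function isOptional()
    {
        return true; // the configuration can still be built lazily at runtime
    }
}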
I have a bundle with an entity defined in it. I want to be able to configure the bundle so that this entity is or isn't relevant. Depending on the configuration, the entity's table should or shouldn't be created by app/console doctrine:schema:update etc.
How to conditionally "disable" entity so its table won't be created by app/console doctrine:schema:update?
Your scenario requires you to disable the auto_mapping, but it seems to be set to false by default. http://symfony.com/doc/current/reference/configuration/doctrine.html
The next thing to do is make sure the build function of your bundle conditionally adds the wanted DoctrineOrmMappingsPass, as is also explained here: https://stackoverflow.com/a/26975083/1794894
As you can see in the source, build is only executed when the cache is empty, so this is the place where you can do this. You can also take a look there at how to add compiler passes.
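A rough sketch of such a build() method (the bundle name, mapping path, and the acme_my_bundle.entity_enabled parameter are assumptions; the mapping is only registered when that container parameter has been set by your bundle extension):

<?php
// Sketch: Acme\MyBundle\AcmeMyBundle.php -- names and paths are placeholders.
namespace Acme\MyBundle;

use Doctrine\Bundle\DoctrineBundle\DependencyInjection\Compiler\DoctrineOrmMappingsPass;
use Symfony\Component\DependencyInjection\ContainerBuilder;
use Symfony\Component\HttpKernel\Bundle\Bundle;

class AcmeMyBundle extends Bundle
{
    public function build(ContainerBuilder $container)
    {
        parent::build($container);

        // XML mapping directory => entity namespace
        $mappings = array(
            realpath(__DIR__ . '/Resources/config/doctrine') => 'Acme\MyBundle\Entity',
        );

        // The mapping pass only runs if the container parameter
        // 'acme_my_bundle.entity_enabled' exists (set it from your extension
        // when the feature is enabled in the bundle configuration).
        $container->addCompilerPass(
            DoctrineOrmMappingsPass::createXmlMappingDriver(
                $mappings,
                array(),
                'acme_my_bundle.entity_enabled'
            )
        );
    }
}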
I think that, although you could probably find a way, you are overcomplicating things. If the back-end bundle is independent, it can always be optional to install, and consequently its entities will or won't be created.
You can find an example in the Sonata bundles: you can manage users however you want, but if you are using FOSUserBundle you have the option to install SonataUserBundle. You then tell the fos_user configuration that the user class belongs to Sonata User, and as a consequence the new entity is persisted with a lot of new attributes thanks to class inheritance, and all the CRUD operations for users are already configured in the Sonata views. SonataUser also has its own user entity for use in a standalone way.
I know this is not exactly what you are asking for, but maybe you just need to follow a model like this.
I am using the sonata-project/doctrine-phpcr-admin bundle in my Symfony CMF app, and I need to call an external library in the postPersist action, which requires the PHPCR document manager.
So my question is: is there a way to retrieve the PHPCR-ODM document manager (type Doctrine\ODM\PHPCR\DocumentManager) within a Sonata admin class (type Sonata\AdminBundle\Admin\Admin)?
Any info will be greatly appreciated.
Your admins are services and they have a constructor, so you are free to add your own dependencies to the constructor and inject them.
In the case of the document manager, however, you should use what is already provided; this is the cleanest way, because you are sure to get the correct manager if you have configured more than one. Admin::getModelManager() gives you a Sonata\DoctrinePHPCRAdminBundle\Model\ModelManager, and on that you can call getDocumentManager to get the document manager.
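For example, a sketch inside your admin class (the external library call is just a placeholder for whatever you need to invoke):

// Inside your admin class extending Sonata\AdminBundle\Admin\Admin.
public function postPersist($object)
{
    /** @var \Doctrine\ODM\PHPCR\DocumentManager $dm */
    $dm = $this->getModelManager()->getDocumentManager();

    // Placeholder for the external library that needs the document manager.
    \Acme\External\Library::doSomething($dm, $object);
}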
I think I have a good understanding of Symfony and how bundles work.
However, I've never found how to solve a simple problem: make a reusable bundle that provides data, such as tables/Doctrine entities pre-filled with (for example) all country names in the world, all the provinces of Italy, tax rate history in England, and so on.
Of course the purpose is to provide forms, services and controllers relying on this data source, without the need to copy and paste tables and entities across projects.
How would you do that?
Data fixtures IMHO are not an option, for an obvious reason: loading them purges the database of a running application.
A custom command reading from a static data-source (json, YAML) and performing inserts/updates?
The first step is declaring a Doctrine entity in your bundle. I think you should then create DataFixtures to populate the data into the DB.
You should maybe consider using Seeds instead of Fixtures.
Fixtures are fake data, used to test your application.
Seeds are the minimal data required for your application to work.
Technically, they are exactly the same thing: you declare them under the "DataFixtures/" folder and import them with the "doctrine:fixtures:load" command.
You can create a folder "Fixtures/", and a folder "Seeds/" under the folder "DataFixtures", then load your seeds with the command
php app/console doctrine:fixtures:load --fixtures=/path/to/seeds/folder --append
It was suggested in the comments that it may be safer, especially in a production environment, to create a custom Symfony2 command to force the "--append" mode. Without this mode, your database will be purged and you could lose your production data.
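A minimal seed fixture placed under that Seeds/ folder might look like this (the Country entity and its setters are placeholders):

<?php
// Sketch: src/Acme/CountryBundle/DataFixtures/Seeds/LoadCountrySeeds.php -- names are placeholders.
namespace Acme\CountryBundle\DataFixtures\Seeds;

use Doctrine\Common\DataFixtures\FixtureInterface;
use Doctrine\Common\Persistence\ObjectManager;
use Acme\CountryBundle\Entity\Country;

class LoadCountrySeeds implements FixtureInterface
{
    public function load(ObjectManager $manager)
    {
        // Minimal data the application needs to work.
        foreach (array('FR' => 'France', 'IT' => 'Italy') as $code => $name) {
            $country = new Country();
            $country->setCode($code);
            $country->setName($name);
            $manager->persist($country);
        }

        $manager->flush();
    }
}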
This answer assumes you're using composer to install your bundles (and you really exclude fixtures as an option).
What you can do is make an SQL export of the data you want, make sure it uses INSERT IGNORE INTO, and set the correct unique constraints.
Then you save that file somewhere in your bundle, in a "data" or "fixtures" folder.
so your path to that file will be like:
"vendor/company/epicbundle/data/countries.sql"
What you then can do is add post-install and post-update commands in your composer.json, which look like this:
"post-install-cmd": [
"php app/console doctrine:query:sql \"$(cat vendor/company/epicbundle/data/countries.sql)\""
]
If you only want it to run on install, you only add it there, if you sometimes update the sql file, you also add it to the post-update-cmd.
Please note that this solution only works if people don't tamper with the table names; otherwise the queries will fail.
If you want a safer/more stable solution, you can write your own post-install script in Symfony that uses the entity manager; there you can use, for example, a CSV file and insert/update it row by row.
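A rough sketch of such a post-install script, written as a Symfony2 console command (the CSV path, the Country entity, and its fields are assumptions):

<?php
// Sketch: reads a CSV shipped with the bundle and upserts countries row by row.
// The command name, file path, entity and its methods are placeholders.
namespace Company\EpicBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;
use Company\EpicBundle\Entity\Country;

class LoadCountriesCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('epic:load-countries');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $em = $this->getContainer()->get('doctrine')->getManager();
        $repo = $em->getRepository('CompanyEpicBundle:Country');

        $handle = fopen(__DIR__ . '/../Resources/data/countries.csv', 'r');
        while (($row = fgetcsv($handle)) !== false) {
            list($code, $name) = $row;
            // Update the existing row if it exists, otherwise insert a new one.
            $country = $repo->findOneBy(array('code' => $code)) ?: new Country();
            $country->setCode($code);
            $country->setName($name);
            $em->persist($country);
        }
        fclose($handle);

        $em->flush();
    }
}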
Basically, anything you could implement would surely rely on persistence mechanisms used in your ORM/ODM/whatever. So, you'll end up implementing a typical fixture loading mechanism, at least partially: you'd execute code that saves some provided data; if it's serialized you'd do XML/JSON/YAML parsing (but this is just a technicality) and persist the results into the database.
Thus, it's not bad to stick with Doctrine Fixtures. They are programmable and extensible (you can even fetch your data from the web upon loading).
As stated in #paul-andrieux's answer, if you are worried about data loss (e.g. your bundle's seeds are loaded when the end user's DB is already populated), you should use doctrine:fixtures:load --append and let the constraints do their job (e.g. a country names table would have a unique constraint on the country name, or even a slug), so that inserting a duplicate row silently fails for that single entity when your bundle has updated the country list and the end user had a previous version.
If you really worry about your end users' data, you could write a wrapper for the doctrine:fixtures:load command that always has the --append flag on, and register it as a separate command. (You could run needed migrations there, too.)
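A sketch of such a wrapper command (the command name and the fixtures path are made up):

<?php
// Sketch: wraps doctrine:fixtures:load and forces --append so user data is never purged.
namespace Acme\CountryBundle\Command;

use Symfony\Bundle\FrameworkBundle\Command\ContainerAwareCommand;
use Symfony\Component\Console\Input\ArrayInput;
use Symfony\Component\Console\Input\InputInterface;
use Symfony\Component\Console\Output\OutputInterface;

class LoadSeedsCommand extends ContainerAwareCommand
{
    protected function configure()
    {
        $this->setName('acme:seeds:load');
    }

    protected function execute(InputInterface $input, OutputInterface $output)
    {
        $command = $this->getApplication()->find('doctrine:fixtures:load');

        // Always append: never let the wrapped command purge the database.
        return $command->run(new ArrayInput(array(
            'command'    => 'doctrine:fixtures:load',
            '--append'   => true,
            '--fixtures' => __DIR__ . '/../DataFixtures/Seeds',
        )), $output);
    }
}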
#lxg's hard-coded IDs problem is solvable, too. Try using natural keys where applicable (e.g. the countries table would have a slug primary key, which would be great-britain for Great Britain). This way your lookups are pretty easy: $em->find('\MyBundle\Country', 'great-britain');. If you cannot come up with a natural key, then maybe the entity is not really needed by the end user.
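For instance, a sketch of such an entity with a natural key (annotation mapping, class and field names invented):

<?php
// Sketch: a Country entity whose natural key (slug) is the primary key.
namespace Acme\CountryBundle\Entity;

use Doctrine\ORM\Mapping as ORM;

/**
 * @ORM\Entity
 */
class Country
{
    /**
     * @ORM\Id
     * @ORM\Column(type="string")
     */
    private $slug; // e.g. "great-britain"

    /**
     * @ORM\Column(type="string")
     */
    private $name;

    public function __construct($slug, $name)
    {
        $this->slug = $slug;
        $this->name = $name;
    }
}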
UPD. Here's an article that could be useful: http://www.craftitonline.com/2014/09/doctrine-migrations-with-schema-api-without-symfony-symfony-cmf-seobundle-sylius-example/
Generally speaking, the bundle embeds the entities, which are loaded via the ORM/ODM using its built-in commands (like doctrine:schema:update, doctrine:migration:diff, ...), and provides a custom command that loads the required fixtures using the ODM/ORM.
This command can read the fixtures in multiple ways (parsing YAML, XML, raw SQL, DQL, ...); it is just a matter of taste. Tons of bundles, parsers, etc. exist for those tasks.
In your documentation, you just have to state clearly that the developer must run this command after installing your bundle and updating the schema.
How can I find out what the default tables in Drupal are used for?
Is there any documentation available?
For example, there is a table called node. I need to know what it is used for and how it behaves.
Any suggestions or answers would be greatly appreciated.
Your question is not very clear (the term "usage" is quite ambiguous), but you could install the Devel module. After setting it up it will show, for every page loaded (home page included), which SQL queries are run.
Every module can add tables to the database. A default Drupal install uses core modules, either required ones or those installed as dependencies of the default installation profile. These modules install their own tables.
Each module declares its tables in its implementation of hook_schema. The Schema module uses the information from the implementations of this hook to provide schema documentation.
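For illustration, a minimal hook_schema() implementation in a module's .install file (Drupal 7 style; the module and table names are made up):

<?php
// Sketch: mymodule.install -- declares one table via hook_schema().
function mymodule_schema() {
  $schema['mymodule_item'] = array(
    'description' => 'Stores items managed by mymodule.',
    'fields' => array(
      'id' => array(
        'type' => 'serial',
        'not null' => TRUE,
        'description' => 'Primary key.',
      ),
      'title' => array(
        'type' => 'varchar',
        'length' => 255,
        'not null' => TRUE,
        'default' => '',
        'description' => 'Item title.',
      ),
    ),
    'primary key' => array('id'),
  );

  return $schema;
}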
Most of the time you shouldn't access the database directly but use the API provided by the modules managing the data. Tables are usually considered private to their modules, and a new release of a module may change its schema in an incompatible way, so using the API is much safer. Unfortunately, sometimes database access is the only option. In those cases, implementing a data access layer between your code and the database is advised.