How to know which ZSH configuration framework is installed? - zsh

I'm trying to automate installing a ZSH plugin depending on which configuration framework the user is running (for example, prezto or oh-my-zsh), since the plugin's install location changes with the framework.
For example, for oh-my-zsh the plugins must be installed in ~/.oh-my-zsh/custom/plugins folder whereas in prezto they must be installed in the ~/.prezto/modules folder.
Is there a way I could determine the configuration framework or a workaround to install the plugin in both these cases? Thanks in advance.

Regardless of the framework in use, the location of the plugins is configurable by the user, so detecting a framework only lets you predict the default location of the plugin directory, not the directory actually in use.
Just use an environment variable like PLUGINDIR in your installer, and let the user be responsible for setting its value to ~/.oh-my-zsh/custom/plugins or ~/.prezto/modules or ~/.config/zsh/oh-my-zsh-plugs, etc., as appropriate.
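A minimal sketch of such an installer, assuming the plugin lives in a Git repository (the repository URL, plugin name and the fallback default are placeholders, not detected values):

#!/usr/bin/env zsh
# Install into whatever directory the user points PLUGINDIR at;
# the fallback below is only a guess at a common default, not a detected location.
PLUGINDIR="${PLUGINDIR:-$HOME/.oh-my-zsh/custom/plugins}"

mkdir -p "$PLUGINDIR"
git clone https://github.com/example/my-zsh-plugin "$PLUGINDIR/my-zsh-plugin"
echo "Installed to $PLUGINDIR/my-zsh-plugin; enable it according to your framework's docs."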

Related

Use third party composer packages in TYPO3 extensions

I have integrated a Service Worker for receiving Push Notifications in my TYPO3 Extension.
Now I want to send messages from the backend to the clients using the web-push-php library.
But how is it possible to integrate the library and its dependencies into TYPO3?
If you set up your project with Composer, you can just require minishlink/web-push and start using the class Minishlink\WebPush\WebPush.
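In a Composer-managed project that boils down to something like this (the version constraint is up to you):

# From the project root of a composer-mode TYPO3 installation
composer require minishlink/web-push
# The class Minishlink\WebPush\WebPush is then available via vendor/autoload.php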
In case you're running in "legacy" mode (i.e. a classic install without Composer), or want to support both, you'll need a different approach. IMO the best practice is bundling Composer requirements in .phar files; this way you can keep your IDE clean and your VCS footprint small. There's a blog post with a detailed description of phar bundling in TYPO3 extensions.
This method works for most composer requirements following PSR-0 or PSR-4 and should be viable in your case as minishlink/web-push seems to follow PSR-4.
You can take this further with scripts that you launch by running composer run <script> in your extension's root folder. The TYPO3 extension typo3_console ships a composer.json defining such scripts.
If you need to run your extension in a TYPO3 6.2 environment, you'll have to remove composer.json from the extension folder, as 6.2 fails to cope with "real" Composer requirements (i.e. non-TYPO3-extension packages).

How to use a license with JWrapper

So I have been working with the free version of JWrapper for some time now and have been quite pleased with the results; however, I have now purchased a license and would like to use it, but I am unable to find the method with which to activate my JWrapper. I do not use the graphical interface version of JWrapper; rather, I have created an XML installation file and pass it directly via the command line to JWrapper for building. Is there a special XML tag for specifying the license location? I was unable to find this information on JWrapper's homepage or support docs.
I received an answer from the support team: in order to use the license without using the JWrapperApp GUI to build your application, you simply have to have the jwlicense.txt file in the same directory as your jwrapper.jar.
After some tests, it appears that the jwlicense.txt file must be in the current working directory. So the best approach is to keep all the files (the JWrapper jar, jwrapper.xml and jwlicense.txt) in the same folder and run the compiler from that folder.
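For illustration, the layout and invocation described above look roughly like this (the jar file name is just an example, since JWrapper's jar carries a version number; check JWrapper's docs for the exact invocation):

build/
  jwrapper-00044250964.jar    # the JWrapper compiler jar (name/version is an example)
  jwrapper.xml                # your XML installation description
  jwlicense.txt               # the license file

cd build
java -jar jwrapper-00044250964.jar jwrapper.xml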

Why is SymfonyRequirements.php excluded from .gitignore?

If I understand it correctly, the SymfonyRequirements.php file (which lives under /app or /var depending on the Symfony version) is handled by Composer. I therefore suppose it should not be tracked by any version control system. However, I see it is excluded from the Symfony Standard Edition's .gitignore file:
/var/*
[...]
!var/SymfonyRequirements.php
Edit
Symfony core developer @stof says in a GitHub issue:
given that one of the checks is whether you installed vendors, it must be there before installing them (even though we have an automatic update of the requirements so that you check the up-to-date ones next time).
This is not very clear to me. Can anybody give any more details about this file and explain why it should or should not be tracked by a VCS?
This file is used by Symfony's requirements-check CLI script to verify the minimum requirements for configuring and running a Symfony app. Running it is a common post-deployment task.
It checks the current PHP version, the PHP configuration (php.ini settings) and the required PHP extensions. For example, it checks the current date.timezone setting.
What @stof is trying to say is that you should be able to run the checks even before installing dependencies with composer install. It even checks whether dependencies are installed at all, by testing for the existence of the vendor/composer directory.
It gives you a good indication of whether the Symfony app has what it needs to run under the current PHP configuration.
Note that if you add this file to your VCS, it may change after you later update dependencies with composer update, so remember to commit those changes too.
Note that these checks also produce some recommendations (not requirements). For example:
When using the logout handler from the Symfony Security Component, you should have at least PHP 5.4.11 due to PHP bug #63379 (as a workaround, you can also set invalidate_session to false in the security logout handler configuration)
Some other projects built on Symfony implement their own checks by extending this file; for example, see the Oro Platform's requirements check.
The file is used by the requirements-check CLI tool to verify the minimal requirements for running Symfony. You can find more info in the docs.
It is usually kept under version control, as you can see in the symfony-standard distribution project on GitHub:
https://github.com/symfony/symfony-standard
(of course, you can add the file to your own .gitignore if you prefer)
To be more precise, this file is used by the command php bin/symfony_requirements in Symfony 3 and by php app/check.php in older versions, which checks your PHP/Symfony requirements.
See this question Should the changes of SymfonyRequirements.php be included in version control? and the documentation.
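For reference, running the checks from the project root looks like this (the commands are the ones mentioned above):

# Symfony 3.x
php bin/symfony_requirements

# Symfony 2.x
php app/check.php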

Release Symfony2 project to the web

I have almost finished the development of a project developed with Symfony2, and wish to put the project online.
However, I suppose there are a lot of things that need to be done so that everything works correctly. I suppose the dev mode needs to be disabled, etc. What needs to be done, and how?
What are the most important things to do on a Symfony2 project that will be available to everyone on the web?
I suggest you use Capifony for deployment. It does a lot of stuff out of the box and you can make it run any custom commands you need; see its documentation for details.
Regarding the dev mode, unless you've removed the IP checks from app_dev.php, you don't have to worry about deploying it. Of course, if you wish, you can tell Capifony to delete it on deployment.
The best way to handle deployment is to create a "build" script (a rough sketch follows below), which will:
Remove all folders and files with tests from your bundles and vendors.
Remove the app_dev.php file.
Make sure that app/cache and app/logs are fully writable/readable.
Pack your project into an archive (an rpm, for example).
Then, before deployment, you should create a tag in your repository, marking that a certain version of your application has been released (I recommend following this Git branching model).
Create the tag.
Run your build script.
Upload the archive to the host.
Unpack it.
Enjoy your project.
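A minimal sketch of such a build script, assuming a standard Symfony2 layout (app/cache, app/logs, web/app_dev.php); the project and archive names are placeholders:

#!/usr/bin/env bash
# build.sh - package a tagged release (sketch; adjust paths to your project)
set -e

RELEASE="myproject-$(git describe --tags)"

# Export a clean copy of the tagged code (no VCS metadata)
git archive --format=tar --prefix="$RELEASE/" HEAD | tar -x

# Remove test folders and the dev front controller
find "$RELEASE" -type d -name Tests -prune -exec rm -rf {} + || true
rm -f "$RELEASE/web/app_dev.php"

# Make sure cache and logs exist and are writable
mkdir -p "$RELEASE/app/cache" "$RELEASE/app/logs"
chmod -R a+rw "$RELEASE/app/cache" "$RELEASE/app/logs"

# Pack everything into an archive
tar -czf "$RELEASE.tar.gz" "$RELEASE"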
I'm currently researching the same thing.
The first thing you have to consider is how "professionally" you want to deploy. There are a lot of tools you can use:
Continuous integration servers (e.g. Hudson, Jenkins)
Build tools (e.g. Phing, Capistrano with Capifony, shell scripts)
Versioning tools (e.g. Git, SVN)
I think the simplest setup is using only a build tool, and I guess you are already using some kind of versioning.
Depending on which tool you use, the setup is different, but I think there are some things you should consider for your application (maybe not all of them apply to yours); a rough shell sketch of these steps follows after the list:
Create a tag in your versioning system.
Copy the new code into a fresh folder on the production server.
(If you deploy into a new folder you don't need to clear the cache and logs, since these shouldn't be under version control in the first place.)
Load Composer (if you're using it).
Install the vendors.
Update the database schema.
Install the assets from your bundles.
Move the symlink from the current version to the folder of the new release.
These are the things I currently need for production deployment of my application; if you deploy to a test environment, you should load fixtures and run your test scripts as well.
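A rough shell version of those steps, assuming a Symfony2 app deployed into timestamped release folders behind a "current" symlink (the paths, repository URL and tag name are placeholders):

#!/usr/bin/env bash
# deploy.sh - sketch of the production steps listed above
set -e

RELEASES=/var/www/myapp/releases
NEW="$RELEASES/$(date +%Y%m%d%H%M%S)"

# Copy the new code into a fresh folder (cache/ and logs/ start out empty)
git clone --branch v1.2.3 https://example.com/myapp.git "$NEW"
cd "$NEW"

# Load Composer and install the vendors
curl -sS https://getcomposer.org/installer | php
php composer.phar install --no-dev --optimize-autoloader

# Update the database schema and install bundle assets
php app/console doctrine:schema:update --force --env=prod
php app/console assets:install web --env=prod

# Move the symlink from the current version to the new release
ln -sfn "$NEW" /var/www/myapp/current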
One other option that is very well described here is to deploy the Symfony2 application with Apache Ant. Apache Ant is a Java library and command-line tool whose mission is to drive processes described in build files as targets and extension points dependent upon each other.

Manually re-creating an install profile

I've been reading about drupal install profiles, and I'm wondering if there's much of a difference between using a packaged install profile vs. installing core + manually installing the modules listed in the install profiles?
I'd like to do the latter (manually installing each) to control the versions of each module installed, which I can't control with a packaged install profile that may not have been maintained.
But should I, or will I be opening the door to something I'm not aware of? Shouldn't the two be identical, just one automated and the other manual?
What kiamlaluno said, plus the fact that installation profiles may perform custom configuration of settings on install, might construct custom views/content types/etc. (especially by means of features.module, which you can see heavy use of in OpenAtrium), and might provide other custom code in a distro-specific module.
The short answer is: no, you can't just replicate an install profile by downloading a clean Drupal with all those modules; your best bet is to use the install profile. If you're worried about module versions, just make sure you're using a profile that's actively maintained.
The difference is that an installation profile includes the right version of all the modules it needs.
This means that, unlike manually installing each module, you don't need to verify which version of module X actually works together with module Y; there are a few cases where one module doesn't work well when version A of another module is installed, and you need to install version B of that module if you don't want problems.
An installation profile can have a custom installation page that allows you to change some parameters of your site; it also allows the installation profile author to define a patch that needs to be applied to a module, in order to fix a bug in the module or to make it work better with another module.
If you need to set up a site for a particular purpose, installation profiles are useful because they allow you to set up the site correctly without knowing all the details of how a Drupal site needs to be configured.
I believe you can specify the versions of the modules you want to install see
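If you do install modules manually, a hedged sketch of pinning specific releases with Drush (the module names and versions below are only examples):

# Download specific module releases instead of the latest ones
drush dl views-7.x-3.10 ctools-7.x-1.4
# Enable them
drush en -y views ctools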