Does Berkshelf support multiple cookbooks? How?

I have two cookbooks that I created in my Chef repo and that I want to manage using Berkshelf.
One cookbook depends on the other. Neither is in Chef Supermarket, and I don't want to publish them there just for dependency resolution.
On berks install, Berkshelf complains that the other cookbook is missing. That cookbook exists only as an upload to my Chef server; it is not in Chef Supermarket and not in ~/.berkshelf/cookbooks/ either.
Is this possible? Or does Berkshelf require all recipes to be in one cookbook? Is it not possible to have dependencies between two cookbooks that are not in Chef Supermarket?
Berkshelf does not seem to handle this use case of multiple interdependent cookbooks. There also seem to be no commands to get cookbooks into ~/.berkshelf/cookbooks.

Berkshelf does support multiple cookbooks.
I have experience managing more than 30 cookbooks in the same Berksfile.
Berkshelf can pull cookbooks from local paths, git URLs, GitHub sources and community cookbooks from Chef Supermarket.
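As a rough sketch of those source types in a Berksfile (all names and URLs below are placeholders, not real cookbooks):
cookbook 'my_local_cookbook',  path: 'cookbooks/my_local_cookbook'              # local path inside the repo
cookbook 'my_git_cookbook',    git: 'https://example.com/my_git_cookbook.git'   # plain git URL
cookbook 'my_github_cookbook', github: 'my_org/my_github_cookbook'              # GitHub shorthand
cookbook 'community_cookbook_name'                                              # resolved from the Berksfile's source (Chef Supermarket)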
Update
Let's take the scenario mentioned in the question. Carry out the steps below:
Create a project repository containing your cookbooks and other artifacts.
The project repo is just a directory with subdirectories for roles, data bags, cookbooks and environments (see the example layout below).
Place your cookbooks inside the cookbooks directory.
Create a Berksfile in the root of the repo.
The Berksfile is consumed by Berkshelf. Install the latest Berkshelf version with gem install berkshelf (I would prefer using bundle install).
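A possible layout of such a project repo; the directory names other than cookbooks are conventions rather than requirements:
chef-repo/
  Berksfile
  cookbooks/
    my_cookbook_name_1/
    my_cookbook_name_2/
  data_bags/
  environments/
  roles/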
Example content of Berksfile:

source "https://supermarket.chef.io"

# Mention community cookbooks used, if any
cookbook 'community_cookbook_name'

# Mention cookbook names from your repo
%w(my_cookbook_name_1 my_cookbook_name_2).each do |cookbook_name|
  cookbook cookbook_name, path: 'cookbooks/' + cookbook_name
end
Community cookbook dependencies will be resolved via the source URL in the first line, while the path: option tells Berkshelf to take your own cookbooks from the local directories instead of looking for them in Chef Supermarket.
Now the basic project repo setup is done, and you are ready to run berks install.
List the resolved cookbooks with berks list.
Please try the above steps and let me know if that solves your problem.
If you also want to upload the content to your Chef server, follow the steps below:
Create a directory .chef inside the project repository.
Copy your Chef server knife configuration and .pem files into .chef (a minimal knife.rb sketch follows these steps).
Run the command below from inside the project repo:
berks upload
This uploads your cookbooks to the Chef server.
You can use knife upload data_bags and knife upload roles to upload your data bags and roles from the repo.
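For reference, a minimal .chef/knife.rb could look like the following; the node name, key file and server URL are placeholders that you have to replace with your own values:
# .chef/knife.rb (sketch)
current_dir = File.dirname(__FILE__)
node_name       "my_admin_user"                                   # placeholder client name
client_key      "#{current_dir}/my_admin_user.pem"                # the .pem you copied into .chef
chef_server_url "https://chef.example.com/organizations/my_org"   # placeholder server/org URL
cookbook_path   ["#{current_dir}/../cookbooks"]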

Related

Deploy a Debian package for multiple distros into JFrog Artifactory?

What is the right procedure for deploying a Debian package built for different distros into the same JFrog Artifactory Debian repo?
Just uploading to the same path with different deb.distribution properties does not work: the packages all get uploaded to the same place and clobber the previous upload.
Including the distribution name in the package name is ugly, but would of course work. Is there a better way?
You simply deploy the different .deb files to different locations within the Artifactory repository. The trick is that the repository layout has nothing to do with the APT API, which serves packages regardless of their location, purely according to the metadata properties attached to them (deb.distribution, deb.component, deb.architecture, etc.).
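As a sketch of that approach (repository key, paths, credentials, package and distribution names are all placeholders), you can deploy each build to a per-distribution path and attach the Debian coordinates as matrix parameters:
# deploy the build for bookworm
curl -u "$ART_USER:$ART_PASS" -T mypkg_1.0_amd64.deb \
  "https://artifactory.example.com/artifactory/deb-local/pool/bookworm/mypkg_1.0_amd64.deb;deb.distribution=bookworm;deb.component=main;deb.architecture=amd64"
# deploy the build for jammy to a different path, with different coordinates
curl -u "$ART_USER:$ART_PASS" -T mypkg_1.0_amd64.deb \
  "https://artifactory.example.com/artifactory/deb-local/pool/jammy/mypkg_1.0_amd64.deb;deb.distribution=jammy;deb.component=main;deb.architecture=amd64"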

Ignore dev dependencies in PHP Composer

I have developed a Composer-based Laravel project that I need to install on a remote production server. The problem is that I have limited permissions/access, so my option is to "archive" the package (using composer archive) and unpack it on production.
Which folders do I need to archive, and how can I ignore the dev dependencies of the package as well as the dev dependencies of its vendors?
composer archive is unlikely to help you, because this command creates an archive of a defined version of a single package.
You probably want to upload the whole working application, not only one package. You should create a little script that builds the archive file for you; it should do the following (a sketch of such a script follows this list):
check out the application from the repository into a new directory
run composer install --no-dev to install all required dependencies without dev dependencies
optionally delete files that are not necessary on the server, like documentation, the .git folder, and other stuff
create the archive file from all these files
optionally upload that archive to the target server and unpack it there
optionally check basic functions and switch the server over to the newly uploaded version
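A minimal sketch of such a script; the repository URL, target server and paths are placeholders:
#!/bin/sh
set -e
# fresh checkout in a temporary build directory
git clone git@example.com:me/my-app.git build-tmp
cd build-tmp
# production dependencies only, no dev packages
composer install --no-dev
# drop things the server does not need
rm -rf .git docs tests
cd ..
# package everything up
tar czf my-app.tar.gz -C build-tmp .
# optional: copy to the production server and unpack it there
scp my-app.tar.gz user@production.example.com:/var/www/releases/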

Composer fails to install certain files (app/console, AutoLoader.php, app_dev.php, etc.)

I am developing a web application with Symfony 2. The code of my own bundle, which forms the heart of my application, and some configuration files for application-wide settings are controlled by Git (mostly the directories src/MyCompany/MyBundle, app/Resources/config, etc.). The rest is under the control of Composer (the framework, 3rd-party bundles, etc.).
Up to now, I ran ./composer.phar self-update && ./composer.phar update once in a while, pushed or fetched source code from the origin of my repository, and everything worked well.
Today, I started a fresh working directory and experienced some odd problems.
I performed
git clone <my git repo url> www
cd www
composer.phar install
The composer.json is part of my repository, hence it normally suffices to execute Composer in order to install the framework and all required bundles and to get a fully working copy of my web application.
But today, composer.phar install stopped prematurely, complaining about missing files. Luckily, I still had my old working directory, so I could copy over the missing files manually and restart composer.phar. I had to repeat these steps several times until I ended up with a fully working application.
The files that were missing are
app/console
AutoLoader.php
app_dev.php
AppCache.php
I thought that these files were part of the Symfony framework and expected them to be installed by Composer. For this reason they are not under the control of my revision control system.
I found this related question. The answer is very generic and not particularly helpful. All it says is that, for example, app/console should be included in revision control because it is no longer installed by Composer, and that there is a change in the directory structure due to the transition from Symfony 2 to 3. But I know for sure that app/console was installed by Composer in the past. Hence, something changed.
This leads me to the following questions:
Is there any complete, up-to-date and official documentation on
what should be included in the repository,
what should be in .gitignore,
what is managed by Composer?
Is there any documentation on how to do the transition from the old directory structure to the new one in preparation for Symfony 3?
I thought I had read all the README.md files, all release information and everything in "Living on the Edge" on the Symfony site, but somehow I missed this.
The clean way to install Symfony2 from scratch with Composer is to use the following command:
composer create-project symfony/framework-standard-edition my_project_name
This will ensure that all basic structures are created. After that, you can still insert your customisations from the previous project.
Then you can add everything to your repository, except app/config/parameters.yml and the contents of vendor/, app/cache and app/logs.
About transitioning to SF3, I guess there'll be an upgrade path as soon as SF3 is stable enough to create such a document.
1.1. That depends on how you want people to be able to fetch your bundle.
1.2. I'll share my own .gitignore with you; beware, I use Git for my own purposes, as a safety net for my files, not to allow other people to get my bundle:
# Cache and logs (Symfony2)
/app/cache/*
/app/logs/*
!app/cache/.gitkeep
!app/logs/.gitkeep
# Cache and logs (Symfony3)
/var/cache/*
/var/logs/*
!var/cache/.gitkeep
!var/logs/.gitkeep
# Parameters
/app/config/parameters.yml
/app/config/parameters.ini
# Managed by Composer
/app/bootstrap.php.cache
/var/bootstrap.php.cache
/bin/*
!bin/console
!bin/symfony_requirements
/vendor/
# Assets and user uploads
/web/bundles/
/web/uploads/
# PHPUnit
/app/phpunit.xml
/phpunit.xml
# Build data
/build/
# Composer PHAR
/composer.phar
1.3. Everything that is listed in composer.json.

How to use Meteor's own non-core packages when running from a git checkout

When running Meteor from a git checkout, there are two packages available at the path
<meteor-path>/packages/non-core
npm-bcrypt/
npm-node-aes-gcm/
What is the best practice for using/enabling these packages in my own project?
You can already use the published versions of these packages, because they are on Atmosphere:
meteor add npm-bcrypt
If you specifically want to use the versions from the git checkout, you need to create a packages subdirectory in your app's directory and symlink the paths to the packages there.
Much easier than symlinking is just setting the PACKAGE_DIRS environment variable. Have a look at https://forums.meteor.com/t/missing-non-core-packages-when-running-meteor-from-checkout/1140 for more details.
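A sketch of both options; <meteor-checkout> stands for wherever your git checkout of Meteor lives, and my-app is a placeholder app directory:
# Option 1: symlink the checkout's package into a packages/ directory inside the app
cd ~/projects/my-app
mkdir -p packages
ln -s <meteor-checkout>/packages/non-core/npm-bcrypt packages/npm-bcrypt

# Option 2: point Meteor at the checkout's non-core package directory via PACKAGE_DIRS
export PACKAGE_DIRS=<meteor-checkout>/packages/non-core
meteor add npm-bcrypt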

OSGi command line: install multiple plugins simultaneously

Is it possible to load all the plugins available in a directory at once from the OSGi command line?
path/to/bin/java -jar org.eclipse.osgi_3.6.1.R36x_v20100806 -console -clean
This brought up the OSGi console and activated the org.eclipse.osgi bundle. ss shows me this:
id State Bundle
0 ACTIVE org.eclipse.osgi_3.6.1.R36x_v20100806
I have a bunch of bundles in a directory: /path/to/all/bundles
I can certainly install them one by one in the OSGi console using the following:
osgi> install file:///path/to/bundle/org.springframework.osgi.core
I want to be able to install all the bundles at once, and as a next step, start them as well.
Any pointers?
Thanks!
I agree that you definitely don't want to install all your bundles by hand every time. Installing the single Apache Felix FileInstall bundle will then automatically load anything you put into a watched directory (./load by default).
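A sketch of that setup; the FileInstall version and all paths are placeholders, and the watched directory is overridden via the felix.fileinstall.dir system property:
# start Equinox with FileInstall watching your bundle directory
java -Dfelix.fileinstall.dir=/path/to/all/bundles \
     -jar org.eclipse.osgi_3.6.1.R36x_v20100806 -console -clean
# on the osgi> console, install and start FileInstall once;
# install prints the new bundle id, which you then pass to start
osgi> install file:///path/to/org.apache.felix.fileinstall-<version>.jar
osgi> start <bundle-id>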
You might be interested in using Apache Karaf: it gives you features, where a feature is a set of bundles defined by either Maven locations or file locations. Besides this, it also gives you a lot of other benefits for working with OSGi bundles; for example, you'll have more than 200 commands to work with in the shell, and lots of them will help you find problems with your bundles.
