Chef Nginx cookbook - override version number

I have developed a cookbook for my application which depends on the Nginx cookbook. I downloaded the Nginx cookbook from the following location
https://github.com/miketheman/nginx
and tried including the default recipe in my cookbook and overriding the version attribute specified in the default attributes file. But irrespective of what I do, Nginx version 1.0.x is installed. I cannot track down where it is fetching the version information from. Can anyone help resolve this issue?
Thanks

If you specify the nginx cookbook as a dependency in your own wrapper cookbook, you have to deal with the strict load order of attribute files. Since Chef 11, all dependency cookbooks are loaded first, before the cookbook which requires them. As the dependency cookbooks (including nginx) are loaded, the attribute files are loaded and evaluated in this order:
attributes/default.rb of nginx
all other attributes files of nginx in alphabetical order
attributes/default.rb of your cookbook
all other attributes files of your cookbook in alphabetical order
As you can see, all the attributes of the nginx cookbook are initialized before your own attribute files are loaded. Thus, any dependent attributes (i.e. ones which are initialized using the values of other existing attributes) use the values defined in the nginx cookbook, not your own.
Now, as you can see, node['nginx']['source']['version'] is initialized from node['nginx']['version'] and thus uses the default value. This value is not changed if you simply set node['nginx']['version'] later in your own cookbook.
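To make this concrete, here is a simplified sketch of the attribute files involved (the real files contain more settings, and the exact URL construction in the nginx cookbook may differ):
# nginx/attributes/default.rb -- evaluated first
default['nginx']['version'] = '1.4.4'
# nginx/attributes/source.rb -- evaluated next, still before your cookbook
default['nginx']['source']['version'] = node['nginx']['version']
default['nginx']['source']['url'] = "http://nginx.org/download/nginx-#{node['nginx']['source']['version']}.tar.gz"
# your_wrapper/attributes/default.rb -- evaluated last
default['nginx']['version'] = '1.6.0' # too late: the source attributes above already hold 1.4.4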
But fear not, there is a remedy :) You can reload specific attribute files in order to re-set their attributes. This is particularly convenient here, where you want to override the nginx version. This is what I do in attributes/default.rb of my nginx wrapper cookbook:
override['nginx']['version'] = '1.6.0'
override['nginx']['source']['checksum'] = '943ad757a1c3e8b3df2d5c4ddacc508861922e36fa10ea6f8e3a348fc9abfc1a'
# Reload nginx::source attributes with our updated version
node.from_file(run_context.resolve_attribute('nginx', 'source'))

In attributes/default.rb of the nginx cookbook, the default version is set to '1.4.4'.
The simplest way to find out what version you have set it to is to look for the following attribute in the Chef server UI:
['nginx']['version']
Hopefully this is set to whatever value you expect!

I think that the real issue here is that the ['nginx']['version'] attribute does not behave as you might expect it to.
According to the README file ...
If you use the nginx::default or nginx::repo recipes, you will install the latest binary package from either your platform's repository or from the "stable" repo provided by the Nginx maintainers. The version attribute is effectively ignored!
The version attribute is only honoured if you use the nginx::source recipe, where it determines the URL of the source archive that is fetched and built.
If you use the nginx::ohai recipe, it updates the version attribute according to the version of Nginx that is currently installed.
Clear yet? If not then:
nginx::default gives you a (typically) old version of Nginx
nginx::repo gives you a (typically) more recent stable version of Nginx
nginx::source is the only recipe that allows you to specify the version of Nginx that you want.
If that doesn't seem to explain what you are seeing ... you need to dive into the recipe source code. The recipe behaviour (e.g. selection of installation repositories) varies across the different platforms / families.
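To make the distinction concrete, a minimal wrapper recipe for the source install could look like this (cookbook and file names are illustrative; the version override itself still has to be handled in your attribute files as described in the first answer):
# my_app/recipes/nginx.rb
# Install Nginx from source so that node['nginx']['version'] is honoured,
# instead of whatever binary package the platform repository happens to ship.
include_recipe 'nginx::source'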

Related

Can I point pre-commit mypy hook to use a requirements.txt for the additional_dependencies?

I would like to use exactly the same version of flake8 in requirements.txt and in .pre-commit-config.yaml.
To avoid redundancy I would like to keep the version number of flake8 exactly once in my repo.
Can pre-commit.com read the version number of flake8 from requirements.txt?
it cannot
pre-commit intentionally does not read from the repository under test as this makes caching intractable
you can read more in this issue and the many duplicate issues linked there
for me, I no longer include flake8, etc. in my requirements files as pre-commit replaces the need to install linters / code formatters elsewhere
disclaimer: I created pre-commit
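For illustration, pinning flake8 solely in .pre-commit-config.yaml then looks roughly like this (the rev is just an example; use whichever release you want to standardise on):
repos:
-   repo: https://github.com/pycqa/flake8
    rev: 6.1.0
    hooks:
    -   id: flake8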

How to stop RStudio from creating an empty "R" folder in the home directory at every startup

After having set the path for the default working directory as well as my first (and only) project in the RStudio options, I wonder why RStudio keeps creating an empty folder named "R" in my home directory every time it is started.
Is there any file I could delete, edit or perhaps create to stop this annoying behaviour, and if so, where is it located?
System: Linux Mint v. 19.3
Software: RStudio v. 1.3.959 / R version 3.4.4
Thanks in advance for any hints.
Yes, you can prevent the creation of the R directory — R is configurable via a set of environment variables.
However, setting these correctly isn’t trivial. The first issue is that many R packages are sensitive to the R version they’re installed with. If you upgrade R and try to load the existing package, it may break. Therefore, the R package library path should be specific to the R version.
On clusters, an additional issue is that the same library path might be read by various cluster nodes that run on different architectures; this is rare, but it happens. In such cases, compiled R packages might need to be different depending on the architecture.
Consequently, in general the R library path needs to be specific both to the R version and the system architecture.
Next, even if you configure an alternative path R will silently ignore it if it doesn’t exist. So be sure to manually create the directory that you’ve configured.
Lastly, where to put this configuration? One option would be to put it into the user environment file, the path of which can be specified with the environment variable R_ENVIRON_USER — it defaults to $HOME/.Renviron. This isn’t ideal though, because it means the user can’t temporarily override this setting when calling R: variables in this file override the calling environment.
Instead, I recommend setting this in the user profile (e.g. $HOME/.profile). However, when you use a desktop launcher to launch your RStudio, this file won’t be read, so be sure to edit your *.desktop file accordingly.1
So in sum, add the following to your $HOME/.profile:
export R_LIBS_USER=${XDG_DATA_HOME:-$HOME/.local/share}/R/%p-library/%v
And make sure this directory exists: re-source ~/.profile (launching a new shell inside the current one is not enough), and execute
mkdir -p "$(Rscript -e 'cat(Sys.getenv("R_LIBS_USER"))')"
The above is using the XDG base dir specification, which is the de-facto standard on Linux systems.2 The path is using the placeholders %p and %v. R will fill these in with the system platform and the R version (in the form major.minor), respectively.
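For example, on a typical 64-bit Linux system running R 3.4 the setting above expands to something like the following (the platform string varies by machine):
~/.local/share/R/x86_64-pc-linux-gnu-library/3.4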
If you want to use a custom R configuration file (“user profile”) and/or R environment file, I suggest setting their location in the same way, by configuring R_PROFILE_USER and R_ENVIRON_USER (since their default location, once again, is in the user home directory):
export R_PROFILE_USER=${XDG_CONFIG_HOME:-$HOME/.config}/R/rprofile
export R_ENVIRON_USER=${XDG_CONFIG_HOME:-$HOME/.config}/R/renviron
1 I don’t have a Linux desktop system, but I believe that editing the Exec entry to the following should do it:
Exec=env R_LIBS_USER=${XDG_DATA_HOME:-$HOME/.local/share}/R/%p-library/%v /path/to/rstudio
2 Other systems require different handling. On macOS, the canonical setting for the library location would be $HOME/Library/Application Support/R/library/%v. However, setting environment variables on macOS for GUI applications is frustratingly complicated.
On Windows, the canonical location is %LOCALAPPDATA%/R/library/%v. To set this variable, use [Environment]::SetEnvironmentVariable in PowerShell or, when using cmd.exe, use setx.
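For reference, the commands would look roughly like this (adjust the path if you prefer a different location):
PowerShell (per-user scope):
[Environment]::SetEnvironmentVariable('R_LIBS_USER', "$env:LOCALAPPDATA\R\library\%v", 'User')
cmd.exe:
setx R_LIBS_USER "%LOCALAPPDATA%\R\library\%v"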

How to make sure a Debian package does not have a dependency

I am building a debian package using dpkg.
The package has a dependency on libvirt which is not desired.
The rules file does not specify this dependency, but it is added by dpkg, I suppose due to some calls to libvirt-dev at build time.
However, my package works fine without libvirt. As such, libvirt is a "Recommended" package, not a "Required" one. How do I override this dependency and make sure it is not present in my final .deb file?
Hard to know without seeing your actual package, but I'd guess that you have a binary or shared library which is linked against libvirt. That would cause dh_shlibdeps to include libvirt in the ${shlibs:Depends} substvar.
If that's your problem, then the right fix depends on what's getting linked to libvirt. It should be straightforward to determine; just run ldd on each binary or shared library object in your package, and grep for "libvirt".
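For example (binary name and path purely illustrative):
ldd debian/mypackage/usr/bin/mytool | grep libvirt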
If the thing linked against libvirt is only incidental to the package, and isn't part of the main functionality, then using Recommends: would indeed be the right thing. To make dh_shlibdeps exclude that object from its dependency scanning, give it a -X option. Example target for debian/rules, assuming debhelper7-style packaging:
override_dh_shlibdeps:
	dh_shlibdeps -Xname_of_your_object_to_exclude
If the thing(s) linked to libvirt actually are an important part of the package functionality, then the generated libvirt dependency is appropriate. If you still don't want it, you'll need to work out how to avoid linking against libvirt during your build.

Compiling Haskell module Network on win32/cygwin

I am trying to compile Network.HTTP (http://hackage.haskell.org/package/network) on win32/cygwin. However, it fails with the following message:
Setup.hs: Missing dependency on a foreign library:
* Missing (or bad) header file: HsNet.h
This problem can usually be solved by installing the system package that
provides this library (you may need the "-dev" version). If the library is
already installed but in a non-standard location then you can use the flags
--extra-include-dirs= and --extra-lib-dirs= to specify where it is.
If the header file does exist, it may contain errors that are caught by the C
compiler at the preprocessing stage. In this case you can re-run configure
with the verbosity flag -v3 to see the error messages.
Unfortunately it does not give more clues. HsNet.h includes sys/uio.h, which actually should not be included and should be configured correctly.
Don't use Cygwin; instead follow Johan Tibell's way:
Installing MSYS
Install the latest Haskell Platform. Use the default settings.
Download version 1.0.11 of MSYS. You'll need the following files:
MSYS-1.0.11.exe
msysDTK-1.0.1.exe
msysCORE-1.0.11-bin.tar.gz
The files are all hosted on haskell.org as they're quite hard to find in the official MinGW/MSYS repo.
Run MSYS-1.0.11.exe followed by msysDTK-1.0.1.exe. The former asks you if you want to run a normalization step. You can skip that.
Unpack msysCORE-1.0.11-bin.tar.gz into C:\msys\1.0. Note that you can't do that using an MSYS shell, because you can't overwrite the files in use, so make a copy of C:\msys\1.0, unpack it there, and then rename the copy back to C:\msys\1.0.
Add C:\Program Files\Haskell Platform\VERSION\mingw\bin to your PATH. This is necessary if you ever want to build packages that use a configure script, like network, as configure scripts need access to a C compiler.
These steps are what Tibell uses to compile the network package on Windows, and I have used them myself successfully several times on most of the Haskell Platform releases.
It is possible to build network on win32/cygwin. The above steps (by Jonke), though useful, may not be necessary.
While doing the configuration step, specify
runghc Setup.hs configure --configure-option="--build=mingw32"
so that the library is configured for mingw32; otherwise you will get "undefined reference" errors when you try to link against or use the network library.
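So a complete build from the unpacked network source directory under Cygwin would look roughly like this:
runghc Setup.hs configure --configure-option="--build=mingw32"
runghc Setup.hs build
runghc Setup.hs install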
This, combined with @Yogesh Sajanikar's answer, made it work for me (on win64/cygwin):
Make sure the gcc on your PATH is NOT the MinGW/Cygwin one, but the one that ships with GHC:
C:\ghc\ghc-6.12.1\mingw\bin\gcc.exe
(Run
export PATH="/cygdrive/.../ghc-7.8.2/mingw/bin:$PATH"
before running cabal install network in the Cygwin shell)

Cannot update zope.schema in Plone

I am very new to setting up Plone 4 and am trying to integrate Solgema.fullcalendar, but when running buildout I get an error saying it needs zope.schema 3.6.0 and I have 3.5.4. I cannot for the life of me work out how to update it. I assume I am missing something fundamental here, but it is doing my head in, as I imagine I will encounter this kind of issue again and again as I progress.
" Installing instance.
Error: There is a version conflict.
We already have: zope.schema 3.5.4
but z3c.form 2.4.2 requires 'zope.schema>=3.6.0'."
I looked around and noticed that putting zope.schema>=3.6.0 in eggs might work, but that didn't actually trigger an update; it just caused a bad install error.
If anyone has any ideas or needs something more to go on please let me know!
Thanks
Chris
If you want to use z3c.form inside Plone, it's best to update to Plone 4.1, which is currently available as a release candidate. 4.1 comes with z3c.form in it and has the newer zope.schema version.
In the general case you will need to have a versions section in your buildout configuration in which you can specify exact version requirements for all distributions you want.
[buildout]
extends = ...
versions = versions
[versions]
zope.schema = 3.6.0
Inside the setup.py files you should never specify exact version requirements. Only put minimum requirements into these if your specific library absolutely requires a new feature from another library.
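For example, a minimum requirement in a setup.py looks like this (package name purely illustrative); the exact version is then pinned in the buildout [versions] section:
from setuptools import setup

setup(
    name='my.package',
    version='1.0',
    install_requires=[
        'zope.schema>=3.6.0',  # minimum only, never an exact pin here
    ],
)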
See Hanno's answer. I will add that I cannot think of a good reason anymore to use '>=' (or '<=' or '==') to specify minimum, maximum or exact versions anywhere in a buildout config. Version specifications should only be in a [versions] section. It has been a while since I last used a buildout config that used the comparison operators, but I remember it could lead to problems, especially when upgrading; the only way out would at times be to remove the '.installed.cfg' file to make bin/buildout run in a fresh state.
(Note that '>=' in a setup.py is perfectly fine.)
