Pre-programmed feature installs at Apache Karaf launch - apache-karaf

I am working on an OpenDaylight project that uses Apache Karaf. At startup, I need Karaf to have some features installed. Currently (and this works) I manually type this into the shell and the features get installed:
$ bin/karaf
Apache Karaf starting up. Press Enter to open the shell now...
100% [========================================================================]
Karaf started in 9s. Bundle stats: 409 active, 410 total
(OpenDaylight ASCII-art banner)
Hit '<tab>' for a list of available commands
and '[cmd] --help' for help on a specific command.
Hit '<ctrl-d>' or type 'system:shutdown' or 'logout' to shutdown OpenDaylight.
opendaylight-user#root>feature:install odl-restconf odl-mdsal-apidocs odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer odl-openflowplugin-nxm-extensions odl-restconf-all odl-openflowplugin-flow-services
opendaylight-user#root>
All this works. However, what I need is for this feature install to happen automatically at startup. I am aware there is a shell.init.script file, and I have attempted to add this line to the bottom of it:
feature:install odl-restconf odl-mdsal-apidocs odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer odl-openflowplugin-nxm-extensions odl-restconf-all odl-openflowplugin-flow-services
But when I do this, I get the following error:
/opt/opendaylight-0.11.0/etc/shell.init.script: Command not found: feature:install
I am also aware that a featuresBoot entry can be added to org.apache.karaf.features.cfg, so I tried:
featuresBoot = odl-restconf (I shortened it to just one feature for ease of testing)
And I get this error:
org.apache.felix.resolver.reason.ReasonException: Unable to resolve root: missing requirement [root] osgi.identity; osgi.identity=odl-restconf; type=karaf.feature; version="[1.10.0,1.10.0]"; filter:="(&(osgi.identity=odl-restconf)(type=karaf.feature)(version>=1.10.0)(version<=1.10.0))" [caused by: Unable to resolve odl-restconf/1.10.0: missing requirement [odl-restconf/1.10.0] osgi.identity; osgi.identity=odl-restconf-nb-rfc8040; type=karaf.feature; version="[1.10.0,1.10.0]" [caused by: Unable to resolve odl-restconf-nb-rfc8040/1.10.0: missing requirement [odl-restconf-nb-rfc8040/1.10.0] osgi.identity; osgi.identity=odl-restconf-common; type=karaf.feature; version="[1.10.0,1.10.0]" [caused by: Unable to resolve odl-restconf-common/1.10.0: missing requirement [odl-restconf-common/1.10.0] osgi.identity; osgi.identity=odl-mdsal-broker; type=karaf.feature; version="[1.10.0,1.10.0]" [caused by: Unable to resolve odl-mdsal-broker/1.10.0: missing requirement [odl-mdsal-broker/1.10.0] osgi.identity; osgi.identity=org.opendaylight.controller.sal-binding-broker-impl; type=osgi.bundle; version="[1.10.0,1.10.0]"; resolution:=mandatory [caused by: Unable to resolve org.opendaylight.controller.sal-binding-broker-impl/1.10.0: missing requirement [org.opendaylight.controller.sal-binding-broker-impl/1.10.0] osgi.wiring.package; filter:="(&(osgi.wiring.package=org.osgi.service.blueprint)(version>=1.0.0)(!(version>=2.0.0)))"]]]]]
So I can tell that Karaf is at least acknowledging shell.init.script and org.apache.karaf.features.cfg; however, I am clearly using the wrong syntax. I have exhausted my googling for the right syntax or for examples of this. In fact, when I search for this issue, I get advice along the lines of "use the shell.init.script file", without an example or a link to documentation for how to use it.
For example, here is a similar question, "script to run commands at start of apache karaf", but the answers really aren't full or complete. I don't know where to put the suggested commands, etc.
Anyway, I hope someone here can show me what specific command one uses in either shell.init.script or org.apache.karaf.features.cfg to accomplish this goal (the goal being an initialization equivalent of what I can already do manually, as noted at the top).
Update:
I also tried this type of command:
echo "feature:install odl-restconf odl-mdsal-apidocs odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer odl-openflowplugin-nxm-extensions odl-restconf-all odl-openflowplugin-flow-services" | /opt/opendaylight-0.11.0/bin/karaf
That also didn't work. I don't get any error messages; I just know my restconf isn't working. Funnily enough, I did see all the features getting piped into the Karaf shell, and the shell did seem to freeze up (as if it were processing the command), but it was a no-go after the cursor returned to me.
Thanks to @jamo, as his answer led me to the solution. I only needed to add this to etc/org.apache.karaf.features.cfg:
featuresBoot = odl-restconf,odl-mdsal-apidocs,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,odl-openflowplugin-nxm-extensions,odl-restconf-all,odl-openflowplugin-flow-services, 25921329-8d07-420b-af13-94948bf1a78d
I believe the trick was keeping the final 25921329-8d07-420b-af13-94948bf1a78d, an identifier that was already in the default cfg file; I made sure it stayed in there.
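For completeness, another way to script commands against an already-running instance (instead of piping into bin/karaf as above) is the bin/client helper that ships with Karaf; a rough sketch, assuming the same install path as above:
# sketch: send a shell command to a running Karaf/OpenDaylight instance over its SSH console
/opt/opendaylight-0.11.0/bin/client "feature:install odl-restconf odl-mdsal-apidocs"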

We use the org.apache.karaf.features.cfg file exclusively in upstream ODL system test. You can see it inside this log, but specifically, here is what it looks like for one of our yangtools jobs:
################################################################################
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
################################################################################
#
# Comma separated list of features repositories to register by default
#
featuresRepositories = mvn:org.opendaylight.integration/features-test/0.12.1-SNAPSHOT/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/5edc7e82-415e-4254-9731-f87670633bcb.xml
#
# Comma separated list of features to install at startup
#
featuresBoot = odl-infrautils-ready,odl-restconf, a3fb0299-0563-4506-b1a0-059253ab43b4
#
# Resource repositories (OBR) that the features resolver can use
# to resolve requirements/capabilities
#
# The format of the resourceRepositories is
# resourceRepositories=[xml:url|json:url],...
# for Instance:
#
#resourceRepositories=xml:http://host/path/to/index.xml
# or
#resourceRepositories=json:http://host/path/to/index.json
#
#
# Defines if the boot features are started in asynchronous mode (in a dedicated thread)
#
featuresBootAsynchronous=false
#
# Service requirements enforcement
#
# By default, the feature resolver checks the service requirements/capabilities of
# bundles for new features (xml schema >= 1.3.0) in order to automatically installs
# the required bundles.
# The following flag can have those values:
# - disable: service requirements are completely ignored
# - default: service requirements are ignored for old features
# - enforce: service requirements are always verified
#
#serviceRequirements=default
#
# Store cfg file for config element in feature
#
#configCfgStore=true
#
# Configuration of features processing mechanism (overrides, blacklisting, modification of features)
# XML file defines instructions related to features processing
# versions.properties may declare properties to resolve placeholders in XML file
# both files are relative to ${karaf.etc}
#
#featureProcessing=org.apache.karaf.features.xml
#featureProcessingVersions=versions.properties
You have a typo in your question with featuresBook, but I am guessing that's just a typo and not your problem. It's very strange that feature:install works, but not featuresBoot.
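If it helps while debugging, one way to verify what actually got installed at boot is to query the running instance from the host shell; a sketch, assuming the install path from the question and the bin/client SSH client that ships with Karaf/ODL distributions:
# sketch: list installed features on the running instance and filter for the ODL ones
/opt/opendaylight-0.11.0/bin/client "feature:list -i" | grep odl-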

Related

GoLand: when I execute go mod tidy in GoLand, why is "goproxy.io" in the package URL?

Running go mod tidy in GoLand gives:
bitbucket.org/xxxproject/db_proxy_api_model/models/v1 : cannot find module providing package.
reading https://goproxy.io/bitbucket.org/xxxproject/db_proxy_api_model/models/#v/list: 404 Not Found
server response:
not found: module bitbucket.org/xxxproject/db_proxy_api_model/models: git ls-remote -q origin in /tmp/gopath/pkg/mod/cache/vcs/cf011ef4494e04c40886924c664c719ff30fb53c96bff1250e26ef05478bbd13: exit status 128:
fatal: could not read Username for 'https://bitbucket.org': terminal prompts disabled.
Confirm the import path was entered correctly. If this is a private repository, see https://golang.org/doc/faq#git_https for additional information.
I set GOPROXY=https://goproxy.io, but that's not the point; the problem also appears when I don't use a proxy.
What can I do to solve the problem?
The project seems to be a private one, so you should specify the GOPRIVATE variable (https://go.dev/ref/mod#private-module-proxy-direct) via Preferences/Settings | Go | Go Modules | Environment. GoLand will pick up the environment from there.
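A rough command-line equivalent of that setting, assuming the private modules live under bitbucket.org/xxxproject and that git can authenticate to Bitbucket over SSH:
# tell the go command to bypass the proxy and checksum database for the private prefix
go env -w GOPRIVATE=bitbucket.org/xxxproject
# have git use SSH instead of prompting for HTTPS credentials (assumes an SSH key is registered with Bitbucket)
git config --global url."git@bitbucket.org:".insteadOf "https://bitbucket.org/"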

Inheriting dependencies when running unit tests from command line

I am trying to run Julia unit tests from the command line, but the unit tests fail to run because they cannot find a dependency that I am using in my main project. How can I make this work? The actual command that I try to execute is julia test/test_blueprint.jl from the project root. More details follow.
Details about the setup
My project is located at the path /home/jonas/prog/julia/blueprint. In that directory, I have a Project.toml file containing these lines:
name = "blueprint"
uuid = "c1615a0c-c255-402d-ae34-0b88819b43c6"
authors = [""]
version = "0.1.0"
[deps]
FunctionalCollections = "de31a74c-ac4f-5751-b3fd-e18cd04993ca"
Setfield = "efcf1570-3423-57d1-acb7-fd33fddbac46"
along with the Manifest.toml file.
I have a subdirectory at test/ with unit tests that I created following this guide and that directory contains another Project.toml file containing
[deps]
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
There is a file test/test_blueprint.jl with unit tests and that file starts with
using Test
include("../src/blueprint.jl") # Alternative 1
#using blueprint # Alternative 2
using FunctionalCollections
using LinearAlgebra
...
The actual code being tested is in the file src/blueprint.jl.
Details about the problem
From the project root, I attempt to run the unit tests using the command julia test/test_blueprint.jl. When I run that command it produces the following output:
ERROR: LoadError: ArgumentError: Package Setfield not found in current path:
- Run `import Pkg; Pkg.add("Setfield")` to install the Setfield package.
Stacktrace:
[1] require(into::Module, mod::Symbol)
# Base ./loading.jl:967
[2] include(fname::String)
# Base.MainInclude ./client.jl:451
[3] top-level scope
# ~/prog/julia/blueprint/test/test_blueprint.jl:8
in expression starting at /home/jonas/prog/julia/blueprint/src/blueprint.jl:1
in expression starting at /home/jonas/prog/julia/blueprint/test/test_blueprint.jl:8
suggesting that it cannot find the dependency Setfield. If I edit the top of the file test/test_blueprint.jl slightly from
include("../src/blueprint.jl") # Alternative 1
#using blueprint # Alternative 2
to
#include("../src/blueprint.jl") # Alternative 1
using blueprint # Alternative 2
it still fails, but with a different error:
ERROR: LoadError: ArgumentError: Package blueprint not found in current path:
- Run `import Pkg; Pkg.add("blueprint")` to install the blueprint package.
Stacktrace:
[1] require(into::Module, mod::Symbol)
# Base ./loading.jl:967
in expression starting at /home/jonas/prog/julia/blueprint/test/test_blueprint.jl:9
Question: How can I make the unit tests run from the command line?
Note that I can run the unit tests from within the Julia REPL in Emacs by activating the project using C-c C-a at the src/blueprint.jl file and calling C-c C-b at the unit test file test/test_blueprint.jl. My Julia version is 1.7.0 (2021-11-30). Don't hesitate to ask for more clarifications.
First, a few naming conventions that are probably not (but may be) contributing to the issues here:
By convention, package names begin with a single capital, so I would recommend changing the name to Blueprint everywhere
By default, ] test runs the tests found in test/runtests.jl, so I would recommend naming your top-level testing script runtests.jl to avoid confusion, even though it does seem from the errors here that test is finding your test_blueprint.jl file one way or another.
Now, while I can't test this without the full code of your package, what I suspect is happening here is the following:
Normally, dependencies of the package you are testing (let's say MyPackage) are not required in test/Project.toml because they are implicit in MyPackage. So after a successful using MyPackage, while they will still not be available to any functions written in your test scripts (test/runtests.jl), they will be available to the functions written in MyPackage -- just as if you had typed using MyPackage at the REPL and then run your test code there. This is the only reason you don't normally need to duplicate all the deps from the main Project.toml in test/Project.toml.
Since the using Blueprint approach is failing here for other reasons, when you simply include the code from src/blueprint.jl, the usings within that file will in turn fail because those packages are not present in the active environment at test/Project.toml (even if they are present on your system elsewhere).
Consequently, one quick fix to your problem with the current include("../src/blueprint.jl") approach would be to simply add those dependencies to your test/Project.toml.
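For instance, a sketch of what test/Project.toml could look like with that approach; the FunctionalCollections and Setfield UUIDs are taken from the main Project.toml above, and the LinearAlgebra and Test entries use the standard-library UUIDs (worth double-checking with ] status):
[deps]
FunctionalCollections = "de31a74c-ac4f-5751-b3fd-e18cd04993ca"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Setfield = "efcf1570-3423-57d1-acb7-fd33fddbac46"
Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"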
However, it would be more satisfying to fix the problem you are having with using Blueprint. I don't have enough information to debug this without seeing the full structure of your packages, but I would suggest as a start
making sure that your code is properly structured as a package (see the sketch after this list)
testing that, even if unregistered, you can ] add your package from the REPL by git repo URL (i.e. ] add https://some_website.com/you/Blueprint.jl)
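For the first point, a minimal sketch of what "properly structured as a package" means here, assuming the rename to Blueprint suggested above: Project.toml declares name = "Blueprint", and src/Blueprint.jl defines a module of the same name:
# src/Blueprint.jl -- sketch of the entry point Pkg expects for a package named "Blueprint"
module Blueprint

using FunctionalCollections   # package deps declared in the top-level Project.toml
using Setfield

# ... actual package code, include()-ing other files under src/ as needed ...

end # module Blueprint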
EDIT:
Upon inspection of the code linked in the comments (https://github.com/jonasseglare/Blueprint), a few other issues:
Although they are already installed by default, standard libraries these days do need to be included in [deps]. In this case, that means the LinearAlgebra stdlib
Any packages you are explicitly using in your test scripts, other than your package itself, do need to be added to test/Project.toml -- i.e., any packages you call functions from directly in your test scripts (rather than just indirectly via the exported functions of your package).
In your case, the latter would appear to mean LinearAlgebra and FunctionalCollections, but not Setfield (that one only needs to be included in the regular Project.toml, since it's not being directly used in runtests.jl).
Consequently, with a few minor changes to your repo we are able to simply
] add https://github.com/brenhinkeller/Blueprint
] test Blueprint
or, since you preferred the command line:
user$ julia -e "using Pkg; Pkg.add(url=\"https://github.com/brenhinkeller/Blueprint\")"
user$ julia -e "using Pkg; Pkg.test(\"Blueprint\")"
Testing Blueprint
Status `/private/var/folders/qk/2qyrdb854mvd2tn4crc802lw0000gn/T/jl_fSypP7/Project.toml`
[c1615a0c] Blueprint v0.1.0 `https://github.com/brenhinkeller/Blueprint#master`
[de31a74c] FunctionalCollections v0.5.0
[37e2e46d] LinearAlgebra `#stdlib/LinearAlgebra`
[8dfed614] Test `#stdlib/Test`
Status `/private/var/folders/qk/2qyrdb854mvd2tn4crc802lw0000gn/T/jl_fSypP7/Manifest.toml`
[c1615a0c] Blueprint v0.1.0 `https://github.com/brenhinkeller/Blueprint#master`
[187b0558] ConstructionBase v1.3.0
[de31a74c] FunctionalCollections v0.5.0
[1914dd2f] MacroTools v0.5.9
[ae029012] Requires v1.3.0
[efcf1570] Setfield v0.8.1
[56f22d72] Artifacts `#stdlib/Artifacts`
[2a0f44e3] Base64 `#stdlib/Base64`
[9fa8497b] Future `#stdlib/Future`
[b77e0a4c] InteractiveUtils `#stdlib/InteractiveUtils`
[8f399da3] Libdl `#stdlib/Libdl`
[37e2e46d] LinearAlgebra `#stdlib/LinearAlgebra`
[56ddb016] Logging `#stdlib/Logging`
[d6f4376e] Markdown `#stdlib/Markdown`
[9a3f8284] Random `#stdlib/Random`
[ea8e919c] SHA `#stdlib/SHA`
[9e88b42a] Serialization `#stdlib/Serialization`
[8dfed614] Test `#stdlib/Test`
[cf7118a7] UUIDs `#stdlib/UUIDs`
[e66e0078] CompilerSupportLibraries_jll `#stdlib/CompilerSupportLibraries_jll`
[4536629a] OpenBLAS_jll `#stdlib/OpenBLAS_jll`
[8e850b90] libblastrampoline_jll `#stdlib/libblastrampoline_jll`
Testing Running tests...
Test Summary: | Pass Total
Plane tests | 7 7
Test Summary: | Pass Total
Plane intersection | 2 2
Test Summary: | Pass Total
Plane intersection 2 | 4 4
Test Summary: | Pass Total
Plane shadowing | 3 3
Test Summary: | Pass Total
Polyhedron tests | 3 3
Test Summary: | Pass Total
Polyhedron tests 2 | 5 5
Test Summary: | Pass Total
Beam tests | 2 2
Test Summary: | Pass Total
Half-space test | 2 2
Test Summary: | Pass Total
Ordered pair test | 2 2
Test Summary: | Pass Total
Test plane/line intersection | 2 2
Test Summary: | Pass Total
Update line bounds test | 21 21
Testing Blueprint tests passed
FWIW, you should also be able to mix and match those command-line and REPL approaches (i.e., install in the REPL and test via the command line, or vice versa).
While I had not originally considered this case, one additional possibility discussed in the comments is where one wishes to test the local state of a package without, or without relying upon, a git remote; in this case @Rulle reports that activating the package directory, i.e.,
julia -e "using Pkg; Pkg.activate(\".\"); Pkg.test(\"Blueprint\")"
or
julia --project=. -e "using Pkg; Pkg.test(\"Blueprint\")"
or equivalently in the REPL
] activate .
] test Blueprint
will work, assuming the package directory is the current local directory (.).
Possible answer to my own question:
To make it work, specify the main project root directory on the command line when calling the script using --project. In this case, we would call
julia --project=/home/jonas/prog/julia/blueprint test/test_blueprint.jl
However, there seems to be some hidden state that I don't understand, because after this command has been run once, it seems as if the --project option can be omitted. On the other hand, I have also tried to provide a nonsense project directory, e.g. /tmp:
julia --project=/tmp test/test_blueprint.jl
and sometimes it will still run the unit tests (!) and sometimes it won't. But when it fails to run the unit tests, it succeeds again as soon as I specify the correct path, that is /home/jonas/prog/julia/blueprint. I also don't understand how this interacts with whether I use using blueprint or include("../src/blueprint.jl"), but it seems as if, when I use using, it works only if the --project path is set correctly. But I am still not sure.
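One way to avoid depending on whatever environment happens to be active (a sketch building on the activate-then-test suggestion above; Pkg.test() with no argument tests the currently active project):
# from /home/jonas/prog/julia/blueprint: activate this project's environment, then run its test suite
julia --project=. -e 'using Pkg; Pkg.test()'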

!!DEVSTACK!! setup.py is running into an error when run through pbr

So I'm trying to install DevStack. I followed the instructions from here:
https://docs.openstack.org/sahara/pike/contributor/devstack.html
and this is the devstack repository that I downloaded:
git clone https://git.openstack.org/openstack-dev/devstack.git
and here is my local.conf
[[local|localrc]]
ADMIN_PASSWORD=nova
MYSQL_PASSWORD=nova
RABBIT_PASSWORD=nova
SERVICE_PASSWORD=$ADMIN_PASSWORD
SERVICE_TOKEN=nova
# Enable Swift
enable_service s-proxy s-object s-container s-account
SWIFT_HASH=66a3d6b56c1f479c8b4e70ab5c2000f5
SWIFT_REPLICAS=1
SWIFT_DATA_DIR=$DEST/data
# Force checkout prerequisites
# FORCE_PREREQ=1
# keystone is now configured by default to use PKI as the token format
# which produces huge tokens.
# set UUID as keystone token format which is much shorter and easier to
# work with.
KEYSTONE_TOKEN_FORMAT=UUID
# Change the FLOATING_RANGE to whatever IPs VM is working in.
# In NAT mode it is the subnet VMware Fusion provides, in bridged mode
# it is your local network. But only use the top end of the network by
# using a /27 and starting at the 224 octet.
FLOATING_RANGE=192.168.55.224/27
# Enable logging
SCREEN_LOGDIR=$DEST/logs/screen
# Set ``OFFLINE`` to ``True`` to configure ``stack.sh`` to run cleanly
# without Internet access. ``stack.sh`` must have been previously run
# with Internet access to install prerequisites and fetch repositories.
# OFFLINE=True
# Enable sahara
enable_plugin sahara https://git.openstack.org/openstack/sahara
# Enable heat
enable_plugin heat https://git.openstack.org/openstack/heat
# Setting this machine private IP address
HOST_IP="192.168.1.130"
But the main problem is that after I run stack.sh, and after it goes along and downloads some of the git repositories like sahara and heat, it comes to this error. To be specific, here it is:
ERROR: Command errored out with exit status 1:
command: /usr/bin/python3.8 -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/opt/stack/neutron/setup.py'"'"'; __file__='"'"'/opt/stack/neutron/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)
(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info
cwd: /opt/stack/neutron/
Complete output (24 lines):
/usr/local/lib/python3.8/dist-packages/setuptools/dist.py:634: UserWarning: Usage of dash-separated 'description-file' will not be supported in future versions. Please use the underscore name 'description_file' instead
warnings.warn(
/usr/local/lib/python3.8/dist-packages/setuptools/dist.py:634: UserWarning: Usage of dash-separated 'author-email' will not be supported in future versions. Please use the underscore name 'author_email' instead
warnings.warn(
/usr/local/lib/python3.8/dist-packages/setuptools/dist.py:634: UserWarning: Usage of dash-separated 'home-page' will not be supported in future versions. Please use the underscore name 'home_page' instead
warnings.warn(
/usr/local/lib/python3.8/dist-packages/setuptools/dist.py:634: UserWarning: Usage of dash-separated 'python-requires' will not be supported in future versions. Please use the underscore name 'python_requires' instead
warnings.warn(
ERROR:root:Error parsing
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/pbr/core.py", line 96, in pbr
attrs = util.cfg_to_args(path, dist.script_args)
File "/usr/local/lib/python3.8/dist-packages/pbr/util.py", line 271, in cfg_to_args
pbr.hooks.setup_hook(config)
File "/usr/local/lib/python3.8/dist-packages/pbr/hooks/__init__.py", line 25, in setup_hook
metadata_config.run()
File "/usr/local/lib/python3.8/dist-packages/pbr/hooks/base.py", line 27, in run
self.hook()
File "/usr/local/lib/python3.8/dist-packages/pbr/hooks/metadata.py", line 25, in hook
self.config['version'] = packaging.get_version(
File "/usr/local/lib/python3.8/dist-packages/pbr/packaging.py", line 874, in get_version
raise Exception("Versioning for this project requires either an sdist"
Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name neutron was given, but was not able to be found.
error in setup command: Error parsing /opt/stack/neutron/setup.cfg: Exception: Versioning for this project requires either an sdist tarball, or access to an upstream git repository. It's also possible that there is a mismatch between the package name in setup.cfg and the argument given to pbr.version.VersionInfo. Project name neutron was given, but was not able to be found.
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
I checked everywhere that I knew to check and understood to check, but I think the main problem is related to pbr and its version, or to neutron, or to something else.
I'm totally new at this, so sorry for not asking this in a better way.
I'm installing DevStack on an Ubuntu Server 20.04 VM.
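Since the pbr exception says it needs either an sdist tarball or access to an upstream git repository, one thing worth checking (a sketch; paths taken from the error output above) is whether the neutron checkout under /opt/stack is a git repository that pbr can derive a version from:
# check that pbr has git metadata to derive a version from (paths from the error above)
cd /opt/stack/neutron
git status            # should report a valid work tree, not "fatal: not a git repository"
git describe --tags   # pbr normally derives the package version from git tags like this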

Haskell: Could not find module ‘Network.HTTP’

I am trying to write a simple script that takes as input a URL (or set of URLs) and as output it downloads the contents of that page to a file (in particular I am trying to download hundreds of JSON files, which ultimately I wish to diff against other JSON files).
In a file, download.hs, I have import "HTTP" Network.HTTP.
When I run: $ ghc -o download download.hs
I get the following error:
download.hs:24:1: error:
Could not find module ‘Network.HTTP’
Perhaps you meant Network.TLS (needs flag -package-key tls-1.5.2)
|
24 | import "HTTP" Network.HTTP
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
My GHC version is:
$ ghc --version
The Glorious Glasgow Haskell Compilation System, version 8.6.5
I also get errors like:
download.hs:22:1: error:
Could not load module ‘Control.Concurrent.Async’
It is a member of the hidden package ‘async-2.2.2’.
You can run ‘:set -package async’ to expose it.
(Note: this unloads all the modules in the current scope.)
|
22 | import "async" Control.Concurrent.Async (mapConcurrently)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
I think it's possible there have been breaking changes between GHC versions, and the examples I am finding online to start with may be outdated.
Any pointers on getting started with Haskell, and particularly on easy ways to download and diff JSON files in Haskell?
I have been following this example: Running parallel URL downloads in Haskell; this is where I got the code that is now erroring.
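From the errors above it looks like the modules exist but the packages are hidden, so a sketch of a compile invocation that exposes them explicitly (assuming the HTTP and async packages are already installed in a package database GHC can see; -XPackageImports is redundant if the file already has the LANGUAGE pragma that the import "HTTP" syntax requires):
# compile with the relevant packages exposed (sketch; adjust names to what ghc-pkg list shows)
ghc -XPackageImports -package HTTP -package async -o download download.hs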

Bitbake build for qpid-cpp-1.39.0 fails

I must preface this by saying I have very little knowledge of Yocto/Bitbake, make, cmake, autoconf, etc.
I have an application, written in C++, that I have been unable to get built. The issue I am having is that I have been unable to get some dependencies built in Yocto/Bitbake. The application has dependencies on the following libraries:
Paho-MQTT (https://github.com/eclipse/paho.mqtt.c.git)
Jansson JSON parser (http://www.digip.org/jansson/releases/jansson-2.10.tar.gz)
Apache QPID C++ (https://www-us.apache.org/dist/qpid/cpp/1.39.0/qpid-cpp-1.39.0.tar.gz)
When running the bitbake recipe for apache qpid, I am receiving the following error:
CMake Error at src/CMakeLists.txt:84 (message):
| Can't find amqp 0-10 spec for framing code generation
I do not have any ideas how to get past this error.
I am running Docker/Yocto on a Macbook and have followed the instructions to get everything installed and running. The application (and dependent libraries) are being compiled for a Multitech Conduit gateway running mLinux version 4.1.6. I followed the instructions at http://www.multitech.net/developer/software/mlinux/mlinux-building-images/building-a-custom-linux-image/ to download and build the default images for mLinux 4.0 and higher.
I could not find an existing recipe for building qpid-cpp-1.39.0. I was able to find a recipe for qpid_0.20 (http://git.yoctoproject.org/cgit/cgit.cgi/meta-cloud-services/tree/meta-openstack/recipes-extended/qpid/qpid_0.20.bb?h=master) that I attempted to modify to support qpid-cpp-1.39.0. When running bitbake against the recipe, I receive the following output:
pokyuser#8c538668c625:/workdir/mlinux-4.x/build$ bitbake qpid-cpp
NOTE: Started PRServer with DBfile: /workdir/mlinux-4.x/build/cache/prserv.sqlite3, IP: 127.0.0.1, PORT: 41127, PID: 10213
Loading cache: 100% |#############################################################################################################################################| Time: 0:00:01
Loaded 2996 entries from dependency cache.
Parsing recipes: 100% |###########################################################################################################################################| Time: 0:00:01
Parsing of 2213 .bb files complete (2207 cached, 6 parsed). 3001 targets, 177 skipped, 0 masked, 0 errors.
WARNING: No bb files matched BBFILE_PATTERN_user '^/workdir/mlinux-4.x/layers/user-layer/'
NOTE: Resolving any missing task queue dependencies
Build Configuration:
BB_VERSION = "1.32.0"
BUILD_SYS = "x86_64-linux"
NATIVELSBSTRING = "Ubuntu-16.04"
TARGET_SYS = "arm-mlinux-linux-gnueabi"
MACHINE = "mtcdt"
DISTRO = "mlinux"
DISTRO_VERSION = "4.1.7"
TUNE_FEATURES = "arm armv5 thumb dsp arm926ejs"
TARGET_FPU = "soft"
user-layer = "master:c9360c9479287f3ba229c9a37142baa5a22cce67"
meta-mlinux = "HEAD:4a060176a58345749e5907084cf1647f8b8cae23"
meta-multitech = "HEAD:55db4fd0bb04ccaedb10de0b249151a663b0d916"
meta-mono = "HEAD:b8e5da7138c61fb9ade87712a2fc28dc6283ab25"
meta-nodejs = "HEAD:78018dc7dc02b5039a165801d09c00564687a1b6"
meta-java = "HEAD:a265b31ec7d022be254abdf959360a7624208585"
meta-oe
meta-ruby
meta-perl
meta-python
meta-networking
meta-webserver
meta-multimedia
meta-filesystems = "HEAD:fe5c83312de11e80b85680ef237f8acb04b4b26e"
meta = "HEAD:ddf907ca95a19f54785079b4396935273b3747f6"
meta-jansson
meta-paho-mqtt
meta-clearblade-sdk
meta-qpid-cpp = "master:c9360c9479287f3ba229c9a37142baa5a22cce67"
Initialising tasks: 100% |########################################################################################################################################| Time: 0:00:02
NOTE: Executing SetScene Tasks
NOTE: Executing RunQueue Tasks
ERROR: qpid-cpp-1.39.0-r0 do_configure: Function failed: do_configure (log file is located at /workdir/mlinux-4.x/build/tmp/work/arm926ejste-mlinux-linux-gnueabi/qpid-cpp/1.39.0-r0/temp/log.do_configure.10486)
ERROR: Logfile of failure stored in: /workdir/mlinux-4.x/build/tmp/work/arm926ejste-mlinux-linux-gnueabi/qpid-cpp/1.39.0-r0/temp/log.do_configure.10486
Log data follows:
| DEBUG: Executing python function sysroot_cleansstate
| DEBUG: Python function sysroot_cleansstate finished
| DEBUG: Executing shell function do_configure
| -- The C compiler identification is GNU 6.2.0
| -- The CXX compiler identification is GNU 6.2.0
| -- Check for working C compiler: /workdir/mlinux-4.x/build/tmp/sysroots/x86_64-linux/usr/bin/arm-mlinux-linux-gnueabi/arm-mlinux-linux-gnueabi-gcc
| -- Check for working C compiler: /workdir/mlinux-4.x/build/tmp/sysroots/x86_64-linux/usr/bin/arm-mlinux-linux-gnueabi/arm-mlinux-linux-gnueabi-gcc -- works
| -- Detecting C compiler ABI info
| -- Detecting C compiler ABI info - done
| -- Detecting C compile features
| -- Detecting C compile features - done
| -- Check for working CXX compiler: /workdir/mlinux-4.x/build/tmp/sysroots/x86_64-linux/usr/bin/arm-mlinux-linux-gnueabi/arm-mlinux-linux-gnueabi-g++
| -- Check for working CXX compiler: /workdir/mlinux-4.x/build/tmp/sysroots/x86_64-linux/usr/bin/arm-mlinux-linux-gnueabi/arm-mlinux-linux-gnueabi-g++ -- works
| -- Detecting CXX compiler ABI info
| -- Detecting CXX compiler ABI info - done
| -- Detecting CXX compile features
| -- Detecting CXX compile features - done
| -- Build type is "RelWithDebInfo" (has debug symbols)
| -- Found PythonInterp: /usr/bin/python2.7 (found suitable version "2.7.12", minimum required is "2.7")
| -- Found PythonInterp: /usr/bin/python2.7 (found version "2.7.12")
| -- Found PkgConfig: /workdir/mlinux-4.x/build/tmp/sysroots/x86_64-linux/usr/bin/pkg-config (found version "0.29.1")
| -- Found Ruby: /workdir/mlinux-4.x/build/tmp/sysroots/x86_64-linux/usr/bin/ruby (found version "2.2.0")
| -- Could NOT find Doxygen (missing: DOXYGEN_EXECUTABLE)
| -- Could NOT find VALGRIND (missing: VALGRIND_EXECUTABLE)
| -- Found CyrusSASL: /workdir/mlinux-4.x/build/tmp/sysroots/mtcdt/usr/lib/libsasl2.so
| CMake Error at src/CMakeLists.txt:84 (message):
| Can't find amqp 0-10 spec for framing code generation
|
|
| -- Configuring incomplete, errors occurred!
| See also "/workdir/mlinux-4.x/build/tmp/work/arm926ejste-mlinux-linux-gnueabi/qpid-cpp/1.39.0-r0/build/CMakeFiles/CMakeOutput.log".
| WARNING: exit code 1 from a shell command.
| ERROR: Function failed: do_configure (log file is located at /workdir/mlinux-4.x/build/tmp/work/arm926ejste-mlinux-linux-gnueabi/qpid-cpp/1.39.0-r0/temp/log.do_configure.10486)
ERROR: Task (/workdir/mlinux-4.x/layers/meta-qpid-cpp/recipes-qpid-cpp/qpid-cpp/qpid-cpp_1.39.0.bb:do_configure) failed with exit code '1'
NOTE: Tasks Summary: Attempted 1273 tasks of which 1267 didn't need to be rerun and 1 failed.
Summary: 1 task failed:
/workdir/mlinux-4.x/layers/meta-qpid-cpp/recipes-qpid-cpp/qpid-cpp/qpid-cpp_1.39.0.bb:do_configure
Summary: There was 1 WARNING message shown.
Summary: There was 1 ERROR message shown, returning a non-zero exit code.
The recipe I used is as follows:
DESCRIPTION = "AMQP message brokers"
HOMEPAGE = "http://qpid.apache.org/"
LICENSE = "Apache-2.0"
LIC_FILES_CHKSUM = "file://LICENSE.txt;md5=b1e01b26bacfc2232046c90a330332b3"
SECTION = "mq"
DEPENDS = "boost perl-native python util-linux cyrus-sasl"
RDEPENDS_${PN} = "cyrus-sasl-bin"
SRC_URI = "git://github.com/apache/qpid-cpp;protocol=https"
SRCREV = "0f5d21861f6935ed2e4eb6e21f1d3cef19e22aa5"
S = "${WORKDIR}/git"
#S = "${WORKDIR}/${PN}-${PV}"
inherit cmake python-dir perlnative cpan-base update-rc.d pkgconfig ruby
OECMAKE_FIND_ROOT_PATH_MODE_PROGRAM = "BOTH"
# Env var which tells perl if it should use host (no) or target (yes) settings
export PERLCONFIGTARGET = "${@is_target(d)}"
export PERL_INC = "${STAGING_LIBDIR}${PERL_OWN_DIR}/perl/${@get_perl_version(d)}/CORE"
export PERL_LIB = "${STAGING_LIBDIR}${PERL_OWN_DIR}/perl/${@get_perl_version(d)}"
export PERL_ARCHLIB = "${STAGING_LIBDIR}${PERL_OWN_DIR}/perl/${@get_perl_version(d)}"
export PERL="${STAGING_BINDIR}/perl"
EXTRA_OECONF += " --without-help2man SASL_PASSWD=/usr/sbin/saslpasswd2"
EXTRA_OEMAKE += " CPPFLAGS=-Wno-unused-function \
pyexecdir=${PYTHON_SITEPACKAGES_DIR} \
pythondir=${PYTHON_SITEPACKAGES_DIR} \
"
INITSCRIPT_NAME = "qpidd"
INITSCRIPT_PARAMS = "defaults"
Given that I started learning Yocto/Bitbake just three days ago, I'm hoping that someone out there can help me determine what the problem is so that I can compile the Apache qpid-cpp-1.39.0 library for mLinux.
According to the find_file documentation:
The CMake variable CMAKE_FIND_ROOT_PATH specifies one or more
directories to be prepended to all other search directories. This
effectively “re-roots” the entire search under given locations. Paths
which are descendants of the CMAKE_STAGING_PREFIX are excluded from
this re-rooting, because that variable is always a path on the host
system. By default the CMAKE_FIND_ROOT_PATH is empty.
And since CMake invoked from a BitBake environment is effectively cross-compiling, this re-rooting should be ignored for these in-source lookups as well (which is what NO_CMAKE_FIND_ROOT_PATH does). I added a patch, files/0001-Qpid-cross-compile.patch, which worked for me:
diff --git a/src/CMakeLists.txt b/src/CMakeLists.txt
index 82141efdb..3ba403a32 100644
--- a/src/CMakeLists.txt
+++ b/src/CMakeLists.txt
@@ -78,7 +78,7 @@ endif (NOT CMAKE_SYSTEM_NAME STREQUAL Windows AND BUILD_TESTING)
# rubygen subdir is excluded from stable distributions
# If the main AMQP spec is present, then check if ruby and python are
# present, and if any sources have changed, forcing a re-gen of source code.
-find_file(QPID_AMQP_SPEC NAMES amqp.0-10-qpid-errata.stripped.xml PATHS ${qpid-cpp_SOURCE_DIR}/specs ${qpid-cpp_SOURCE_DIR}/../specs NO_DEFAULT_PATH)
+find_file(QPID_AMQP_SPEC NAMES amqp.0-10-qpid-errata.stripped.xml PATHS ${qpid-cpp_SOURCE_DIR}/specs ${qpid-cpp_SOURCE_DIR}/../specs NO_DEFAULT_PATH NO_CMAKE_FIND_ROOT_PATH)
mark_as_advanced(QPID_AMQP_SPEC)
if (NOT QPID_AMQP_SPEC)
message(FATAL_ERROR "Can't find amqp 0-10 spec for framing code generation")
@@ -106,7 +106,7 @@ else (regen_amqp)
message(STATUS "No need to generate AMQP protocol sources")
endif (regen_amqp)
-find_file(QPID_BROKER_MANAGEMENT_SPEC NAMES management-schema.xml PATHS ${CMAKE_CURRENT_SOURCE_DIR}/qpid/broker ${qpid-cpp_SOURCE_DIR}/../specs NO_DEFAULT_PATH)
+find_file(QPID_BROKER_MANAGEMENT_SPEC NAMES management-schema.xml PATHS ${CMAKE_CURRENT_SOURCE_DIR}/qpid/broker ${qpid-cpp_SOURCE_DIR}/../specs NO_DEFAULT_PATH NO_CMAKE_FIND_ROOT_PATH)
mark_as_advanced(QPID_BROKER_MANAGEMENT_SPEC)
if (NOT QPID_BROKER_MANAGEMENT_SPEC)
message(FATAL_ERROR "Can't find broker management spec for code generation")
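For the patch to be applied by the recipe, it also needs to be listed in SRC_URI; a sketch of the addition to qpid-cpp_1.39.0.bb (a files/ directory next to the recipe is normally already on BitBake's default file search path):
# qpid-cpp_1.39.0.bb -- reference the patch so do_patch applies it on top of the git checkout
SRC_URI += "file://0001-Qpid-cross-compile.patch"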
