cocoapods dylib dependency use_frameworks - sqlite

I have built a dynamic library (to add ICU support, in this case) which I need to add as a dependency to a pod. For that I created a pod with the following podspec (I removed things like authors, license, etc. to keep it short):
Pod::Spec.new do |s|
  s.name = 'unicode'
  s.version = '57.0'
  s.source = { :git => "git@bitbucket.org:mycompany/unicode.git", :tag => "#{s.version}" }
  s.requires_arc = false
  s.platform = :ios, '8.0'
  s.default_subspecs = 'all'

  s.subspec 'all' do |ss|
    ss.header_mappings_dir = 'icu4c/include'
    ss.source_files = 'icu4c/include/**/*.h'
    ss.public_header_files = 'icu4c/include/**/*.h'
    ss.vendored_libraries = 'Frameworks/lib*.dylib'
  end
end
Here I have a second pod where I need to link these libraries too:
Pod::Spec.new do |s|
  s.name = 'sqlite3'
  s.version = '3.14.2'
  s.summary = 'SQLite is an embedded SQL database engine'
  s.documentation_url = 'https://sqlite.org/docs.html'
  s.homepage = 'https://github.com/clemensg/sqlite3pod'
  s.authors = { 'Clemens Gruber' => 'clemensgru@gmail.com' }

  v = s.version.to_s.split('.')
  archive_name = "sqlite-amalgamation-"+v[0]+v[1].rjust(2, '0')+v[2].rjust(2, '0')+"00"
  #s.source = { :http => "https://www.sqlite.org/#{Time.now.year}/#{archive_name}.zip" }
  s.source = { :git => "git@bitbucket.org:wrthphoenixspeedy/sqlite3.git", :tag => "#{s.version}" }

  s.requires_arc = false
  s.platform = :ios, '8.0'
  s.default_subspecs = 'common'

  s.subspec 'common' do |ss|
    ss.source_files = "#{archive_name}/sqlite*.{h,c}"
    ss.osx.pod_target_xcconfig = { 'OTHER_CFLAGS' => '$(inherited) -DHAVE_USLEEP=1' }
    # Disable OS X / AFP locking code on mobile platforms (iOS, tvOS, watchOS)
    sqlite_xcconfig_ios = { 'OTHER_CFLAGS' => '$(inherited) -DHAVE_USLEEP=1 -DSQLITE_ENABLE_LOCKING_STYLE=0' }
    ss.ios.pod_target_xcconfig = sqlite_xcconfig_ios
    ss.tvos.pod_target_xcconfig = sqlite_xcconfig_ios
    ss.watchos.pod_target_xcconfig = sqlite_xcconfig_ios
  end

  # enable support for ICU - International Components for Unicode
  s.subspec 'icu' do |ss|
    ss.dependency 'sqlite3/common'
    ss.pod_target_xcconfig = { 'OTHER_CFLAGS' => '$(inherited) -DSQLITE_ENABLE_ICU=1' }
    ss.dependency 'unicode', '57.0'
    ss.libraries = 'icucore', 'icudata.57.1', 'icui18n.57.1', 'icuio.57.1', 'icule.57.1', 'iculx.57.1', 'icutu.57.1', 'icuuc.57.1'
  end
end
And with these I am able to compile it. CocoaPods copies the libraries into the ../Frameworks/ folder at build time, but at run time the app fails because it looks for them in ../lib:
dyld: Library not loaded: ../lib/libicudata.57.1.dylib
Referenced from: /var/containers/Bundle/Application/9663CB3A-6ACD-487E-A92D-48F8AFE5260C/MyApp.app/MyApp
Reason: image not found
I have to use use_frameworks! because I am using some Swift frameworks too.
So I am doing something wrong... the question is: can I link a dylib from one pod to another pod? And if so, how?

Based on the disparity between "lib" and "Frameworks", this looks like an issue either with the runpath search paths (the running app is not looking for the library in Frameworks), or with the install name of the library not matching the location where it is placed relative to where it is dynamically loaded from.
Make sure that in the app that bundles the dynamic library you have the following paths included in "Runpath Search Paths": @executable_path/../Frameworks, @loader_path/../Frameworks
Make sure that the "Dynamic Library Install Name" of the library being loaded is set to the equivalent of @rpath/$(EXECUTABLE_PATH) (i.e. in your case it should be "@rpath/libicudata.57.1.dylib"). You can set it at build time using the -install_name linker flag, or afterwards with install_name_tool, like so: install_name_tool -id "@rpath/libicudata.57.1.dylib" libicudata.57.1.dylib. Hopefully it doesn't come to this, though.
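To check what is actually baked into the binaries, otool can help (a quick sketch; the paths are examples based on this question):
# print the install name embedded in the dylib
otool -D Frameworks/libicudata.57.1.dylib
# list the rpath entries (LC_RPATH load commands) of the app binary
otool -l MyApp.app/MyApp | grep -A2 LC_RPATH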

Related

How to use rules_webtesting?

I want to use https://github.com/bazelbuild/rules_webtesting. I am using Bazel 5.2.0.
The whole project can be found here.
My WORKSPACE.bazel file looks like this:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "io_bazel_rules_webtesting",
    sha256 = "3ef3bb22852546693c94e9b0b02c2570e74abab6f800fd58e0cbe79492e49c1b",
    urls = [
        "https://github.com/bazelbuild/rules_webtesting/archive/581b1557e382f93419da6a03b91a45c2ac9a9ec8/rules_webtesting.tar.gz",
    ],
)

load("@io_bazel_rules_webtesting//web:repositories.bzl", "web_test_repositories")

web_test_repositories()
My BUILD.bazel file looks like this:
load("@io_bazel_rules_webtesting//web:py.bzl", "py_web_test_suite")

py_web_test_suite(
    name = "browser_test",
    srcs = ["browser_test.py"],
    browsers = [
        "@io_bazel_rules_webtesting//browsers:chromium-local",
    ],
    local = True,
    deps = ["@io_bazel_rules_webtesting//testing/web"],
)
browser_test.py looks like this:
import unittest

from testing.web import webtest


class BrowserTest(unittest.TestCase):

    def setUp(self):
        self.driver = webtest.new_webdriver_session()

    def tearDown(self):
        try:
            self.driver.quit()
        finally:
            self.driver = None

    # Your tests here


if __name__ == "__main__":
    unittest.main()
When I try to do a bazel build //... I get (under Ubuntu 20.04 and macOS):
INFO: Invocation ID: 74c03efd-9caa-4174-9fda-42f7ff37e38b
ERROR: error loading package '': Every .bzl file must have a corresponding package, but '@io_bazel_rules_webtesting//web:repositories.bzl' does not have one. Please create a BUILD file in the same or any parent directory. Note that this BUILD file does not need to do anything except exist.
INFO: Elapsed time: 0.038s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
The error message does not make sense to me, since there is a BUILD file in
https://github.com/bazelbuild/rules_webtesting/blob/581b1557e382f93419da6a03b91a45c2ac9a9ec8/BUILD.bazel
and https://github.com/bazelbuild/rules_webtesting/blob/581b1557e382f93419da6a03b91a45c2ac9a9ec8/web/BUILD.bazel.
I also tried a different version of Bazel - but with the same result.
Any ideas on how to get this working?
You need to add strip_prefix = "rules_webtesting-581b1557e382f93419da6a03b91a45c2ac9a9ec8" to your http_archive call.
For debugging, you can look in the folder where Bazel extracts it: bazel-out/../../../external/io_bazel_rules_webtesting. @io_bazel_rules_webtesting//web translates to bazel-out/../../../external/io_bazel_rules_webtesting/web, so if that folder doesn't exist things won't work.
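For reference, this is the http_archive call from the question with only strip_prefix added, everything else unchanged:
http_archive(
    name = "io_bazel_rules_webtesting",
    sha256 = "3ef3bb22852546693c94e9b0b02c2570e74abab6f800fd58e0cbe79492e49c1b",
    strip_prefix = "rules_webtesting-581b1557e382f93419da6a03b91a45c2ac9a9ec8",
    urls = [
        "https://github.com/bazelbuild/rules_webtesting/archive/581b1557e382f93419da6a03b91a45c2ac9a9ec8/rules_webtesting.tar.gz",
    ],
)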

nix-shell script does nothing when using script

I'm quite new to Nix and I'm trying to create a very simple shell.nix script file.
Unfortunately I need an old package: mariadb-10.4.21. After reading and searching a bit I found out that version 10.4.17 is in the nixos-20.09 channel (it would have been nice to have the exact version, but I couldn't find it), but when I do
$ nix-shell --version
nix-shell (Nix) 2.5.1
$ cat shell.nix
let
  pkgs = import <nixpkgs> {};
  # git ls-remote https://github.com/nixos/nixpkgs nixos-20.09
  pkgs-20_09 = import (builtins.fetchGit {
    name = "nixpks-20.09";
    url = "https://github.com/nixos/nixpkgs";
    ref = "refs/heads/nixos-20.09";
    rev = "1c1f5649bb9c1b0d98637c8c365228f57126f361";
  }) {};
in
pkgs.stdenv.mkDerivation {
  pname = "test";
  version = "0.1.0";
  buildInputs = [
    pkgs-20_09.mariadb
  ];
}
$ nix-shell
it just waits indefinitely without doing anything. But if I do
$ nix-shell -p mariadb -I nixpkgs=https://github.com/NixOS/nixpkgs/archive/1c1f5649bb9c1b0d98637c8c365228f57126f361.tar.gz
[...]
/nix/store/yias2v8pm9pvfk79m65wdpcby4kiy91l-mariadb-server-10.4.17
[...]
copying path '/nix/store/yias2v8pm9pvfk79m65wdpcby4kiy91l-mariadb-server-10.4.17' from 'https://cache.nixos.org'...
[nix-shell:~/Playground]$ mariadb --version
mariadb Ver 15.1 Distrib 10.4.17-MariaDB, for Linux (x86_64) using readline 5.1
it works perfectly.
What am I doing wrong in the script that makes it hang?
EDIT: I got a bit more info by running
$ nix-shell -vvv
[...]
did not find cache entry for '{"name":"nixpks-20.09","rev":"1c1f5649bb9c1b0d98637c8c365228f57126f361","type":"git"}'
did not find cache entry for '{"name":"nixpks-20.09","ref":"refs/heads/nixos-20.09","type":"git","url":"https://github.com/nixos/nixpkgs"}'
locking path '/home/test/.cache/nix/gitv3/17blyky0ja542rww32nj04jys1r9vnkg6gcfbj83drca9a862hwp.lock'
lock acquired on '/home/test/.cache/nix/gitv3/17blyky0ja542rww32nj04jys1r9vnkg6gcfbj83drca9a862hwp.lock.lock'
fetching Git repository 'https://github.com/nixos/nixpkgs'...
Is it me, or does it seem like it's trying to fetch from two different sources? As far as I understood, all three of url, rev and ref are needed for git fetching, but it looks as if it's splitting them.
EDIT2: I've been trying with fetchFromGitHub
pkgs-20_09 = import (pkgs.fetchFromGitHub {
  name = "nixpks-20.09";
  owner = "nixos";
  repo = "nixpkgs";
  rev = "1c1f5649bb9c1b0d98637c8c365228f57126f361";
  sha256 = "0f2nvdijyxfgl5kwyb4465pppd5vkhqxddx6v40k2s0z9jfhj0xl";
}) {};
and fetchTarball
pkgs-20_09 = import (builtins.fetchTarball "https://github.com/NixOS/nixpkgs/archive/1c1f5649bb9c1b0d98637c8c365228f57126f361.tar.gz") {};
and both work just fine. I'll use fetchFromGitHub from now on, but it'd be interesting to know why fetchGit doesn't work.
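For reference, the complete shell.nix with the fetchTarball variant that works, assembled from the snippets above (a sketch; same pin and package as in the question):
let
  pkgs = import <nixpkgs> {};
  pkgs-20_09 = import (builtins.fetchTarball
    "https://github.com/NixOS/nixpkgs/archive/1c1f5649bb9c1b0d98637c8c365228f57126f361.tar.gz") {};
in
pkgs.stdenv.mkDerivation {
  pname = "test";
  version = "0.1.0";
  buildInputs = [ pkgs-20_09.mariadb ];
}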

Yocto recipe pyinstaller

I'm trying to build a pyinstaller recipe. I used pipoe here, but I got this error when I told it to inherit pypi setuptools. Can anyone help please?
Thank you.
ERROR: ParseError at /home/yasmine/yocto/poky/meta-pyinst/recipes-pyinstaller/pyinstaller/python-altgraph_0.17.bb:16: Could not inherit file classes/setuptools.bbclass
First, it is good practice to use pipoe to create python recipes automatically.
Check my response here on how to use it.
I used it to create the pyinstaller recipe; it detected that pyinstaller depends, at run time (RDEPENDS), on:
python3-altgraph
python3-pyinstaller-hooks-contrib
So, here are the recipes:
python3-pyinstaller_4.5.1.bb
SUMMARY = "PyInstaller bundles a Python application and all its dependencies into a single package."
HOMEPAGE = "http://www.pyinstaller.org/"
AUTHOR = "Hartmut Goebel, Giovanni Bajo, David Vierra, David Cortesi, Martin Zibricky <>"
LICENSE = "CLOSED"
SRC_URI = "https://files.pythonhosted.org/packages/a9/d9/9fdfb0ac2354d059e466d562689dbe53a23c4062019da2057f0eaed635e0/pyinstaller-4.5.1.tar.gz"
SRC_URI[md5sum] = "cd1fab890e538ed62ac9121e043632e3"
SRC_URI[sha256sum] = "30733baaf8971902286a0ddf77e5499ac5f7bf8e7c39163e83d4f8c696ef265e"
S = "${WORKDIR}/pyinstaller-4.5.1"
RDEPENDS_${PN} = "python3-setuptools python3-altgraph python3-pyinstaller-hooks-contrib"
DEPENDS += "python3-wheel python3-wheel-native"
inherit setuptools3
python3-pyinstaller-hooks-contrib_2021.2.bb
SUMMARY = "Community maintained hooks for PyInstaller"
HOMEPAGE = "https://github.com/pyinstaller/pyinstaller-hooks-contrib"
AUTHOR = " <>"
LICENSE = "CLOSED"
LIC_FILES_CHKSUM = "file://LICENSE;md5=822bee463f4e00ac4478593130e95ccb"
SRC_URI = "https://files.pythonhosted.org/packages/eb/fa/fe062e44776ab8edb4ac62daca1a02bb744ebdd556ec7a75c19c717e80b4/pyinstaller-hooks-contrib-2021.2.tar.gz"
SRC_URI[md5sum] = "322f5534dd0df2d3fbb8fd55ec7cddbf"
SRC_URI[sha256sum] = "7f5d0689b30da3092149fc536a835a94045ac8c9f0e6dfb23ac171890f5ea8f2"
S = "${WORKDIR}/pyinstaller-hooks-contrib-2021.2"
RDEPENDS_${PN} = ""
inherit setuptools3
python3-altgraph_0.17.bb
SUMMARY = "Python graph (network) package"
HOMEPAGE = "https://altgraph.readthedocs.io"
AUTHOR = "Ronald Oussoren <ronaldoussoren@mac.com>"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://LICENSE;md5=3590eb8d695bdcea3ba57e74adf8a4ed"
SRC_URI = "https://files.pythonhosted.org/packages/22/5a/ac50b52581bbf0d8f6fd50ad77d20faac19a2263b43c60e7f3af8d1ec880/altgraph-0.17.tar.gz"
SRC_URI[md5sum] = "9450020282270749db205038b8c90b55"
SRC_URI[sha256sum] = "1f05a47122542f97028caf78775a095fbe6a2699b5089de8477eb583167d69aa"
S = "${WORKDIR}/altgraph-0.17"
RDEPENDS_${PN} = ""
inherit setuptools3
If you have a custom layer, you can create:
meta-custom/recipes-python/pyinstaller
and put all three recipes inside that.
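For example, the layout inside the layer would then look roughly like this (file names taken from the recipes above):
meta-custom/
└── recipes-python/
    └── pyinstaller/
        ├── python3-pyinstaller_4.5.1.bb
        ├── python3-pyinstaller-hooks-contrib_2021.2.bb
        └── python3-altgraph_0.17.bb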
Now, just add python3-pyinstaller to IMAGE_INSTALL:
IMAGE_INSTALL_append = " python3-pyinstaller"
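If the custom layer is not already part of your build, remember to register it as well, e.g. with bitbake-layers (a sketch; the relative path is just an example and depends on where meta-custom lives):
# run from your build directory
bitbake-layers add-layer ../meta-custom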
It could be that your setup only has python3, so you either change the inherit from setuptools to setuptools3, or you tell pipoe to use python3 by typing:
pipoe --python python3 --package pyinstaller
If you then read the generated files, you will see that they inherit setuptools3.

SBT terminates silently when Git repository for build plugin can't be downloaded

SBT is silently failing when it can't download a plugin via SSH from a Git repository.
This is the output of SBT when it's trying to download the repository:
[info] Updating ProjectRef(uri("ssh://git@repository.com/plugin.git"), "plugin")...
# (nothing after that line)
And it just terminates after that with no explanation. This is very likely a bug with SBT's downloading of plugins via SSH from a Git repository.
When downloading the plugin succeeds, this line is printed:
[info] Done updating.
So for some reason, SBT isn't stating what's wrong, even when executed like this:
sbt -Xdebug test
Here are the relevant configuration files:
# project/build.properties
sbt.version=1.1.5
# project/plugins.sbt
lazy val buildPlugin = RootProject(uri("ssh://git@repository.com/plugin.git"))
lazy val root = (project in file(".")).dependsOn(buildPlugin)
Questions:
1. How can I get SBT to print more debugging information?
2. Where in the SBT code could I fix this bug?
3. How can I build and use my own version of SBT?
How can I get SBT to print more debugging information?
Using the latest launching script available from https://www.scala-sbt.org/download.html (1.2.1 as of August, 2018), you can run:
$ sbt -debug
Where in the SBT code could I fix this bug?
See my answer here https://github.com/sbt/sbt/issues/1120#issuecomment-415553592:
Here is some of the relevant code:
Load.builtinLoader - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/internal/Load.scala#L480-L488
RetrieveUnit - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/internal/RetrieveUnit.scala
Resolvers.git - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/Resolvers.scala#L82-L101
Resolvers.creates - https://github.com/sbt/sbt/blob/v1.2.1/main/src/main/scala/sbt/Resolvers.scala#L145-L155
val git: Resolver = (info: ResolveInfo) => {
  val uri = info.uri.withoutMarkerScheme
  val localCopy = uniqueSubdirectoryFor(uri.copy(scheme = "git"), in = info.staging)
  val from = uri.withoutFragment.toASCIIString

  if (uri.hasFragment) {
    val branch = uri.getFragment
    Some { () =>
      creates(localCopy) {
        run("git", "clone", from, localCopy.getAbsolutePath)
        run(Some(localCopy), "git", "checkout", "-q", branch)
      }
    }
  } else
    Some { () =>
      creates(localCopy) {
        run("git", "clone", "--depth", "1", from, localCopy.getAbsolutePath)
      }
    }
}

....

def creates(file: File)(f: => Unit) = {
  if (!file.exists)
    try {
      f
    } catch {
      case NonFatal(e) =>
        IO.delete(file)
        throw e
    }
  file
}
How can I build and use my own version of SBT?
https://github.com/sbt/sbt/blob/1.x/CONTRIBUTING.md#build-from-source
For this, you just need to clone sbt/sbt and run publishLocal.
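Once publishLocal has finished, you can point a throwaway test project at the locally built version via its project/build.properties; a sketch, assuming your local build published a snapshot version:
# project/build.properties of the test project
# the version string below is hypothetical; use whatever your publishLocal actually produced
sbt.version=1.2.2-SNAPSHOT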

SBT: How to Dockerize a fat jar?

I'm building a Docker image with a fat jar. I use the sbt-assembly plugin to build the jar, and the sbt-native-packager to build the Docker image. I'm not very familiar with SBT and am running into the following issues.
I'd like to declare a dependency on the assembly task from the docker:publish task, such that the fat jar is created before it is added to the image. I did as instructed in the doc, but it's not working: assembly doesn't run unless I invoke it explicitly.
publish := (publish dependsOn assembly).value
One of the steps in building the image is copying the fat jar. Since the assembly plugin creates the jar at target/scala_whatever/projectname-assembly-X.X.X.jar, I need to know the exact scala_whatever and the jar name. assembly seems to have a key, assemblyJarName, but I'm not sure how to access it. I tried the following, which fails:
Cmd("COPY", "target/scala*/*.jar /app.jar")
Help!
Answering my own questions, the following works:
enablePlugins(JavaAppPackaging, DockerPlugin)

assemblyMergeStrategy in assembly := {
  case x => {
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    val strategy = oldStrategy(x)
    if (strategy == MergeStrategy.deduplicate)
      MergeStrategy.first
    else strategy
  }
}

// Remove all jar mappings in universal and append the fat jar
mappings in Universal := {
  val universalMappings = (mappings in Universal).value
  val fatJar = (assembly in Compile).value
  val filtered = universalMappings.filter {
    case (file, name) => !name.endsWith(".jar")
  }
  filtered :+ (fatJar -> ("lib/" + fatJar.getName))
}

dockerRepository := Some("username")

import com.typesafe.sbt.packager.docker.{Cmd, ExecCmd}

dockerCommands := Seq(
  Cmd("FROM", "username/spark:2.1.0"),
  Cmd("WORKDIR", "/"),
  Cmd("COPY", "opt/docker/lib/*.jar", "/app.jar"),
  ExecCmd("ENTRYPOINT", "/opt/spark/bin/spark-submit", "/app.jar")
)
I completely overwrite the Docker commands because the defaults add a couple of scripts that I don't need, since I overwrite the entrypoint as well. Also, the default workdir is /opt/docker, which is not where I want to put the fat jar.
Note that the default commands can be shown by running show dockerCommands in the sbt console.
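Regarding the second question, if you ever do need the exact jar name or path from the build itself, the sbt-assembly keys can be read from another task. A sketch, using the same old-style "in" syntax as the build above (the key names are sbt-assembly's; the task name is made up):
lazy val printFatJar = taskKey[Unit]("Print the fat jar name and location")

printFatJar := {
  // e.g. projectname-assembly-X.X.X.jar
  val jarName = (assemblyJarName in assembly).value
  // full path of the generated jar under target/scala-<version>/
  val jarPath = (assemblyOutputPath in assembly).value
  println(s"$jarName -> $jarPath")
}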
