I'm trying to combine 4 projects into one JAR like this:
jar {
    from {
        project(":p1").sourceSets.main.output.classesDir
        project(":p2").sourceSets.main.output.classesDir
        project(":p3").sourceSets.main.output.classesDir
    }
}
It sort of works; there are parts from each of the three projects there, but incomplete. Whenever there's a common package directory like p1/mypackage and p2/mypackage, Gradle fails to merge them and (I think) takes the last one. So instead of combining
p1
  mypackage
    MyFirst.class
p2
  mypackage
    MySecond.class
into
mypackage
  MyFirst.class
  MySecond.class
I get only one class. There's no warning. Is this expected or a bug (I hope so)? Can I avoid it somehow?
As stated in the answer, I was doing it all wrong. With
jar {
    from { [
        project(":p1").sourceSets.main.output.classesDir,
        project(":p2").sourceSets.main.output.classesDir,
        project(":p3").sourceSets.main.output.classesDir
    ] }
}
it seems to work.
But this is better:
jar {
    from { [":p1", ":p2", ":p3"].collect { project(it).sourceSets.main.output.classesDir } }
}
Only the return value of the closure matters, so the first two lines are no-ops. Also, the necessary task dependencies need to be established. Try:
jar {
    from { subprojects.sourceSets.main.output }
}
(SourceSetOutput is Buildable, which means that Gradle can infer the task dependencies automatically.)
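If you only want those three subprojects rather than every subproject, the same idea combines with the collect approach above; a minimal sketch using the Buildable output instead of classesDir:
jar {
    // output (unlike classesDir) carries task dependency information
    from { [":p1", ":p2", ":p3"].collect { project(it).sourceSets.main.output } }
}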
When run, the XrayImportBuilder step prints a lot of useful information to the log, but I can't see any simple way of getting at this information so it can be used from the Jenkinsfile script code. Specifically, this appears in the log:
XRAY_TEST_EXECS: ENT-8327
and I was hoping to add this info to the current build description. Ideally the info would be returned from the call, but the result is empty. Alternatives might be to scan the log, or to use a curl call and handle all the output myself, but the latter feels like a backwards step.
I was successful in extracting that information from the generated logs.
After the Xray import results stage I added:
stage('Extract Variable from log') {
    steps {
        script {
            def logContent = Jenkins.getInstance().getItemByFullName(env.JOB_NAME).getBuildByNumber(Integer.parseInt(env.BUILD_NUMBER)).logFile.text
            env.testExecs = (logContent =~ /XRAY_TEST_EXECS:.*/).findAll().first()
            echo testExecs
        }
    }
}
stage('Using variable from another stage') {
    steps {
        script {
            echo "${env.testExecs}"
        }
    }
}
You can change the regex used to your specific case. I've added the extracted value to an environment variable so that it can be used in other stages.
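Since the original goal was to show the value in the build description, you can then set it from a later stage; a minimal sketch (the description text is illustrative):
stage('Set build description') {
    steps {
        script {
            // currentBuild.description appears on the build's status page
            currentBuild.description = "Xray test executions: ${env.testExecs}"
        }
    }
}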
In Qt Creator 4.2.0 I am trying to use one *.pro file for building binaries for multiple hardware configurations.
In Build & Run => Build Settings => Build Environment I define the environment variable TARGET as follows:
Build Settings A: Variable TARGET has Value bbb
Build Settings B: Variable TARGET has Value desktop
In the pro-file I use the following test functions:
equals($$TARGET,"bbb")
{
    message("setting include paths for bbb")
    message($$TARGET)
}
equals($$TARGET,"laptop")
{
    message("setting include paths for laptop.")
    message($$TARGET)
}
contains($$TARGET,"*bbb*")
{
    message("setting include paths for bbb")
    message($$TARGET)
}
contains($$TARGET,"*laptop*")
{
    message("setting include paths for laptop.")
    message($$TARGET)
}
And I get this output when running qmake:
Project MESSAGE: setting include paths for bbb
Project MESSAGE: bbb
Project MESSAGE: setting include paths for laptop.
Project MESSAGE: bbb
Project MESSAGE: setting include paths for bbb
Project MESSAGE: bbb
Project MESSAGE: setting include paths for laptop.
Project MESSAGE: bbb
Project MESSAGE: setting include paths for bbb
This makes no sense to me and I can't figure out what I'm doing wrong here. Why are the parts after testing for laptop executed?
By the way, I solved my problem by using scopes. This works perfectly for me:
CONFIG += $$(TARGET_HW)

desktop {
    message("setting include paths for laptop.")
}
cetec {
    message("setting include paths for cetec.")
}
But I'm still interested in the correct way of using test functions.
I provide the correct syntax for the first test, as an example:
equals(TARGET,"bbb") {
    message("setting include paths for bbb")
    message($$TARGET)
}
Please notice:
The opening brace must be written on the same line as the condition (https://doc.qt.io/qt-5/qmake-language.html#scope-syntax).
The variable tested has no dollar signs, just the variable name.
There are enough issues in your question that there is room for another answer, adding to the previous correct ones:
as @daru says, you need to open the brace on the same line as the test function.
as @p-a-o-l-o says, contains and equals require variable names as their first arguments, without $$.
TARGET is an internal qmake variable that by default contains the base name of the project file; it becomes the name of the executable or library you are building.
You may use an environment variable named TARGET, but then you should assign it to a qmake variable with a different name.
sample code:
TGT = $$(TARGET)

equals(TGT,"bbb") {
    message("$$TGT equals bbb")
    message(TGT=$$TGT)
}
equals(TGT,"laptop") {
    message("$$TGT equals laptop")
    message(TGT=$$TGT)
}
contains(TGT,"bbb") {
    message("$$TGT contains bbb")
    message(TGT=$$TGT)
}
contains(TGT,"top") {
    message("$$TGT contains top")
    message(TGT=$$TGT)
}
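With the environment variable TARGET set to bbb, only the matching branches fire, so qmake's output should look something like this:
Project MESSAGE: bbb equals bbb
Project MESSAGE: TGT=bbb
Project MESSAGE: bbb contains bbb
Project MESSAGE: TGT=bbb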
I'm writing an OpenEmbedded/BitBake recipe for OpenEmbedded-Classic. My recipe RDEPENDS on keyutils, and everything seems to work, except one thing:
I want to append a single line to the /etc/request-key.conf file installed by the keyutils package. So I added the following to my recipe:
pkg_postinst_${PN} () {
    echo 'create ... more stuff ..' >> ${sysconfdir}/request-key.conf
}
However, the intended added line is missing in my resulting image.
My recipe inherits update-rc.d if that makes any difference.
My main question is: how do I debug this? Currently I am constructing an entire rootfs image and then poking around in it to see if the changes show up. Surely there is a better way?
UPDATE:
Changed recipe to:
pkg_postinst_${PN} () {
    echo 'create ... more stuff ...' >> ${D}${sysconfdir}/request-key.conf
}
But still no luck.
As far as I know, postinst runs at rootfs creation time, and only at first boot if it fails at rootfs time.
So there is an easy way to execute something only at first boot: just check for $D, like this:
pkg_postinst_stuff() {
    #!/bin/sh -e
    if [ x"$D" = "x" ]; then
        # do something at first boot here
        :
    else
        exit 1
    fi
}
postinst scripts are run at rootfs time, so ${sysconfdir} is /etc on your host. Use $D${sysconfdir} to write to the file inside the rootfs being generated.
OE-Classic is pretty ancient so you really should update to oe-core.
That said, do postinsts run at first boot? I'm not sure. Also look in the recipe's work directory under temp and read the log and run files to see if there are any clues there.
One more thing: if foo RDEPENDS on bar, that just means "when foo is installed, bar is also installed". I'm not sure it makes assertions about what is installed during the install phase, when your postinst is running.
If using $D doesn't fix the problem, try editing your postinst to copy the existing file you're trying to edit somewhere else, so you can verify that it exists in the first place. It's possible that you're appending to a file that doesn't exist yet, and that the package that installs the file replaces it.
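A hedged sketch of that verification idea (the marker file name is illustrative):
pkg_postinst_${PN} () {
    if [ -f $D${sysconfdir}/request-key.conf ]; then
        echo 'create ... more stuff ...' >> $D${sysconfdir}/request-key.conf
    else
        # leave a marker in the image so you can tell the file
        # was absent when this postinst ran
        touch $D${sysconfdir}/request-key.conf.was-missing
    fi
}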
I'd like the ScalaDoc I generate with sbt to link to external libraries, and in sbt 0.13 we have autoAPIMappings which is supposed to add these links for libraries that declare their apiURL. In practice though, none of the libraries I use provide this in their pom/ivy metadata, and I suspect some of these libraries will never do so.
The apiMappings setting is supposed to help with just that, but it is typed as Map[File, URL] and hence geared towards setting doc urls for unmanaged dependencies. Managed dependencies are declared as instances of sbt.ModuleID and cannot be inserted directly in that map.
Can I somehow populate the apiMappings setting with something that will associate a URL with a managed dependency?
A related question is: does sbt provide an idiomatic way of getting a File from a ModuleID? I guess I could try to evaluate some classpaths and get back Files to try and map them to ModuleIDs but I hope there is something simpler.
Note: this is related to https://stackoverflow.com/questions/18747265/sbt-scaladoc-configuration-for-the-standard-library/18747266, but that question differs by linking to the scaladoc for the standard library, for which there is a well known File scalaInstance.value.libraryJar, which is not the case in this instance.
I managed to get this working for referencing scalaz and play by doing the following:
apiMappings ++= {
  val cp: Seq[Attributed[File]] = (fullClasspath in Compile).value
  def findManagedDependency(organization: String, name: String): File = {
    (for {
      entry <- cp
      module <- entry.get(moduleID.key)
      if module.organization == organization
      if module.name.startsWith(name)
      jarFile = entry.data
    } yield jarFile).head
  }
  Map(
    findManagedDependency("org.scalaz", "scalaz-core") -> url("https://scalazproject.ci.cloudbees.com/job/nightly_2.10/ws/target/scala-2.10/unidoc/"),
    findManagedDependency("com.typesafe.play", "play-json") -> url("http://www.playframework.com/documentation/2.2.1/api/scala/")
  )
}
YMMV of course.
The accepted answer is good, but it'll fail when assumptions about exact project dependencies don't hold. Here's a variation that might prove useful:
apiMappings ++= {
  def mappingsFor(organization: String, names: List[String], location: String, revision: (String) => String = identity): Seq[(File, URL)] =
    for {
      entry: Attributed[File] <- (fullClasspath in Compile).value
      module: ModuleID <- entry.get(moduleID.key)
      if module.organization == organization
      if names.exists(module.name.startsWith)
    } yield entry.data -> url(location.format(revision(module.revision)))

  val mappings: Seq[(File, URL)] =
    mappingsFor("org.scala-lang", List("scala-library"), "http://scala-lang.org/api/%s/") ++
      mappingsFor("com.typesafe.akka", List("akka-actor"), "http://doc.akka.io/api/akka/%s/") ++
      mappingsFor("com.typesafe.play", List("play-iteratees", "play-json"), "http://playframework.com/documentation/%s/api/scala/index.html", _.replaceAll("[\\d]$", "x"))

  mappings.toMap
}
(Including scala-library here is redundant, but useful for illustration purposes.)
If you run mappings foreach println, you'll get output like the following (note that I don't have Akka in my dependencies):
(/Users/michaelahlers/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar,http://scala-lang.org/api/2.11.7/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-iteratees_2.11/jars/play-iteratees_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-json_2.11/jars/play-json_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
This approach:
Allows for none or many matches to the module identifier.
Concisely supports multiple modules linking to the same documentation.
Defers to the module as the version authority, but lets you map over versions as needed, as in the case of Play's libraries, where x is used for the patch number.
(As written, passing Nil for names matches nothing; to map all modules for an organization you would need a small tweak, such as treating an empty list as a wildcard.)
Those improvements allow you to keep the mappings in a separate sbt file (call it scaladocMappings.sbt) that can be maintained in a single location and easily copied and pasted into any project.
Alternatively to my last suggestion, the sbt-api-mappings plugin by ThoughtWorks shows a lot of promise. Long term, that's a far more sustainable route than each project maintaining its own set of mappings.
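Trying it should only require adding the plugin in project/plugins.sbt; at the time of writing its README suggested roughly:
// check the plugin's README for the current version and coordinates
addSbtPlugin("com.thoughtworks.sbt-api-mappings" % "sbt-api-mappings" % "latest.release")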
We are moving to Scala/sbt from a Java/Gradle stack. Our Gradle builds leveraged a task called processResources and an Ant filter named ReplaceTokens to dynamically replace tokens in a checked-in .properties file without actually changing the .properties file (just changing the output). The Gradle task looks like:
processResources {
    def whoami = System.getProperty('user.name')
    def hostname = InetAddress.getLocalHost().getHostName()
    def buildTimestamp = new Date().format('yyyy-MM-dd HH:mm:ss z')
    filter ReplaceTokens, tokens: [
        "buildsig.version"    : project.version,
        "buildsig.classifier" : project.classifier,
        "buildsig.timestamp"  : buildTimestamp,
        "buildsig.user"       : whoami,
        "buildsig.system"     : hostname,
        "buildsig.tag"        : buildTag
    ]
}
This task locates all the template files in the src/main/resources directory, performs the requisite substitutions and outputs the results at build/resources/main. In other words it transforms src/main/resources/buildsig.properties from...
buildsig.version=#buildsig.version#
buildsig.classifier=#buildsig.classifier#
buildsig.timestamp=#buildsig.timestamp#
buildsig.user=#buildsig.user#
buildsig.system=#buildsig.system#
buildsig.tag=#buildsig.tag#
...to build/resources/main/buildsig.properties...
buildsig.version=1.6.5
buildsig.classifier=RELEASE
buildsig.timestamp=2013-05-06 09:46:52 PDT
buildsig.user=jenkins
buildsig.system=bobk-mbp.local
buildsig.tag=dev
Which, ultimately, finds its way into the WAR file at WEB-INF/classes/buildsig.properties. This works like a champ to record build specific information in a Properties file which gets loaded from the classpath at runtime.
What do I do in SBT to get something like this done? I'm new to Scala / SBT so please forgive me if this seems a stupid question. At the end of the day what I need is a means of pulling some information from the environment on which I build and placing that information into a properties file that is classpath loadable at runtime. Any insights you can give to help me get this done are greatly appreciated.
The sbt-buildinfo plugin is a good option. The README shows an example of how to define custom mappings and mappings that should run on each compile. In addition to the straightforward addition of normal settings like version shown there, you want a section like this:
buildInfoKeys ++= Seq[BuildInfoKey](
  "hostname" -> java.net.InetAddress.getLocalHost().getHostName(),
  "whoami" -> System.getProperty("user.name"),
  BuildInfoKey.action("buildTimestamp") {
    java.text.DateFormat.getDateTimeInstance.format(new java.util.Date())
  }
)
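At runtime the generated object is read like any other Scala object; a minimal sketch, assuming buildInfoPackage := "buildsig" is set (the package and object names below follow from that assumption):
import buildsig.BuildInfo

object ShowBuildSig extends App {
  // each key in buildInfoKeys becomes a field on the generated object
  println(BuildInfo.hostname)
  println(BuildInfo.whoami)
  println(BuildInfo.buildTimestamp)
}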
Would the following be what you're looking for?
sbt-editsource: An SBT plugin for editing files
sbt-editsource is a text substitution plugin for SBT 0.11.x and greater. In a way, it's a poor man's sed(1) for SBT. It provides the ability to apply line-by-line substitutions to a source text file, producing an edited output file. It supports two kinds of edits: variable substitution, where ${var} is replaced by a value, and sed-like regular expression substitution.
This is from Community Plugins.
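If you'd rather not take on a plugin, the same effect is achievable with a plain resource generator that writes the properties file into managed resources at build time; a minimal sketch for sbt 0.13 (the file name and keys mirror the Gradle example, the values are illustrative):
resourceGenerators in Compile += Def.task {
  val file = (resourceManaged in Compile).value / "buildsig.properties"
  val contents = Seq(
    s"buildsig.user=${System.getProperty("user.name")}",
    s"buildsig.system=${java.net.InetAddress.getLocalHost.getHostName}",
    s"buildsig.timestamp=${new java.util.Date}"
  ).mkString("\n")
  IO.write(file, contents) // lands on the runtime classpath like any other resource
  Seq(file)
}.taskValue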