Execute sbt task before packaging of fat-jar - sbt

I wrote a small sbt plugin that edits some resource files in the project's target directory (it effectively works similarly to Maven profiles). Now that I have written and tested my simple custom sbt task (let's call it interpolateParameters), I want it to be executed between resource copying and jar creation when running sbt assembly. However, I can't find any documentation about which tasks are executed "under the hood" of the assembly task provided by the sbt-assembly plugin, and I doubt whether this is even possible.
Therefore, I have two questions: is it possible to somehow execute my task between sbt-assembly's compile + copyResources and "create jar" steps? And if not, is there a way to achieve what I want without maintaining my own fork of the sbt-assembly plugin?

I solved this by making assembly depend on my task interpolateParameters, and interpolateParameters depend on products. Here is the relevant part of my resulting build.sbt file:
lazy val someModuleForFatJar = (project in file("some/path"))
  .dependsOn(
    someOtherModule % "test->test;compile->compile"
  )
  .settings(
    name := "some module name",
    sharedSettings,
    libraryDependencies ++= warehouseDependencies,
    mainClass in assembly := Some("com.xxxx.yyyy.Zzzz"),
    assemblyJarName in assembly := s"some_module-${version.value}.jar",
    assembly := {
      assembly.dependsOn(interpolateParameters).value
    },
    interpolateParameters := {
      interpolateParameters.dependsOn(products).value
    },
    (test in assembly) := {}
  )
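For reference, the same wiring in sbt 1.x slash syntax might look like the sketch below (untested; interpolateParameters is assumed to be the custom task from the plugin described above, and the other settings are elided):
// sbt 1.x sketch: same task graph as above, slash syntax
assembly := assembly.dependsOn(interpolateParameters).value,
interpolateParameters := interpolateParameters.dependsOn(Compile / products).value,
assembly / test := {}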
Hope it can help someone.

Related

How to publish an artifact with pom-packaging in SBT?

I have a multi-project build in SBT where some projects should aggregate dependencies and contain no code, so that clients can depend on these projects as a single dependency instead of depending directly on all of their aggregated dependencies. With Maven, this is a common pattern, e.g. when using Spring Boot.
In SBT, I figured I can suppress the generation of the empty artifacts by adding this setting to these projects:
packagedArtifacts := Classpaths.packaged(Seq(makePom)).value
However, the makePom task writes <packaging>jar</packaging> in the generated POM. But now that there is no JAR anymore, this should read <packaging>pom</packaging> instead.
How can I do this?
This question is a bit old, but I just came across the same issue and found a solution. The original answer does point to the right page where this info can be found, but here is an example. It uses the pomPostProcess setting to transform the generated POM right before it is written to disk. Essentially, we loop over all the XML nodes, looking for the element we care about and then rewrite it.
import scala.xml.{Node => XmlNode, NodeSeq => XmlNodeSeq, _}
import scala.xml.transform._

pomPostProcess := { node: XmlNode =>
  val rule = new RewriteRule {
    override def transform(n: XmlNode): XmlNodeSeq = n match {
      case e: Elem if e.label == "packaging" =>
        <packaging>pom</packaging>
      case _ => n
    }
  }
  new RuleTransformer(rule).transform(node).head
}
Maybe you could modify the resulting POM as described here: Modifying the generated POM
You can disable publishing the default artifacts (JAR, sources, and docs), then opt in explicitly to publishing the POM. sbt then produces and publishes only the POM, with <packaging>pom</packaging>.
// This project has no sources; I want <packaging>pom</packaging> with dependencies
lazy val bundle = project
  .dependsOn(moduleA, moduleB)
  .settings(
    publishArtifact := false,          // Disable jar, sources, docs
    publishArtifact in makePom := true
  )

lazy val moduleA = project
lazy val moduleB = project
lazy val moduleC = project
Run sbt bundle/publishM2 to verify the POM in ~/.m2/repository.
I dare say this is almost intuitive, a rare moment of pleasant surprise with sbt 😅
I confirmed this with sbt 1.3.9 (current at the time of writing) and with 1.0.1, the oldest launcher I happen to have installed on my machine.
The Artifacts page in the reference docs may be helpful, perhaps this trick should be added there.

sbt aspectj with native packager

I'm attempting to use the sbt-aspectj plugin with the sbt native packager and am running into an issue where the associated -javaagent path to the aspectj load time weaver jar references an ivy cache location rather than something packaged.
That is, after running sbt stage, executing the staged application via bash -x target/universal/stage/bin/myapp results in this javaagent:
exec java -javaagent:/home/myuser/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.10.jar -cp /home/myuser/myproject/target/universal/stage/lib/org.aspectj.aspectjweaver-1.8.10.jar:/home/myuser/myproject/target/universal/stage/lib/otherlibs.jar myorg.MyMainApp args
My target platform is Heroku where the artifacts are built before being effectively 'pushed' out to individual 'dynos' (very analogous to a docker setup). The issue here is that the resulting -javaagent path was valid on the machine in which the 'staged' deployable was built, but will not exist where it's ultimately run.
How can one configure the sbt-aspectj plugin to reference a packaged lib rather than one from the ivy cache?
Current configuration:
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.10.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.5")
build.sbt (selected parts):
import com.typesafe.sbt.SbtAspectj._

lazy val root = (project in file(".")).settings(
  aspectjSettings,
  javaOptions in Runtime ++= (AspectjKeys.weaverOptions in Aspectj).value,
  // see: https://github.com/sbt/sbt-native-packager/issues/598#issuecomment-111584866
  javaOptions in Universal ++= (AspectjKeys.weaverOptions in Aspectj).value
    .map { "-J" + _ },
  fork in run := true
)
Update
I've tried several approaches including pulling the relevant output for javaOptions from existing mappings, but the result is a cyclical dependency error thrown by sbt.
I have something that technically solves my problem but feels unsatisfactory. As of now, I'm including an aspectjweaver dependency directly and using sbt-native-packager's bashScriptExtraDefines mechanism to append an appropriate -javaagent line:
updated build.sbt:
import com.typesafe.sbt.SbtAspectj._

lazy val root = (project in file(".")).settings(
  aspectjSettings,
  bashScriptExtraDefines += scriptClasspath.value
    .filter(_.contains("aspectjweaver"))
    .headOption
    .map("addJava -javaagent:${lib_dir}/" + _)
    .getOrElse(""),
  fork in run := true
)
You can add the following settings in your sbt config:
.settings(
  retrieveManaged := true,
  libraryDependencies += "org.aspectj" % "aspectjweaver" % aspectJWeaverV
)
The AspectJ weaver JAR will then be copied to ./lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-[aspectJWeaverV].jar in your project root.
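If you then want the start script to reference that retrieved copy instead of the ivy cache path, a sketch along these lines should work (this is not from the original answer; it assumes the lib_managed layout above, the same aspectJWeaverV variable, and the "-J" prefix convention shown earlier for Universal javaOptions):
// Sketch: point the -javaagent at the jar copied by retrieveManaged
javaOptions in Universal += {
  val weaverJar = s"lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-$aspectJWeaverV.jar"
  "-J-javaagent:" + weaverJar
}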
I actually solved this by using the sbt-javaagent plugin to add agents to the runtime.
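For anyone wanting the concrete wiring, a minimal sketch might look like this (the plugin version and agent coordinates are assumptions; check the sbt-javaagent README for current ones):
// project/plugins.sbt -- assumed coordinates for the sbt-javaagent plugin
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.5")

// build.sbt -- the plugin resolves the agent jar, ships it in the package,
// and points the start script's -javaagent at the packaged location
lazy val root = (project in file("."))
  .enablePlugins(JavaAppPackaging, JavaAgent)
  .settings(
    javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10"
  )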

Get scrooge to generate source files in test phase?

I have a multi-module build that looks kind of like:
lazy val root = (project in file(".")).
settings(common).
aggregate(finagle_core, finagle_thrift)
lazy val finagle_core =
project.
settings(common).
settings(Seq(
name := "finagle-core",
libraryDependencies ++= Dependencies.finagle
))
lazy val finagle_thrift =
project.
settings(common).
settings(Seq(
name := "finagle-thrift",
libraryDependencies ++= Dependencies.finagleThrift,
scroogeThriftSourceFolder in Test <<= baseDirectory {
base => {
base / "target/thrift_external/"
}
},
scroogeThriftDependencies in Test := Seq(
"external-client"
),
scroogeBuildOptions in Test := Seq(
WithFinagle
)
)).dependsOn(finagle_core)
Here finagle_thrift depends on a jar file, external-client, that contains thrift files. I want it to extract the thrift files to target/thrift_external and compile them into a client.
This does work; however, I have to execute sbt twice to get it to work. The first time I run sbt, it doesn't extract the files; the second time it does. I am at a loss as to why that is happening.
==
EDIT:
I see what's happening. It does unpack the dependencies on test; however, because the settings are evaluated before the unpack, the generated code doesn't get the list of files that are generated. The second time it runs, everything has already been extracted, so it picks up the thrift files.
==
EDIT 2:
I solved this in a super janky way:
addCommandAlias("build", "; test:scroogeUnpackDeps; compile")
And now it gets unpacked first, then compiled
SBT resolves the scroogeThriftSourceFolder directory when it loads (before running the tasks), at which point the external files are not there yet.
Performing a reload will make it discover the downloaded files:
sbt scroogeUnpackDeps reload compile
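Folding that reload into the alias from the question's second edit gives a one-command variant (a sketch, untested):
// Unpack, reload so the settings re-evaluate against the extracted files, compile
addCommandAlias("build", "; test:scroogeUnpackDeps; reload; compile")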

How to successfully import a CrossProject sbt build into eclipse using sbt-eclipse

I've used sbt-eclipse in the past to successfully import a simple sbt project into eclipse. I'm now trying to leverage the CrossProject mechanism of sbt to use the Scala.js environment (this creates 2 subprojects in sbt: one for JavaScript and one for JVM code). The recommendation (see the SBT docs link here) is to add the setting 'EclipseKeys.useProjectId := true' in the build.sbt file to support importing the (now) 2 projects into one eclipse project. I then give the 'eclipse' command in a running SBT session to create my eclipse project, launch eclipse, and attempt to import this new project. When I do this, the import dialog wizard in eclipse does show me two sub-projects, but when I try to finish the import, eclipse complains that the project already exists, and I get two strange-looking links in my eclipse project that seem to do nothing.
What is the correct procedure for getting a CrossProject sbt build into eclipse?
Ok, so it seems eclipse did not like that I had only one 'name' for the project in the shared settings area of the build.sbt. I had this:
lazy val sp = crossProject.in(file(".")).
  settings(
    version := "0.1",
    name := "SJSTut",
    scalaVersion := "2.11.7"
  ).
  jvmSettings(
    // Add JVM-specific settings here
    libraryDependencies ++= Seq(...)
  ).
  jsSettings(
    // Add JS-specific settings here
    libraryDependencies ++= Seq(...)
  )
and what I should have done was this:
lazy val sp = crossProject.in(file(".")).
  settings(
    version := "0.1",
    scalaVersion := "2.11.7"
  ).
  jvmSettings(
    // Add JVM-specific settings here
    name := "SJSTutJVM",
    libraryDependencies ++= Seq(...)
  ).
  jsSettings(
    // Add JS-specific settings here
    name := "SJSTutJS",
    libraryDependencies ++= Seq(...)
  )
Note the removal of the 'name' assignment from the shared settings; instead, a unique name is set in each of the jvmSettings and jsSettings areas.
Now I'm able to pull this into eclipse (as 2 separate projects). If anyone else has a better setup, I'd love to hear about it.
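For completeness, the sbteclipse plumbing the question refers to might look like this (the plugin coordinates and version are assumptions from that era; check the sbteclipse documentation for current ones):
// project/plugins.sbt -- assumed sbteclipse coordinates
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")

// build.sbt -- the setting recommended by the SBT docs for cross builds
EclipseKeys.useProjectId := true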

Adding /etc/<application> to the classpath in sbt-native-packager for debian:package-bin

So I'm using the packageArchetype.java_server and set up my mappings so the files from "src/main/resources" go into my "/etc/" folder in the debian package. I'm using "sbt debian:package-bin" to create the package.
The trouble is that when I use "sbt run", it picks up src/main/resources from the classpath. What's the right way to get sbt-native-packager to expose /etc/<application> as a resource classpath for my configuration and logging files?
plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-M2")
build.sbt
...
packageArchetype.java_server

packageDescription := "Some Description"

packageSummary := "My App Daemon"

maintainer := "Me <me@example.org>"

mappings in Universal ++= Seq(
  file("src/main/resources/application.conf") -> "conf/application.conf",
  file("src/main/resources/logback.xml") -> "conf/logback.xml"
)
...
I took a slightly different approach. Since sbt-native-packager keeps those two files (application.conf and logback.xml) in my package distribution jar file, I really just wanted a way to overwrite (or merge) these files from /etc. I kept the two mappings above and just added the following:
src/main/templates/etc-default:
-Dmyapplication.config=/etc/${{app_name}}/application.conf
-Dlogback.configurationFile=/etc/${{app_name}}/logback.xml
Then within my code (using Typesafe Config Libraries):
lazy val baseConfig = ConfigFactory.load // defaults from src/main/resources

// For use in the Debian packaging script (see etc-default)
val systemConfig = Option(System.getProperty("myapplication.config")) match {
  case Some(cfile) => ConfigFactory.parseFile(new File(cfile)).withFallback(baseConfig)
  case None => baseConfig
}
And of course -Dlogback.configurationFile is a system property used by Logback.
