I have a multi-module build that looks roughly like this:
lazy val root = (project in file(".")).
  settings(common).
  aggregate(finagle_core, finagle_thrift)

lazy val finagle_core =
  project.
    settings(common).
    settings(Seq(
      name := "finagle-core",
      libraryDependencies ++= Dependencies.finagle
    ))
lazy val finagle_thrift =
  project.
    settings(common).
    settings(Seq(
      name := "finagle-thrift",
      libraryDependencies ++= Dependencies.finagleThrift,
      scroogeThriftSourceFolder in Test <<= baseDirectory {
        base => base / "target/thrift_external/"
      },
      scroogeThriftDependencies in Test := Seq(
        "external-client"
      ),
      scroogeBuildOptions in Test := Seq(
        WithFinagle
      )
    )).dependsOn(finagle_core)
Where finagle_thrift has a dependency on a jar file, external-client, that contains thrift files. I want the build to extract the thrift files to target/thrift_external and compile them into a client.
This does work, but I have to execute sbt twice to get it to work. The first time I run sbt, it doesn't extract the files; the second time it does. I am at a loss as to why that is happening.
==
EDIT:
I see what's happening. It does unpack the dependencies on test; however, because the settings are evaluated before the unpack, the generated code doesn't get the list of files that are generated. The second time it runs, the files are already extracted, so it picks up the thrift files.
==
EDIT 2:
I solved this in a super janky way:
addCommandAlias("build", "; test:scroogeUnpackDeps; compile")
And now it gets unpacked first, then compiled.
sbt resolves the scroogeThriftSourceFolder directory when it loads the build (before running any tasks), at which point the external files are not there yet.
Performing a reload will make it discover the downloaded files:
sbt scroogeUnpackDeps reload compile
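If you prefer a single command, the reload step can also be folded into a command alias like the one from the question's second edit; a minimal sketch (the alias name is arbitrary):

addCommandAlias("build", "; test:scroogeUnpackDeps; reload; compile")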
Related
I wrote a small sbt plugin for editing some resource files in the project's target directory (it actually works similarly to Maven profiles). Now that I have written and tested my simple custom sbt task (let's call it interpolateParameters), I want it to be executed between the resource copying and jar creation when running sbt assembly. However, I can't find any documentation about which tasks are executed "under the hood" of the assembly task provided by the sbt-assembly plugin, and I actually doubt whether it is even possible.
Therefore, I have two questions: is it possible to somehow execute my task between sbt-assembly's compile + copyResources and "create jar" steps? And if not, is there a way to achieve what I want without creating my own fork of the sbt-assembly plugin?
I solved this by making assembly depend on my task interpolateParameters, and interpolateParameters depend on products. Here is part of my resulting build.sbt file with the solution:
lazy val someModuleForFatJar = (project in file("some/path"))
  .dependsOn(
    someOtherModule % "test->test;compile->compile"
  )
  .settings(
    name := "some module name",
    sharedSettings,
    libraryDependencies ++= warehouseDependencies,
    mainClass in assembly := Some("com.xxxx.yyyy.Zzzz"),
    assemblyJarName in assembly := s"some_module-${version.value}.jar",
    assembly := {
      assembly.dependsOn(interpolateParameters).value
    },
    interpolateParameters := {
      interpolateParameters.dependsOn(products).value
    },
    (test in assembly) := {}
  )
Hope it can help someone.
I have a Play Framework 2.7 application which is built with sbt.
For database access I am using jOOQ. As you know, jOOQ reads my database schema and generates the Java source classes, which are then used in my application.
The application only compiles if the database classes are present.
I am generating the classes with this custom sbt task:
// https://mvnrepository.com/artifact/org.jooq/jooq-meta
libraryDependencies += "org.jooq" % "jooq-meta" % "3.12.1"

lazy val generateJOOQ = taskKey[Seq[File]]("Generate JOOQ classes")

generateJOOQ := {
  (runner in Compile).value.run(
    "org.jooq.codegen.GenerationTool",
    (fullClasspath in Compile).value.files,
    Array("conf/db.conf.xml"),
    streams.value.log
  ).failed foreach (sys error _.getMessage)
  ((sourceManaged.value / "main/generated") ** "*.java").get
}
I googled around, found the script above, and modified it a little according to my needs, but I do not really understand it, since sbt and Scala are new to me.
The problem now is that when I start generateJOOQ, sbt tries to compile the project first, which fails because the database classes are missing. So what I have to do is comment out all code which uses the generated classes, execute the task (which compiles my project and generates the database classes), and then enable the commented-out code again. This is a pain!
I guess the problem is the (runner in Compile) call, but I did not find any way to execute my custom task WITHOUT compiling first.
Please help! Thank you!
Usually, when you want to generate sources, you should use a source generator; see https://www.scala-sbt.org/1.x/docs/Howto-Generating-Files.html
sourceGenerators in Compile += generateJOOQ.taskValue
Doing that automatically causes SBT to execute your task first before trying to compile the Scala/Java sources.
Then, you should not really use the runner task, since that is used to run your project, which depends on the compile task, which of course needs to execute first.
You should add the jooq-meta library as a dependency of the build, not of your sources. That means you should add the libraryDependencies += "org.jooq" % "jooq-meta" % "3.12.1" line to e.g. project/jooq.sbt.
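For reference, a minimal sketch of that build-level file; adding jooq-codegen as well is an assumption here, on the grounds that org.jooq.codegen.GenerationTool ships in that artifact:

// project/jooq.sbt — dependencies of the build definition itself, not the app
libraryDependencies += "org.jooq" % "jooq-meta" % "3.12.1"
// assumption: GenerationTool lives in the jooq-codegen artifact
libraryDependencies += "org.jooq" % "jooq-codegen" % "3.12.1"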
Then, you can simply call jOOQ's GenerationTool in your task as usual:
// build.sbt
generateJOOQ := {
  org.jooq.codegen.GenerationTool.main(Array("conf/db.conf.xml"))
  ((sourceManaged.value / "main/generated") ** "*.java").get
}
I'd like to create an SBT project with inheritance and shared dependencies.
With Maven's POM files, there is the idea of Project Inheritance where you can set a parent project. I'd like to do the same thing with SBT.
The xchange-stream library uses Maven's Project Inheritance to resolve subproject dependencies when compiled from the parent project.
Here is my idea of what the file structure would look like:
sbt-project/
  project/
    dependencies.scala  # Contains dependencies common to all projects
  build.sbt             # Contains definition of parent project with references
                        # to subprojects
  subproject1/
    build.sbt           # Contains `subproject3` as a dependency
  subproject2/
    build.sbt           # Contains `subproject3` as a dependency
  subproject3/
    build.sbt           # Is a dependency for `subproject1` and `subproject2`
Where subproject1 and subproject2 can include subproject3 in their dependency lists like this:
libraryDependencies += "tld.organization" % "subproject3" % "1.0.0"
Such that when subproject1 or subproject2 is compiled by invoking sbt compile from within its subdirectory, or when the parent sbt-project is compiled from the main sbt-project directory, subproject3 will be compiled and published locally with sbt, or otherwise be made available to the projects that need it.
Also, how would shared dependencies be specified in sbt-project/build.sbt, or anywhere in the sbt-project/project directory, such that they are usable within subproject1 and subproject2 when invoking sbt compile within those subdirectories?
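For illustration, here is the kind of shared definitions file I have in mind (the names and versions are hypothetical):

// project/dependencies.scala — hypothetical shared dependency definitions
import sbt._

object Dependencies {
  val scalaTest = "org.scalatest" %% "scalatest" % "3.0.8" % Test
  val common: Seq[ModuleID] = Seq(scalaTest)
}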
The following examples don't help answer either of the above points:
jbruggem/sbt-multiproject-example:
Uses recursive build.sbt files, but doesn't share dependencies among child projects.
Defining Multi-project Builds with sbt: pbassiner/sbt-multi-project-example:
Uses a single build.sbt file for the projects in their subdirectories.
sachabarber/SBT_MultiProject_Demo:
Uses a single build.sbt file.
Such that when subproject1 or subproject2 are compiled by invoking sbt compile from within their subdirectories...
Maybe Maven is meant to be used together with the shell environment and the cd command, but that's not how sbt works, at least as of sbt 1.x in 2019.
The sbt way is to use sbt as an interactive shell and start it at the top level. You can then either invoke compilation as subproject1/compile, or switch into the subproject using project subproject1 and call compile in there.
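For example, a session from the root directory would look like this:

$ sbt
sbt:sbt-project> subproject1/compile
sbt:sbt-project> project subproject1
sbt:subproject1> compile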
house plugin
A feature similar to a parent POM can be achieved by creating a custom plugin.
package com.example

import sbt._
import Keys._

object FooProjectPlugin extends AutoPlugin {
  override def requires = plugins.JvmPlugin

  val commonsIo = "commons-io" % "commons-io" % "2.6"

  override def buildSettings: Seq[Def.Setting[_]] = Seq(
    organization := "com.example"
  )

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    libraryDependencies += commonsIo
  )
}
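A child project then just enables the plugin to inherit the common settings; a sketch, assuming the plugin is on the build classpath (defined under project/ or published and pulled in via addSbtPlugin):

// subproject1/build.sbt
lazy val subproject1 = (project in file("."))
  .enablePlugins(com.example.FooProjectPlugin)
  .settings(
    name := "subproject1"
  )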
sbt-sriracha
It's not exactly what you are asking for, but I have an experimental plugin that allows you to switch between a source dependency and a binary dependency. See hot source dependencies using sbt-sriracha.
Using that, you could create three individual sbt builds for project1, project2, and project3, all located inside the $HOME/workspace directory.
ThisBuild / scalaVersion := "2.12.8"
ThisBuild / version := "0.1.1-SNAPSHOT"

lazy val project3Ref = ProjectRef(workspaceDirectory / "project3", "project3")
lazy val project3Lib = "tld.organization" %% "project3" % "0.1.0"

lazy val project1 = (project in file("."))
  .enablePlugins(FooProjectPlugin)
  .sourceDependency(project3Ref, project3Lib)
  .settings(
    name := "project1"
  )
With this setup, you can launch sbt -Dsbt.sourcemode=true and it will pick up project3 as a subproject.
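The plugin itself needs to be on the build classpath first; something like the following, though the version shown is an assumption (check the plugin's README for the current one):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-sriracha" % "0.1.0")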
You can use Mecha's super-repo concept. Take a look at the setup and docs here: https://github.com/storm-enroute/mecha
The basic idea is that you can combine dependent sbt projects (each with its own build.sbt) under a single root super-repo sbt project:
/root
  /project/plugins.sbt
  repos.conf
  /project1
    /src/..
    /project/plugins.sbt
    build.sbt
  /project2
    /src/..
    /project/plugins.sbt
    build.sbt
Please note that there is no build.sbt in the root folder!
Instead there is a repos.conf file. It contains the definitions of the sub-repos and looks like the following:
root {
  dir = "."
  origin = ""
  mirrors = []
}

project1 {
  dir = "project1"
  origin = "git@github.com:some_user/project1.git"
  mirrors = []
}

project2 {
  dir = "project2"
  origin = "git@github.com:some_user/project2.git"
  mirrors = []
}
Then you can specify the inter-project, source-level dependencies within individual projects.
There are two approaches: a dependencies.conf file, or in the build source code.
For more details, please see the docs.
I'm attempting to use the sbt-aspectj plugin with sbt-native-packager and am running into an issue where the associated -javaagent path to the AspectJ load-time weaver jar references an ivy cache location rather than something packaged.
That is, after running sbt stage, executing the staged application via bash -x target/universal/stage/bin/myapp results in this javaagent:
exec java -javaagent:/home/myuser/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.10.jar -cp /home/myuser/myproject/target/universal/stage/lib/org.aspectj.aspectjweaver-1.8.10.jar:/home/myuser/myproject/target/universal/stage/lib/otherlibs.jar myorg.MyMainApp args
My target platform is Heroku, where the artifacts are built before being effectively 'pushed' out to individual 'dynos' (very analogous to a Docker setup). The issue here is that the resulting -javaagent path was valid on the machine where the 'staged' deployable was built, but will not exist where it's ultimately run.
How can one configure the sbt-aspectj plugin to reference a packaged lib rather than one from the ivy cache?
Current configuration:
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.10.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.5")
build.sbt (selected parts):
import com.typesafe.sbt.SbtAspectj._

lazy val root = (project in file(".")).settings(
  aspectjSettings,
  javaOptions in Runtime ++= { AspectjKeys.weaverOptions in Aspectj }.value,
  // see: https://github.com/sbt/sbt-native-packager/issues/598#issuecomment-111584866
  javaOptions in Universal ++= { AspectjKeys.weaverOptions in Aspectj }.value
    .map { "-J" + _ },
  fork in run := true
)
Update
I've tried several approaches, including pulling the relevant output for javaOptions from the existing mappings, but the result is a cyclic dependency error thrown by sbt.
I have something that technically solves my problem but feels unsatisfactory. As of now, I'm including an aspectjweaver dependency directly and using the sbt-native-packager concept of bashScriptExtraDefines to append an appropriate javaagent:
updated build.sbt:
import com.typesafe.sbt.SbtAspectj._

lazy val root = (project in file(".")).settings(
  aspectjSettings,
  bashScriptExtraDefines += scriptClasspath.value
    .filter(_.contains("aspectjweaver"))
    .headOption
    .map("addJava -javaagent:${lib_dir}/" + _)
    .getOrElse(""),
  fork in run := true
)
You can add the following settings in your sbt config:
.settings(
  retrieveManaged := true,
  libraryDependencies += "org.aspectj" % "aspectjweaver" % aspectJWeaverV
)
The AspectJ weaver JAR will be copied to ./lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-[aspectJWeaverV].jar in your project root.
I actually solved this by using the sbt-javaagent plugin to add agents to the runtime.
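A minimal sketch of that approach, assuming sbt-native-packager is already in use (the plugin and weaver versions here are assumptions):

// project/plugins.sbt
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.5")

// build.sbt — declares the weaver as a java agent; the generated start
// script then points -javaagent at the packaged jar, not the ivy cache
lazy val root = (project in file("."))
  .enablePlugins(JavaAppPackaging, JavaAgent)
  .settings(
    javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10"
  )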
So I'm using packageArchetype.java_server and have set up my mappings so the files from "src/main/resources" go into the "/etc/" folder in the Debian package. I'm using "sbt debian:package-bin" to create the package.
The trouble is that when I use "sbt run" it picks up src/main/resources from the classpath. What's the right way to get sbt-native-packager to give /etc/ as a resource classpath for my configuration and logging files?
plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-M2")
build.sbt
...
packageArchetype.java_server
packageDescription := "Some Description"
packageSummary := "My App Daemon"
maintainer := "Me<me#example.org>"
mappings in Universal ++= Seq(
file("src/main/resources/application.conf") -> "conf/application.conf",
file("src/main/resources/logback.xml") -> "conf/logback.xml"
)
....
I took a slightly different approach. Since sbt-native-packager keeps those two files (application.conf and logback.xml) in my package distribution jar file, I really just wanted a way to overwrite (or merge) these files from /etc. I kept the two mappings above and just added the following:
src/main/templates/etc-default:
-Dmyapplication.config=/etc/${{app_name}}/application.conf
-Dlogback.configurationFile=/etc/${{app_name}}/logback.xml
Then within my code (using the Typesafe Config library):

import java.io.File
import com.typesafe.config.ConfigFactory

lazy val baseConfig = ConfigFactory.load() // defaults from src/main/resources

// For use with the Debian packaging script (see etc-default)
val systemConfig = Option(System.getProperty("myapplication.config")) match {
  case Some(cfile) => ConfigFactory.parseFile(new File(cfile)).withFallback(baseConfig)
  case None => baseConfig
}
And of course -Dlogback.configurationFile is a system property used by Logback.