How do I specify scope in build.sbt - sbt

I have a variable libDir defined in Global, and when I try to use it inside one of the sub-projects (while specifying the scope), it fails with:
[info] Loading project definition from /Users/chichu/ws/tip/TestMain/project
References to undefined settings:
*/*:libDir from Streaming/*:install (/Users/chinchu/ws/tip/TestMain/build.sbt:124)
Did you mean TestRoot/*:libDir ?
Here's the snippet from build.sbt:
===
def Streaming(name: String, dir: String,
              archiveName: String, main: String): Project = {
  ...
  ...
  install := {
    val jarName = assembly.value
    sbt.IO.copyFile(jarName, (libDir in Global).value) // copy over the jar
  }
}
..
..
lazy val libDir = settingKey[File]("...")
libDir := baseDirectory.value / "install/lib"
===
Why is it not able to resolve "libDir" even when I specify "in Global"? I also tried "libDir in TestRoot" and it reports: error: not found: value TestRoot.
Thanks
-C

If you define a setting like this:
libDir := baseDirectory.value / "install/lib"
the task and configuration axes of its scope are Global, but the project axis is set to the current project (the root project, TestRoot, in your case).
When you then refer to it as libDir in Global, you ask for the key in the fully global scope (the */*:libDir in the error message), where it was never defined, and sbt does not delegate from there back to your root project.
You should define your libDir like this:
libDir in Global := baseDirectory.value / "install/lib"
which sets the project axis to Global as well.
Example build.sbt
lazy val installDir = settingKey[File]("ff")
lazy val install = taskKey[Unit]("prints install")

installDir in Global := baseDirectory.value / "install/lib"

val projectA = project.in(file("projectA")).settings(
  install := {
    val install = (installDir in Global).value
    println(install)
  }
)

val projectB = project
Alternative Solution
Alternatively, you could give the root project an explicit id; that is, in your main build.sbt add the line
val root = Project("root", file("."))
and then, in your install task, refer to the Global configuration in that project (root is the name of the project):
(libDir in Global in root).value
Example build.sbt
lazy val installDir = settingKey[File]("ff")
lazy val install = taskKey[Unit]("prints install")

installDir := baseDirectory.value / "install/lib"

lazy val root = Project("root", file("."))

lazy val projectA = project.in(file("projectA")).settings(
  install := {
    val install = (installDir in Global in root).value
    println(install)
  }
)

lazy val projectB = project
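On sbt 1.x, the same two variants can also be written with slash syntax. Here is a minimal sketch assuming the same keys as above (the description strings are placeholders):

lazy val installDir = settingKey[File]("install directory")
lazy val install    = taskKey[Unit]("prints install")

// variant 1: define the setting in the fully global scope
Global / installDir := baseDirectory.value / "install/lib"

lazy val projectA = project.in(file("projectA")).settings(
  install := {
    // sbt 1.x equivalent of (installDir in Global).value;
    // (root / installDir).value would be the equivalent of the second variant
    println((Global / installDir).value)
  }
)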

Related

When using a Scala compiler plugin in sbt, how do you set a library dependency for the plugin?

I'm using a compiler plugin I wrote that depends on the Kryo serialization library. When attempting to use my plugin, I set this up in the top-level build.sbt like this:
lazy val dependencies =
  new {
    val munit = "org.scalameta" %% "munit" % "0.7.12" % Test
    val kyro  = "com.esotericsoftware" % "kryo" % "5.0.0-RC9"
  }

lazy val commonDependencies = Seq(
  dependencies.kyro,
  dependencies.munit
)

lazy val root = (project in file("."))
  .settings(
    libraryDependencies ++= commonDependencies,
    Test / parallelExecution := false
  )

addCompilerPlugin("co.blocke" %% "dotty-reflection" % reflectionLibVersion)
But when I compile my target project, I get a java.lang.NoClassDefFoundError saying it can't find Kryo. I've added Kryo to my dependencies, but since it is needed by the compiler, not by my app, it isn't being picked up.
How can I properly tell sbt about a dependency my plugin needs?
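One approach worth sketching (an assumption, not a confirmed fix from this thread) is to bundle the plugin's runtime dependencies into the plugin artifact itself, for example with sbt-assembly in the plugin's own build, so the jar that addCompilerPlugin puts on the compiler's plugin classpath already contains Kryo:

// build.sbt of the compiler plugin itself (hypothetical sketch; assumes
// sbt-assembly is added in project/plugins.sbt)
lazy val plugin = (project in file("."))
  .settings(
    name         := "dotty-reflection",
    organization := "co.blocke",
    libraryDependencies += "com.esotericsoftware" % "kryo" % "5.0.0-RC9",
    // ship the fat jar in place of the regular artifact so Kryo travels
    // inside the plugin jar handed to scalac
    Compile / packageBin := assembly.value
  )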

How to use sbt packageBin with multiple modules in order

I have a sbt project with multiple sub modules which look like this:
--\ root
-- module 1
-- module 2
Using packageBin, I can get two zip files: module1.zip and module2.zip.
This is my build.sbt:
import Dependencies._
import NativePackagerHelper._

lazy val commonSettings = Seq(
  organization := "com.zhyea.sbt",
  version := "0.1-SNAPSHOT",
  scalaVersion := "2.11.12",
  exportJars := true,
  artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
    artifact.name + "." + artifact.extension
  }
)

lazy val module2 = project.settings(commonSettings).settings()
  .enablePlugins(JavaAppPackaging, UniversalPlugin)
  .settings(libraryDependencies ++= module2Dependencies)

lazy val module1 = project.settings(commonSettings)
  .enablePlugins(JavaAppPackaging, UniversalPlugin)
  .settings(libraryDependencies ++= module1Dependencies)

lazy val root = project.in(file("."))
  .settings(commonSettings)
  .aggregate(module2, module1)
  .enablePlugins(JavaAppPackaging, UniversalPlugin)
  .dependsOn(module2, module1).configs()

mappings in Universal ++= directory("module2/target/universal")
mappings in Universal ++= directory("module1/target/universal")
Now I want to execute the packageBin task at root and add the sub-module zips into root.zip.
The problem is that when the root module executes its packageBin task, the sub-modules' packageBin tasks haven't finished yet, so root cannot find module1.zip and module2.zip.
How can I tell sbt to execute the packageBin tasks in order?
I just pack all the sub-modules' files into one zip by adding a new module named pack.
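Another option, sketched here as an assumption rather than taken from the answer above, is to wire the ordering explicitly: make root's Universal packaging task depend on the sub-modules' packaging tasks, so that module1.zip and module2.zip exist before root collects them.

// in root's build.sbt, alongside the existing "mappings in Universal" lines;
// module1 and module2 are the lazy vals defined above
packageBin in Universal := (packageBin in Universal)
  .dependsOn(
    packageBin in (module1, Universal),
    packageBin in (module2, Universal)
  )
  .value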

how to define multi-project build with sbt-native-packager that creates an RPM for each sub-project

In all of the examples I've seen regarding multi-module builds and sbt-native-packager, they aggregate the sub-projects into a single package. I have sub-projects that each provide a micro-service. I believe each of these should have its own native package, but I don't see how to do that while keeping a one-command build for all of the sub-projects.
This turns out to be straightforward: simply provide native-packager settings for each of the sub-projects you want to package, and don't provide any on the aggregating project.
I tested by modifying https://github.com/muuki88/sbt-native-packager-examples/tree/master/multi-module-build accordingly:
import NativePackagerKeys._

name := "mukis-fullstack"
// used like the groupId in maven
organization in ThisBuild := "de.mukis"
// all sub projects have the same version
version in ThisBuild := "1.0"
scalaVersion in ThisBuild := "2.11.2"

// common dependencies
libraryDependencies in ThisBuild ++= Seq(
  "com.typesafe" % "config" % "1.2.0"
)

// this is the root project, aggregating all sub projects
lazy val root = Project(
  id = "root",
  base = file("."),
  // configure your native packaging settings here
  // settings = packageArchetype.java_server ++ Seq(
  //   maintainer := "John Smith <john.smith@example.com>",
  //   packageDescription := "Fullstack Application",
  //   packageSummary := "Fullstack Application",
  //   // entrypoint
  //   mainClass in Compile := Some("de.mukis.frontend.ProductionServer")
  // ),
  // always run all commands on each sub project
  aggregate = Seq(frontend, backend, api)
) dependsOn(frontend, backend, api) // this does the actual aggregation

// --------- Project Frontend ------------------
lazy val frontend = Project(
  id = "frontend",
  base = file("frontend"),
  settings = packageArchetype.java_server ++ Seq(
    maintainer := "John Smith <john.smith@example.com>",
    packageDescription := "Frontend application",
    mainClass in Compile := Some("de.mukis.frontend.ProductionServer")
  )
) dependsOn(api)

// --------- Project Backend ----------------
lazy val backend = Project(
  id = "backend",
  base = file("backend"),
  settings = packageArchetype.java_server ++ Seq(
    maintainer := "John Smith <john.smith@example.com>",
    packageDescription := "Fullstack Application",
    packageSummary := "Fullstack Application",
    // entrypoint
    mainClass in Compile := Some("de.mukis.frontend.ProductionServer")
  )
) dependsOn(api)

// --------- Project API ------------------
lazy val api = Project(
  id = "api",
  base = file("api")
)
Results:
debian:packageBin
...misc messages elided...
[info] dpkg-deb: building package `frontend' in `../frontend_1.0_all.deb'.
[info] dpkg-deb: building package `backend' in `../backend_1.0_all.deb'.
For anyone who just ended up here, a more up-to-date answer could look like this:
lazy val root = (project in file("."))
  .aggregate(common, frontend, backend)

lazy val common = (project in file("common"))

lazy val frontend = (project in file("frontend"))
  .enablePlugins(JavaServerAppPackaging)

lazy val backend = (project in file("backend"))
  .dependsOn(common)
  .enablePlugins(JavaAppPackaging)
  .settings(javaPackagingSettings)

lazy val javaPackagingSettings = Seq(
  // follow sbt-native-packager to identify the settings you need
)
Description
Here is the scenario supporting the above configuration:
Project root is the parent and we don't want to package it; it aggregates the other subprojects.
Project common is a shared library and we don't want to package it either.
Project backend depends on common for the libraries.
Project frontend is a standalone project packaged as a Java server app with the default configuration.
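Since the question asks for an RPM per service, the empty javaPackagingSettings placeholder above would typically hold the RPM-related keys from sbt-native-packager. A hedged sketch with made-up values:

// hypothetical values; RPM packaging generally needs vendor/license metadata
lazy val javaPackagingSettings = Seq(
  maintainer := "you@example.com",
  packageSummary := "Backend micro-service",
  packageDescription := "Backend micro-service packaged as an RPM",
  rpmVendor := "acme",
  rpmLicense := Some("Apache-2.0")
)

With something like that in place, running rpm:packageBin from the root should produce one RPM per aggregated sub-project that enables a packaging archetype.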

How to setup sbt-native-packager in a single module project with multiple mains

I have a single module client-server project with a main for each.
I'm trying to use sbt-native-packager to generate start-script for both.
project/P.scala
object Tactic extends Build {
  lazy val root =
    (project in file("."))
      .configs(Client, Server)
      .settings(inConfig(Client)(Defaults.configTasks): _*)
      .settings(inConfig(Server)(Defaults.configTasks): _*)

  lazy val Client = config("client") extend Compile
  lazy val Server = config("server") extend Compile
}
build.sbt
mainClass in Client := Some("myProject.Client")
mainClass in Server := Some("myProject.Server")
enablePlugins(JavaAppPackaging)
When I run client:stage the directory target/universal/stage/lib is created with all the necessary jars but the bin directory is missing. What am I doing wrong?
Subsidiary question: what is the key to set the starting script name?
I would recommend setting up your project as a multi-module build, instead of creating and using new configurations. I tried your multiple configuration route and it gets hairy very quickly.
For example (I created a shared project for anything shared between client & server):
def commonSettings(module: String) = Seq[Setting[_]](
  organization := "org.tactic",
  name := s"tactic-$module",
  version := "1.0-SNAPSHOT",
  scalaVersion := "2.11.6"
)

lazy val root = (project in file(".")
  settings (commonSettings("root"))
  dependsOn (shared, client, server)
  aggregate (shared, client, server)
)

val shared = (project
  settings (commonSettings("shared"))
)

val client = (project
  settings (commonSettings("client"))
  enablePlugins JavaAppPackaging
  dependsOn shared
)

val server = (project
  settings (commonSettings("server"))
  enablePlugins JavaAppPackaging
  dependsOn shared
)
Note I'm enabling sbt-native-packager's JavaAppPackaging in the client and server.
Then run stage.
Also, the key for the starting script name is executableScriptName.
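For example, to give the client its own start script name, one extra setting in that project's settings block should do (a sketch; the name itself is made up):

// added to the client project's settings
executableScriptName := "tactic-client"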

sbt: run task on subproject

I have the following project structure:
lazy val root = project.aggregate(rest,backend)
lazy val rest = project
lazy val backend = project
When I execute the "run" task from the parent, I want a specific class from the "backend" project to have its main method executed. How would I accomplish this?
lazy val root = project.aggregate(rest,backend).dependsOn(rest,backend) //<- don't forget dependsOn
lazy val rest = project
lazy val backend = project.settings(mainClass in (Compile, run) := Some("fully.qualified.path.to.MainClass"))
run in Compile <<= (run in Compile in backend)
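On sbt 1.x, where <<= has been removed, the same delegation can be written with slash syntax (a sketch of the equivalent; run is an input task, hence .evaluated):

// in root's settings: delegate root's run to the backend project's main class
Compile / run := (backend / Compile / run).evaluated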
