Avoid redundancy / reuse ModuleID in build.sbt and project/plugins.sbt?

I'd like to refer to the same version of a plugin in a build.sbt and in project/plugins.sbt. Both need to refer to e.g. val sbtGit = "com.typesafe.sbt" %% "sbt-git" % "1.0.0".
Adding it to project/Dependencies.scala and importing does not work for project/plugins.sbt.
Can I avoid the redundancy of specifying the sbt-git version twice?

A solution suggested in sbt's Gitter channel is to use a symlink.
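For reference, a minimal sketch of that workaround, assuming a hypothetical project/Versions.scala that holds the shared version string (the file name is illustrative). build.sbt can see Scala sources under project/, while project/plugins.sbt belongs to the meta-build and only sees sources under project/project/, so the file is symlinked there:
// project/Versions.scala -- visible to build.sbt
// symlinked as project/project/Versions.scala (ln -s ../Versions.scala project/project/Versions.scala)
// so that it is also visible to project/plugins.sbt
object Versions {
  val sbtGit = "1.0.0"
}

// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-git" % Versions.sbtGit)

// build.sbt -- the same Versions.sbtGit constant can be referenced here as well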

Related

SBT: describe external dependency relative to another (transitive) dependency?

To avoid jar hell, I'd like to refer to a dependency relatively.
For example, when I add a dependency to "org.http4s" %% "http4s-circe" % "0.21.1":
cs resolve org.http4s:http4s-circe_2.12:0.21.1 | grep -i circe
io.circe:circe-core_2.12:0.13.0:default
I'd like to add a dependency on "circe-literal" in the version that was automatically resolved by sbt's conflict mediation, in this example "0.13.0". Is this possible?
You could add circe-literal with a wildcard version; using the latest-compatible conflict manager would then pick a version that is compatible with circe-core. Sadly, you cannot specify a conflict manager for a specific artifact without resorting to the coursier plugin.
If that is OK with you, however, you should be able to specify this:
conflictManager := ConflictManager.latestCompatible
libraryDependencies += "io.circe" %% "circe-literal" % "[0,)"
You'll have to use the ivy resolver to get that working, though.
dependencyResolution := sbt.librarymanagement.ivy.IvyDependencyResolution(ivyConfiguration.value)
Using that, I got exactly what you wanted:
[info] [SUCCESSFUL ] io.circe#circe-literal_2.12;0.13.0!circe-literal_2.12.jar (304ms)
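Putting the three pieces together, a minimal build.sbt sketch (versions as in the question, everything else illustrative) might look like this:
// switch to the Ivy-based resolver so the conflict manager setting is honoured
dependencyResolution := sbt.librarymanagement.ivy.IvyDependencyResolution(ivyConfiguration.value)
conflictManager := ConflictManager.latestCompatible

libraryDependencies ++= Seq(
  "org.http4s" %% "http4s-circe"  % "0.21.1",
  // wildcard range: let the conflict manager pick whatever circe version http4s-circe pulls in
  "io.circe"   %% "circe-literal" % "[0,)"
)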

What does "provided->default" mean in an sbt build file?

An example of this comes from a sample github project:
libraryDependencies ++= Seq(
  "javax.servlet" % "servlet-api" % "2.5" % "provided->default",
  ...
)
I'm only vaguely clear on what the 'fourth column' in these configurations mean, but this is the first time I've seen either provided or provided->default, and it's unclear how I can go about finding what should be expected here in the documentation. Can anyone help explain this construct?
It means that your provided configuration depends on the default configuration of "javax.servlet" % "servlet-api" % "2.5".
The Maven scopes documentation describes what these configurations (or scopes) mean.
For instance, if you're using a library to write your tests, you've probably come across something like "org.scalacheck" %% "scalacheck" % "1.13.2" % "test" or similar. Here, the second part of the configuration is omitted and refers to the default configuration (usually compile). Equivalently, you could write "org.scalacheck" %% "scalacheck" % "1.13.2" % "test->compile". It means that your test configuration depends on the default configuration of ScalaCheck: your tests need ScalaCheck on the class path to compile and run.
You may find more details in the Ivy documentation.
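Applied to the servlet-api example from the question, the shorthand and the explicit mapping should therefore be essentially equivalent (a sketch, assuming servlet-api's default configuration is the usual compile/default one):
// your "provided" configuration depends on servlet-api's default configuration
libraryDependencies += "javax.servlet" % "servlet-api" % "2.5" % "provided"
// the same mapping written out explicitly
libraryDependencies += "javax.servlet" % "servlet-api" % "2.5" % "provided->default"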

Define configuration with compiler plugin

I'd like to create a compile configuration which is the same as the default one but adds a compiler plugin. In my particular case, I want to have a "dev" configuration but with the linter plugin (https://github.com/HairyFotr/linter) because it slows down compile times and there's no need to run it in production or continuous integration.
Now this is what I tried:
lazy val Dev = config("dev") extend Compile
lazy val root = (project in file(".")).configs(Dev).settings(
inConfig(Dev)(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12")): _*)
and it should work, since when I inspect dev:libraryDependencies, it's what I expect it to be: it has org.psywerx.hairyfotr:linter:0.1.12:plugin->default(compile). Normally if I add the library with a "plugin" scope, it does work for the default settings:
libraryDependencies += ("org.psywerx.hairyfotr" %% "linter" % "0.1.12" % "plugin")
It just does not work if I add this under a different configuration, so there must be something else going on here.
This solves the problem, but not exactly in the way that was asked. Here's the full build.sbt:
libraryDependencies ++= Seq(
  "org.psywerx.hairyfotr" %% "linter" % "0.1.14" % "test")

val linter = Command.command("linter")(state => {
  // locate the linter jar on the test classpath so it can be passed to scalac via -Xplugin
  val linterJar = for {
    (newState, result) <- Project.runTask(fullClasspath in Test, state)
    cp <- result.toEither.right.toOption
    linter <- cp.find(
      _.get(moduleID.key).exists(mId =>
        mId.organization == "org.psywerx.hairyfotr" &&
          mId.name == "linter_2.11"))
  } yield linter.data.absolutePath
  // scalacOptions is a task, so its value has to be obtained by running that task
  val res = Project.runTask(scalacOptions, state)
  res match {
    case Some((newState, result)) =>
      result.toEither.right.foreach { defaultScalacOptions =>
        // compile the Test sources with the default options plus -Xplugin pointing at the linter jar
        Project.runTask(compile in Test,
          Project.extract(state).append(
            Seq(scalacOptions := defaultScalacOptions ++
              linterJar.map(p => Seq(s"-Xplugin:$p")).getOrElse(Seq.empty)),
            newState))
      }
    case None => sys.error("Couldn't get defaultScalacOptions")
  }
  // the original state is returned unchanged, so the extra options do not stick
  state
})

lazy val root = (project in file(".")).configs(Test).settings(commands ++= Seq(linter))
The fact that the command returns the unmodified state means it doesn't change the project settings. So if you run sbt linter, you should get your project compiled with the additional scalacOptions, but if you run compile in the same sbt session, it will not use those additional settings.
The tricky thing here is that scalacOptions is actually a TaskKey, not a SettingKey. I don't know why that is, but to get its value you have to run that task. One reason might be that in sbt a setting cannot depend on a task, but a task can depend on another task. In other words, scalacOptions can depend on another task's value, and maybe internally it does; I haven't checked. If the current answer works for you, I can try to think of a more elegant way of achieving the same result.
EDIT: I modified the code to specify the scalacOptions for the linter plugin proper. Please note the plugin has to be a managed dependency, not just a downloaded jar, for this solution to work. If you want to keep it unmanaged there is a way, but I won't go into it for now. Additionally, I've taken the liberty of making it also work for the test code, for illustration purposes.
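To illustrate the TaskKey point above: a small, hypothetical task that reads the resolved scalacOptions. A task can depend on another task via .value, whereas a plain setting cannot depend on a task:
// hypothetical key, for illustration only
val printScalacOptions = taskKey[Unit]("Print the scalac options resolved for the Compile configuration")

printScalacOptions := {
  val opts = (scalacOptions in Compile).value  // task-to-task dependency
  println(opts.mkString(" "))
}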
Looking at Defaults.scala in the source, it seems like the compile command is always taking the options from the compile scope. So if I'm correct, you can have only one set of compilation options!
This seems to be confirmed by the fact that scalacOptions behaves the same way, and this is also why I don't see a non-hacky answer for these similar questions:
Different scalac options for different scopes or tasks?
Different compile options for tests and release in SBT?
I'd be happy to be proven wrong.
EDIT: FWIW, one might not be able to define another scalac options profile in the same project, but you could do so in a "different" project:
lazy val dev = (project in file(".")).
  settings(target := baseDirectory.value / "target" / "dev").
  settings(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12"))
This has the disadvantage that it has a separate output directory, so it will take more space and, more importantly, will not get incremental compiles between the two projects. However, after spending some time thinking about it, this may be by design. After all, even though linters don't, some scalac compilation options could conceivably change the output. This would make it meaningless to try to keep the metadata for incremental compilation from one set of scalac options to another. Thus different scalac options would indeed require different target directories.
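With that layout you pick the project explicitly on the sbt command line (dev being the project id from the snippet above):
sbt dev/compile
sbt compile
The first command compiles with the linter plugin on the compiler's classpath; the second remains the plain, plugin-free compile of the root project.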

How to force a specific version of dependency?

A dependency bar depends on foo 1.2.3, but that version of foo has a bug and I need to use version 1.2.2.
I can do that with force().
libraryDependencies += "foo" %% "foo" % "1.2.2" force()
That method is not recommended by the docs:
Forcing a revision (Not recommended)
Note: Forcing can create logical inconsistencies so it’s no longer recommended.
Does this mean SBT has a different, better way than force() to use a specific version of a dependency? If so, what?
Or am I to infer from the documentation that this entire problem is one that I'm recommended not to have?
You can use dependencyOverrides:
dependencyOverrides += "foo" %% "foo" % "1.2.2"
You're not avoiding "logical inconsistencies" anyway. If you force a version, you have to manually take care of compatibility with other libraries; there's no way out of that.
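Applied to the scenario in the question, a minimal sketch would be (the bar coordinates are placeholders):
// bar pulls in foo 1.2.3 transitively
libraryDependencies += "com.example" %% "bar" % "1.0.0"
// pin the transitive foo to 1.2.2 without declaring it as a direct dependency
dependencyOverrides += "foo" %% "foo" % "1.2.2"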
From the documentation:
Overriding a version
For binary compatible conflicts, sbt provides dependency overrides. They are configured with the dependencyOverrides setting, which is a set of ModuleIDs. For example, the following dependency definitions conflict because spark uses log4j 1.2.16 and scalaxb uses log4j 1.2.17:
libraryDependencies ++= Seq(
  "org.spark-project" %% "spark-core" % "0.5.1",
  "org.scalaxb" %% "scalaxb" % "1.0.0"
)
The default conflict manager chooses the latest revision of log4j, 1.2.17:
show update
[info] compile:
[info]   log4j:log4j:1.2.17: ...
[info]   (EVICTED) log4j:log4j:1.2.16 ...
To change the version selected, add an override:
dependencyOverrides += "log4j" % "log4j" % "1.2.16"

Control which library dependency exported in SBT

I am looking for a way to control which library dependencies are exported and which are not. Something along these lines:
"org.slf4j" % "slf4j-api" % "1.7.6" doNotExport
or perhaps at the point where the project is imported, like this:
lazy val main = Project(appName, file("."), settings = buildSettings)
.dependsOn(ProjectRef(uri("../Utils"), "Utils").exceptLibraryDependency(organization="org.slf4j"))
Is there anything like this in SBT?
Well, it all depends on configurations: the default configurations expose dependencies to downstream projects. Similar behavior can be achieved like this:
val compileOnly = config("compileOnly").hide
ivyConfigurations += compileOnly
unmanagedClasspath in Compile ++=
  update.value.select(configurationFilter(compileOnly.name))
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.6" % compileOnly
Note that this technique was described in an answer to Add a compile time only dependency in sbt.
This question should be closed as a duplicate, but the bounty prevents this.
