To avoid jar hell, I'd like to refer to a dependency relatively.
For example, when I add a dependency on "org.http4s" %% "http4s-circe" % "0.21.1":
cs resolve org.http4s:http4s-circe_2.12:0.21.1 | grep -i circe
io.circe:circe-core_2.12:0.13.0:default
I'd like to add a dependency on "circe-literal" at the version that was automatically resolved by sbt's dependency mediation, in this example "0.13.0". Is this possible?
On one hand, you could add circe-literal with a wildcard version; using the latest-compatible conflict manager, sbt would then pick a version of it that is compatible with circe-core. Sadly, without resorting to the coursier plugin, you cannot specify a conflict manager for just one specific artifact.
If that is OK with you, however, you should be able to specify this:
conflictManager := ConflictManager.latestCompatible
libraryDependencies += "io.circe" %% "circe-literal % "[0,)"
You'll have to use the ivy resolver to get that working, though.
dependencyResolution := sbt.librarymanagement.ivy.IvyDependencyResolution(ivyConfiguration.value)
Using that, I got exactly what you wanted:
[info] [SUCCESSFUL ] io.circe#circe-literal_2.12;0.13.0!circe-literal_2.12.jar (304ms)
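Put together, a minimal build.sbt sketch of this approach (using the coordinates from the question) could look like this:
// latest-compatible conflict manager for the whole build
conflictManager := ConflictManager.latestCompatible

libraryDependencies ++= Seq(
  "org.http4s" %% "http4s-circe" % "0.21.1",
  // wildcard range: latestCompatible settles on the circe-literal version
  // compatible with the circe-core that http4s-circe pulls in
  "io.circe" %% "circe-literal" % "[0,)"
)

// conflict managers are honoured by the Ivy-based resolver
dependencyResolution := sbt.librarymanagement.ivy.IvyDependencyResolution(ivyConfiguration.value)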
I'd like to refer to the same version of a plugin in a build.sbt and in project/plugins.sbt. Both need to refer to e.g. val sbtGit = "com.typesafe.sbt" %% "sbt-git" % "1.0.0".
Adding it to project/Dependencies.scala and importing does not work for project/plugins.sbt.
Can I avoid the redundancy of specifying the sbt-git version twice?
A solution suggested in sbt's Gitter channel is to use a symlink.
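For illustration, a minimal sketch of that idea, assuming the shared version lives in project/Dependencies.scala and a symlink at project/project/Dependencies.scala points back at it, so the meta-build (and therefore project/plugins.sbt) compiles the same file:
// project/Dependencies.scala (symlinked from project/project/Dependencies.scala)
object Dependencies {
  val sbtGitVersion = "1.0.0"
}

// project/plugins.sbt sees the copy under project/project/:
addSbtPlugin("com.typesafe.sbt" % "sbt-git" % Dependencies.sbtGitVersion)

// build.sbt and project/*.scala see the copy under project/ and can reference the same value.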
I'd like to create a compile configuration which is the same as the default one but adds a compiler plugin. In my particular case, I want a "dev" configuration that adds the linter plugin (https://github.com/HairyFotr/linter), because the linter slows down compile times and there's no need to run it in production or continuous integration.
Now this is what I tried:
lazy val Dev = config("dev") extend Compile
lazy val root = (project in file(".")).configs(Dev).settings(
  inConfig(Dev)(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12")): _*)
and it should work, since when I inspect dev:libraryDependencies, it is what I expect it to be: it has org.psywerx.hairyfotr:linter:0.1.12:plugin->default(compile). Normally, if I add the library with a "plugin" scope, it does work for the default settings:
libraryDependencies += ("org.psywerx.hairyfotr" %% "linter" % "0.1.12" % "plugin"
It just does not work if I add this under a different configuration, so there must be something else going on here.
This solves the problem, though not exactly in the way that was asked. Here's the full build.sbt:
libraryDependencies ++= Seq(
  "org.psywerx.hairyfotr" %% "linter" % "0.1.14" % "test")

val linter = Command.command("linter")(state => {
  val linterJar = for {
    (newState, result) <- Project.runTask(fullClasspath in Test, state)
    cp <- result.toEither.right.toOption
    linter <- cp.find(
      _.get(moduleID.key).exists(mId =>
        mId.organization == "org.psywerx.hairyfotr" &&
          mId.name == "linter_2.11"))
  } yield linter.data.absolutePath

  val res = Project.runTask(scalacOptions, state)
  res match {
    case Some((newState, result)) =>
      result.toEither.right.foreach { defaultScalacOptions =>
        Project.runTask(compile in Test,
          Project.extract(state).append(
            scalacOptions := defaultScalacOptions ++ linterJar.map(p => Seq(s"-Xplugin:$p")).getOrElse(Seq.empty),
            newState))
      }
    case None => sys.error("Couldn't get defaultScalacOptions")
  }
  state
})

lazy val root = (project in file(".")).configs(Test).settings(commands ++= Seq(linter))
The fact that you return the unmodified state means you don't change the project settings. So if you run sbt linter, you should get your project compiled with the additional scalacOptions, but if you run compile in the same sbt session, it will not use those additional settings.
The tricky thing here is that scalacOptions is actually a TaskKey, not a SettingKey. I don't know why that is, but to get its value you have to run that task. One reason might be that in sbt you cannot make a setting depend on a task, but you can make a task depend on another task. In other words, scalacOptions can depend on another task's value, and maybe internally it does; I haven't checked. If the current answer works for you, I can try to think of a more elegant way of achieving the same result.
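As a small aside, this is what "a task can depend on a task" means in practice; a minimal sketch with a made-up key name (printScalacOptions) that reads the scalacOptions task from another task:
val printScalacOptions = taskKey[Unit]("Prints the resolved scalac options")

printScalacOptions := {
  val opts = scalacOptions.value  // a task may use another task's value; a plain setting could not
  streams.value.log.info(opts.mkString(" "))
}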
EDIT: I modified the code to specify the scalacOptions for the linter plugin itself. Please note that the plugin has to be a managed dependency, not just a downloaded jar, for this solution to work. If you want to have it unmanaged, there's a way, but I won't go into it for now. Additionally, I've taken the liberty of making it also work for test code, for illustration purposes.
Looking at Defaults.scala in the source, it seems the compile command always takes its options from the compile scope. So if I'm correct, you can have only one set of compilation options!
This seems to be confirmed by the fact that scalacOptions behaves the same way, and this is also why I don't see a non-hacky answer for these similar questions:
Different scalac options for different scopes or tasks?
Different compile options for tests and release in SBT?
I'd be happy to be proven wrong.
EDIT: FWIW, one might not be able to define another scalac options profile in the same project, but you could do so in a "different" project:
lazy val dev = (project in file(".")).
  settings(target := baseDirectory.value / "target" / "dev").
  settings(addCompilerPlugin("org.psywerx.hairyfotr" %% "linter" % "0.1.12"): _*)
This has the disadvantage that it has a separate output directory, so it will take more space and, more importantly, will not get incremental compiles between the two projects. However, after spending some time thinking about it, this may be by design. After all, even though linters don't, some scalac compilation options could conceivably change the output. This would make it meaningless to try to keep the metadata for incremental compilation from one set of scalac options to another. Thus different scalac options would indeed require different target directories.
A dependency bar depends on foo 1.2.3, but that version of foo has a bug and I need to use version 1.2.2.
I can do that with force().
libraryDependencies += "foo" %% "foo" % "1.2.2" force()
That method is not recommended by the docs:
Forcing a revision (Not recommended)
Note: Forcing can create logical inconsistencies so it’s no longer recommended.
Does this mean SBT has a different, better way than force() to use a specific version of a dependency? If so, what?
Or am I to infer from the documentation that this entire problem is one that I'm recommended not to have?
You can use dependencyOverrides:
dependencyOverrides += "foo" %% "foo" % "1.2.2"
You're not avoiding "logical inconsistencies" anyway. If you force a version, you have to manually take care of compatibility with other libraries; there's no way out of that.
From the documentation:
Overriding a version
For binary compatible conflicts, sbt provides dependency overrides. They are configured with the dependencyOverrides setting, which is a set of ModuleIDs. For example, the following dependency definitions conflict because spark uses log4j 1.2.16 and scalaxb uses log4j 1.2.17:
libraryDependencies ++= Seq(
  "org.spark-project" %% "spark-core" % "0.5.1",
  "org.scalaxb" %% "scalaxb" % "1.0.0"
)
The default conflict manager chooses the latest revision of log4j, 1.2.17:
show update
[info] compile:
[info] log4j:log4j:1.2.17: ... ...
[info] (EVICTED) log4j:log4j:1.2.16 ...
To change the version selected, add an override:
dependencyOverrides += "log4j" % "log4j" % "1.2.16"
I am looking for a way to control which library dependencies are exported and which are not. Something along these lines:
"org.slf4j" % "slf4j-api" % "1.7.6" doNotExport
or perhaps at the point where the project is imported, like this:
lazy val main = Project(appName, file("."), settings = buildSettings)
.dependsOn(ProjectRef(uri("../Utils"), "Utils").exceptLibraryDependency(organization="org.slf4j"))
Is there anything like this in SBT?
Well, it all comes down to configurations: dependencies in the default configurations are exposed to projects that depend on yours. So similar behavior can be achieved like this:
val compileOnly = config("compileOnly").hide

ivyConfigurations += compileOnly

unmanagedClasspath in Compile ++=
  update.value.select(configurationFilter(compileOnly.name))

libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.6" % compileOnly
Note that this technique was described in an answer to Add a compile time only dependency in sbt.
This question should be closed as a duplicate, but the bounty prevents this.
We are moving into Scala/SBT from a Java/Gradle stack. Our gradle builds were leveraging a task called processResources and some Ant filter thing named ReplaceTokens to dynamically replace tokens in a checked-in .properties file without actually changing the .properties file (just changing the output). The gradle task looks like:
processResources {
  def whoami = System.getProperty( 'user.name' );
  def hostname = InetAddress.getLocalHost().getHostName()
  def buildTimestamp = new Date().format('yyyy-MM-dd HH:mm:ss z')
  filter ReplaceTokens, tokens: [
    "buildsig.version"    : project.version,
    "buildsig.classifier" : project.classifier,
    "buildsig.timestamp"  : buildTimestamp,
    "buildsig.user"       : whoami,
    "buildsig.system"     : hostname,
    "buildsig.tag"        : buildTag
  ]
}
This task locates all the template files in the src/main/resources directory, performs the requisite substitutions and outputs the results at build/resources/main. In other words it transforms src/main/resources/buildsig.properties from...
buildsig.version=#buildsig.version#
buildsig.classifier=#buildsig.classifier#
buildsig.timestamp=#buildsig.timestamp#
buildsig.user=#buildsig.user#
buildsig.system=#buildsig.system#
buildsig.tag=#buildsig.tag#
...to build/resources/main/buildsig.properties...
buildsig.version=1.6.5
buildsig.classifier=RELEASE
buildsig.timestamp=2013-05-06 09:46:52 PDT
buildsig.user=jenkins
buildsig.system=bobk-mbp.local
buildsig.tag=dev
Which, ultimately, finds its way into the WAR file at WEB-INF/classes/buildsig.properties. This works like a champ to record build specific information in a Properties file which gets loaded from the classpath at runtime.
What do I do in SBT to get something like this done? I'm new to Scala / SBT so please forgive me if this seems a stupid question. At the end of the day what I need is a means of pulling some information from the environment on which I build and placing that information into a properties file that is classpath loadable at runtime. Any insights you can give to help me get this done are greatly appreciated.
The sbt-buildinfo plugin is a good option. The README shows an example of how to define custom mappings and mappings that should run on each compile. In addition to the straightforward addition of normal settings like version shown there, you want a section like this:
buildInfoKeys ++= Seq[BuildInfoKey](
  "hostname" -> java.net.InetAddress.getLocalHost().getHostName(),
  "whoami" -> System.getProperty("user.name"),
  BuildInfoKey.action("buildTimestamp") {
    java.text.DateFormat.getDateTimeInstance.format(new java.util.Date())
  }
)
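At runtime you then read the generated object instead of a properties file; a minimal usage sketch, assuming sbt-buildinfo's default buildInfoPackage of buildinfo plus the custom keys above:
import buildinfo.BuildInfo

object BuildSig {
  // each BuildInfoKey becomes a val on the generated object
  def summary: String =
    s"${BuildInfo.version} built by ${BuildInfo.whoami} on ${BuildInfo.hostname} at ${BuildInfo.buildTimestamp}"
}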
Would the following be what you're looking for:
sbt-editsource: An SBT plugin for editing files
sbt-editsource is a text substitution plugin for SBT 0.11.x and greater. In a way, it’s a poor man’s sed(1) for SBT. It provides the ability to apply line-by-line substitutions to a source text file, producing an edited output file. It supports two kinds of edits: variable substitution, where ${var} is replaced by a value, and sed-like regular expression substitution.
This is from Community Plugins.
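If you only need the question's #token# style substitution and would rather not depend on a plugin, the same effect can be hand-rolled with an sbt resource generator. This is a sketch of that general technique, not sbt-editsource's API; the template path buildsig.properties.template is made up:
resourceGenerators in Compile += Def.task {
  val template = baseDirectory.value / "buildsig.properties.template"      // hypothetical template location
  val out      = (resourceManaged in Compile).value / "buildsig.properties"
  val tokens = Map(
    "buildsig.version"   -> version.value,
    "buildsig.user"      -> System.getProperty("user.name"),
    "buildsig.system"    -> java.net.InetAddress.getLocalHost().getHostName(),
    "buildsig.timestamp" -> new java.text.SimpleDateFormat("yyyy-MM-dd HH:mm:ss z").format(new java.util.Date())
  )
  // replace #token# markers, mirroring the Ant ReplaceTokens convention from the question
  val filtered = tokens.foldLeft(IO.read(template)) { case (text, (k, v)) => text.replace(s"#$k#", v) }
  IO.write(out, filtered)
  Seq(out)
}.taskValue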