Are sbt library dependencies order dependent? - sbt

Empirically, the order of declared library dependencies in a build.sbt appears to matter. Is this true? If so, it is worth a brief mention in the sbt library management section of the documentation.

Yes, the order in which resolvers are listed is the order used to resolve dependencies, and the defaults come first if you merely append to them. Therefore you should put less likely candidates after more likely ones. In the following example, the default resolvers are checked first, then Sonatype snapshots, then dependencies only available on the local machine in the .m2 directory:
resolvers ++= Seq(
  Resolver.sonatypeRepo("snapshots"),
  "Local .m2 Repository" at s"file:${Path.userHome.absolutePath}/.m2/repository"
)
The defaults have changed over the years. To be certain you control the resolvers, another way to write this, without relying on defaults, is:
lazy val allResolvers = Seq(
  Resolver.sonatypeRepo("snapshots"),
  "Local .m2 Repository" at s"file:${Path.userHome.absolutePath}/.m2/repository"
)

resolvers := allResolvers
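To verify which resolvers are actually in effect, and in what order sbt will try them, you can inspect them from the sbt shell; fullResolvers also includes the defaults that sbt adds on top of resolvers:
show fullResolvers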

Related

Playframework 2.3.9 dependency override

As of Play 2.3, Play is added as an sbt plugin in my Build.scala as follows:
Project("root", file(".")).enablePlugins(play.PlayScala)
Also have a look at the documentation.
I need a specific dependency updated, namely Fluentlenium (Play 2.3.9 still uses 0.9.3):
"org.fluentlenium" % "fluentlenium-core" % "0.10.3"
How can I replace the old version and replace it with a newer one? Simply adding the library to the libraryDependencies leaves me with both versions in the class path.
Edit: After digging a bit deeper, it seems as if the (new?) dependencyOverrides feature that comes with sbt 0.13.8 could be a solution:
Overriding a version. But also have a look at Conflict Management from the very same documentation.
With this you can override single dependencies, which means that you have to override each transitive dependency manually.
Simply adding the library to the libraryDependencies leaves me with both versions in the class path.
Are you sure about this? sbt (Ivy) should evict the older one if there are multiple versions in the same configuration.
In most cases
libraryDependencies += "org.fluentlenium" % "fluentlenium-core" % "0.10.3"
should be ok, granted that 0.9.x and 0.10.x are binary compatible. If you want to make sure that it is overridden during transitive dependency resolution, dependencyOverrides might be the way to go:
dependencyOverrides += "org.fluentlenium" % "fluentlenium-core" % "0.10.3"

How to unpack dependency jars into the classpath in target?

I am using sbt-osgi to repackage some library dependencies into OSGi packages, and that worked well until I started using Scala.js as well. The library dependencies are defined as normal projects, something like this:
lazy val bonecp = OsgiProject("com.jolbox.bonecp", buddyPolicy = Some("global")) settings
(libraryDependencies += "com.jolbox" % "bonecp" % "0.8.0-rc1")
The OsgiProject function has default OSGi settings plus some implicits for determining what path the project has. When the bundle task is run on these projects, a new jar with the OSGi metadata is created based on the OsgiProject settings. This project just rebundles the bonecp library as an OSGi jar and has no sources. The problem is that since there is no source, there are no files in target/scala-2.11. This causes sbt-osgi to spit out a ton of ignorable errors, but Scala.js is not as forgiving and refuses to do anything with these projects. Is there any good way to unpack the downloaded libraryDependency jars into target/scala-<scalaVersion>?
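One possible approach (a sketch, not from the original thread; it assumes sbt 0.13 syntax, and the task name unpackDeps is made up for illustration) is a custom task that unzips every jar on the compile dependency classpath into the classes directory under target/scala-<scalaVersion>:
lazy val unpackDeps = taskKey[Unit]("Unpack dependency jars into the compiled classes directory")

unpackDeps := {
  // target/scala-<scalaVersion>/classes for the current project
  val dest = (classDirectory in Compile).value
  // every jar on the compile classpath, including the rebundled library dependencies
  val jars = (dependencyClasspath in Compile).value.files.filter(_.getName.endsWith(".jar"))
  jars.foreach(jar => IO.unzip(jar, dest))
}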

Create standalone jar using SBT

I was a heavy Maven user and now I'm gradually using SBT for some of my projects.
I'd like to know how could I use SBT to create a standalone Java project? This project should be packaged as a JAR file and this JAR file would be used as a dependency in another SBT project.
In Maven, I could tell in my pom.xml what type of artifact it should produce when I build it. Is there something similar that I can do in SBT?
There is a difference between making a project standalone and making it usable as a dependency of another project. In the first case, you would use a plugin such as sbt-assembly. What it does is create one jar file containing the project class files along with all of its dependencies. If you write an application, what you get is a double-clickable jar that you can execute from anywhere.
If you want to use your project A as a dependency for another project B, you have different options. You could just package the class files of A, using sbt package (as in Channing Walton's answer), and then drop the resulting .jar file in the lib directory of project B. However, if A also requires libraries, you must make sure that they end up in project B's libraries as well.
A better approach is to publish your project. You can do that purely on your local machine, using sbt publish-local. That will store the jar produced by package in a special local directory which can be accessed from sbt in another project, along with a POM file that contains the dependencies of A. It will use the group ID (organization), artifact ID (name), and version of your project A. For example, in build.sbt:
name := "projecta"
version := "0.1.0-SNAPSHOT"
organization := "com.github.myname"
scalaVersion := "2.10.3"
publishMavenStyle := true
After publishing with sbt publish-local, you can add the following dependency to your project B:
libraryDependencies += "com.github.myname" %% "projecta" % "0.1.0-SNAPSHOT"
If you have a pure Java project, you can omit the Scala version suffix, i.e. in Project A:
crossPaths := false
autoScalaLibrary := false
And then in Project B:
libraryDependencies += "com.github.myname" % "projecta" % "0.1.0-SNAPSHOT"
(using only one % character between group and artifact ID).
More on publishing in the sbt documentation.
'sbt package' will produce a jar file.
If you want it to be executable you need to add the following to your .sbt config:
mainClass in Compile := Some("your.main.Class")
Sure, you can use the 'sbt package' command; it creates a jar file, but this jar will not contain any dependencies. To run it, you need to specify the classpath of the libraries.
In your case you want a standalone runnable file, so you need to include the dependencies.
To do this you can use 'assembly' plugin for SBT, see https://github.com/sbt/sbt-assembly/
Afterwards you can just run the 'sbt assembly' command; it produces a fat jar file with all dependencies that you can deploy and run anywhere, at any time.
For details see this article
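As a minimal sketch (the plugin version is illustrative; check the sbt-assembly README for one that matches your sbt release), the setup amounts to adding the plugin in project/plugins.sbt and, optionally, a main class for the fat jar's manifest in build.sbt:
// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")

// build.sbt (optional): entry point recorded in the fat jar's manifest
mainClass in assembly := Some("your.main.Class")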
publishLocal
builds the artifact and publishes it to the local Ivy repository, making it available to your other local projects.
publishM2
same as above, but the artifact is published to the local Maven repository instead of the Ivy one.
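If the consuming project should pick up an artifact published with publishM2, it may also need the local Maven repository as a resolver (a one-line sketch, not from the answers above):
resolvers += Resolver.mavenLocal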
I think the easiest way to produce a stand-alone jar with your project in it sadly does not lie inside sbt.
I personally use my IDE, IntelliJ, to make the jar (through the 'Build Artifact' feature).
Thanks to IntelliJ I can easily choose which libraries to include in the jar (for instance the Scala standard library).
IMHO, this is by far the simplest method to get an executable jar for your project.
If you include the Scala standard library you can run your jar with the "java -jar" command; if you don't, you have to run it with "scala" on a machine that has the correct version of Scala installed.

Handling unmanaged classpath jars in a library using SBT so that a dependent project can access them

I'm writing a library which depends on code (let's call it foo.jar) which is only available as a binary jar. As is standard, I'm putting this in the lib/ directory so SBT will treat it as an unmanaged dependency. This is fine so far.
However, since this is a library, I'd like to be able to publish it so that other projects which depend on it also have access to the unmanaged code in foo.jar without having to locate it manually. I originally thought I could use a fat jar plugin such as sbt-assembly to create a jar with the dependencies, but that doesn't affect what is actually published using sbt publish-local; it only creates a fat jar when you run sbt assembly. Is there some standard, simple way to handle this? It seems like a bad idea for every library which uses unmanaged dependencies to break when used by other projects downstream, so I wonder if I'm missing something obvious.
I don't know if that's a good use of sbt-assembly, since other libraries could depend on a different version of foo.jar etc.
One way to work around it is to publish foo.jar in a Maven repository yourself. Some people in the Scala and/or sbt community have been talking about Bintray. It's still in beta, but it looks promising if you want some jars published.
You might be able to get the result you want by manipulating the mappings in (Compile, packageBin) to include the files you want your packaged jar to have (publish uses the output from packageBin). This technique will allow you to include absolutely any file you want within the jar. The official sbt doc is here: http://www.scala-sbt.org/0.12.3/docs/Howto/package.html#contents
As an example, consider the common case of including a .properties file within your jar. Let's say you need to include "messages.properties" under the path "com/bigco/messages.properties" in your packaged jar. And let's say that this file is under src/main/scala/ ... You can add the following to your build.sbt:
mappings in (Compile, packageBin) <+= baseDirectory map { base =>
  (base / "src" / "main" / "scala" / "com" / "bigco" / "messages.properties") -> "com/bigco/messages.properties"
}
To attempt to answer your original question, you could unzip foo.jar and add each one of the class files within to the packaged jar, according to their correct package paths. So something similar to
mappings in (Compile, packageBin) <+= baseDirectory map { base =>
  (base / path / to / unzipped / file.class) -> "path/to/unzipped/file.class"
  ...
}
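A more concrete version of the same idea, as a sketch only (it assumes foo.jar sits in lib/ and uses the same sbt 0.12/0.13 map syntax as above; the foo-unpacked directory name is made up), would unzip the jar and map every extracted file to a jar entry with the matching relative path:
mappings in (Compile, packageBin) <++= (baseDirectory, target) map { (base, tgt) =>
  // unzip foo.jar into a scratch directory under target/
  val unzipDir = tgt / "foo-unpacked"
  val extracted = IO.unzip(base / "lib" / "foo.jar", unzipDir)
  // the path relative to the unzip directory becomes the entry name inside the packaged jar
  extracted.map(f => f -> IO.relativize(unzipDir, f).get).toSeq
}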
Or you might be able to get away with simply including foo.jar at the root of the packaged jar like so:
mappings in (Compile, packageBin) <+= baseDirectory map { base =>
  (base / "lib" / "foo.jar") -> "foo.jar"
}

SBT doesn't seem to download transitive dependencies with a custom repository?

I'm new to SBT, using version 1.0 and a custom repository, and I've set the "retrieveManaged" flag mentioned here, but it seems that SBT only downloads the directly requested JARs, not any of the JARs upon which those JARs depend. And yes, the repository is using the customary default format described in the answers here (though SBT/Ivy doesn't seem capable of retrieving snapshots either, but that's a separate problem, I expect). The repository does not require any kind of authentication, FYI.
Here's a slightly generified version of my build.sbt file:
name := "MyProject"
organization := "com.myorg"
version := "0.1"
scalaVersion := "2.9.0"
scalacOptions += "-deprecation"
retrieveManaged := true
resolvers += Resolver.url("myorg", url("http://host.com//content/groups/public"))
libraryDependencies += "com.myorg" % "otherproject" % "1.0"
fork in run := true
The requested "otherproject" JAR file loads fine, but SBT/Ivy seems to have no interest in opening up its POM and downloading the other JARs it needs to operate. This seems like it should be a fairly basic function (Maven does it, for example) but I have no idea how to convince SBT/Ivy to do so. (And the documentation assures us that SBT is, in fact, supposed to do this: "By default, these declarations fetch all project dependencies, transitively".)
I believe I must be doing something wrong, but have no idea -- given how simple and vanilla this basic configuration is -- what it could be.
Standard, Maven-style repositories are declared like:
resolvers += "myorg" at "http://host.com/content/groups/public"
More details are at the Library Management page you linked to and on the Resolvers page.
Typically, one only uses Resolver.url when specifying non-standard layouts.
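For completeness, the non-standard case would look roughly like this (a sketch; the URL is a placeholder and the repository would need to use an Ivy-style layout):
resolvers += Resolver.url("myorg-ivy", url("http://host.com/ivy-releases/"))(Resolver.ivyStylePatterns)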
