Has anyone published an sbt-native-packager produced artifact (tgz in my case) using sbt-aether-deploy to a nexus repo? (I need this for the timestamped snapshots, specifically the "correct" version tag in nexus' artifact-resolution REST resource).
I can do one or the other, but I can't figure out how to add the packagedArtifacts in Universal to the artifacts that sbt-aether-deploy deploys, so that both are published.
I suspect the path to pursue would be either to addArtifact() the packagedArtifacts in Universal, or to create another AetherArtifact and then override/replace the deployTask to use that AetherArtifact?
Any help much appreciated.
I am the author of the sbt-aether-deploy plugin, and I just came across this post.
import aether.AetherKeys._
crossPaths := false //needed if you want to remove the scala version from the artifact name
enablePlugins(JavaAppPackaging)
aetherArtifact := {
  val artifact = aetherArtifact.value
  artifact.attach((packageBin in Universal).value, "dist", "zip")
}
This will also publish the other main artifact.
If you want to disable publishing of the main artifact, then you will need to rewrite the artifact coordinates. Maven requires a main artifact.
I have added a way to replace the main artifact for this purpose, but I can now see that way is kind of flawed. It will still assume that the artifact is published as a jar file. The main artifact type is locked down to that, since the POM packaging is set to jar by default by SBT.
If this is an app, then that limitation is probably OK, since Maven will never resolve that into an artifact.
The "proper" way in Maven terms is to add a classifier to the artifact and change the "packaging" in the POM file to "pom". We will see if I get around to changing that particular part.
Ok, I think I got it, amazingly enough. If there's a better way to do it, I'd love to hear it. Not loving that blind Option.get there...
val tgzCoordinates = SettingKey[MavenCoordinates]("the maven coordinates for the tgz")

lazy val myPackagerSettings = packageArchetype.java_application ++ deploymentSettings ++ Seq(
  publish <<= publish.dependsOn(publish in Universal),
  publishLocal <<= publishLocal.dependsOn(publishLocal in Universal)
)

lazy val defaultSettings = buildSettings ++ Publish.settings ++ Seq(
  scalacOptions in Compile ++= Seq("-encoding", "UTF-8", "-target:jvm-1.7", "-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls"),
  testOptions in Test += Tests.Argument("-oDF")
)

lazy val myAetherSettings = aetherSettings ++ aetherPublishBothSettings

lazy val toastyphoenixProject = Project(
  id = "toastyphoenix",
  base = file("."),
  settings = defaultSettings ++ myPackagerSettings ++ myAetherSettings ++ Seq(
    name in Universal := name.value + "_" + scalaBinaryVersion.value,
    packagedArtifacts in Universal ~= { _.filterNot { case (artifact, file) => artifact.`type`.contains("zip") } },
    libraryDependencies ++= Dependencies.phoenix,
    tgzCoordinates := MavenCoordinates(organization.value + ":" + (name in Universal).value + ":tgz:" + version.value).get,
    aetherArtifact <<= (tgzCoordinates, packageZipTarball in Universal, makePom in Compile, packagedArtifacts in Universal) map {
      (coords: MavenCoordinates, mainArtifact: File, pom: File, artifacts: Map[Artifact, File]) =>
        createArtifact(artifacts, pom, coords, mainArtifact)
    }
  )
)
I took Peter's solution and reworked it slightly, avoiding the naked Option.get by creating the MavenCoordinates directly:
import aether.MavenCoordinates
import aether.Aether.createArtifact

name := "mrb-test"
organization := "me.mbarton"
version := "1.0"

crossPaths := false

packageArchetype.java_application

publish <<= (publish) dependsOn (publish in Universal)
publishLocal <<= (publishLocal) dependsOn (publishLocal in Universal)

aetherPublishBothSettings

aetherArtifact <<= (organization, name in Universal, version, packageBin in Universal, makePom in Compile, packagedArtifacts in Universal) map {
  (organization, name, version, binary, pom, artifacts) =>
    val nameWithoutVersion = name.replace(s"-$version", "")
    createArtifact(artifacts, pom, MavenCoordinates(organization, nameWithoutVersion, version, None, "zip"), binary)
}
The nameWithoutVersion replace works around SBT native packager including the version in the artifact name:
Before: me/mbarton/mrb-test-1.0/1.0/mrb-test-1.0.zip
After: me/mbarton/mrb-test/1.0/mrb-test-1.0.zip
crossPaths := false avoids the Scala version suffix on the artifact name.
Related
I have an SBT project which pulls in dependencies. I only want to pull in the direct dependencies - not any transitive dependencies. I'd like to find the filename of the dependency that's pulled in, so that I can copy it somewhere.
e.g. given a build.sbt file with the following contents:
libraryDependencies += "org.eclipse.jetty" % "jetty-server" % "9.4.28.v20200408"
I would like to know where is the jetty-server jar on the file system.
I have tried adding the following to my build.sbt file:
lazy val mytaskKey: TaskKey[Unit] = TaskKey[Unit]("mytask")

def mytask: Def.Setting[Task[Unit]] = mytaskKey := {
  val updateReport = update.value
  updateReport.allFiles foreach { f =>
    println(f)
  }
}

mytask
When I run this, I get a full list of dependencies:
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-library.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/eclipse/jetty/jetty-server/9.4.28.v20200408/jetty-server-9.4.28.v20200408.jar
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-compiler.jar
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-reflect.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/scala-lang/modules/scala-xml_2.12/1.0.6/scala-xml_2.12-1.0.6.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/jline/jline/2.14.6/jline-2.14.6.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/fusesource/jansi/jansi/1.12/jansi-1.12.jar
I don't want that full list - I just want the jetty jar. i.e.
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/eclipse/jetty/jetty-server/9.4.28.v20200408/jetty-server-9.4.28.v20200408.jar
How might I get this list?
Yes, there is a way, with either the intransitive() or notTransitive() options on the dependency. It's documented here.
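If what you ultimately want is just the path of that one jar, a sketch building on your own task (the directJettyJar key name here is made up) is to filter the update report by module instead of printing every file:
// A sketch: narrow the update report down to one module.
lazy val directJettyJar = taskKey[Seq[File]]("files resolved for the jetty-server dependency")

directJettyJar := {
  // moduleFilter matches entries in the report by organization/name/revision globs
  update.value.matching(
    moduleFilter(organization = "org.eclipse.jetty", name = "jetty-server")
  )
}
Running show directJettyJar should then print only the jetty-server jar path.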
Good day! Please help me. I start this example with:
sbt> run
Everything runs fine. Then:
sbt> package
This builds a jar file, but when I double-click it I get the message:
Error: A JNI error has occurred, please check your installation and try again.
Scala version: 2.12.4. JVM: 1.8.0_152. ScalaFX: 8.0.102-R11
hello.scala:
package hello

import scalafx.Includes._
import scalafx.application.JFXApp
import scalafx.application.JFXApp.PrimaryStage
import scalafx.scene.Scene
import scalafx.scene.paint.Color._
import scalafx.scene.shape.Rectangle

object HelloStage extends JFXApp {
  stage = new JFXApp.PrimaryStage {
    title.value = "Hello Stage"
    width = 600
    height = 450
    scene = new Scene {
      fill = LightGreen
      content = new Rectangle {
        x = 25
        y = 40
        width = 100
        height = 100
        fill <== when(hover) choose Green otherwise Red
      }
    }
  }
}
build.sbt:
name := "Scala"
organization := "scalafx.org"
version := "1.0.5"
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-Xcheckinit", "-encoding", "utf8")
resourceDirectory in Compile := (scalaSource in Compile).value
libraryDependencies ++= Seq(
  "org.scalafx" %% "scalafx" % "8.0.102-R11"
)
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
fork := true
This is a Java classpath issue. When you try to execute the resulting JAR file, it cannot find the jar files that it needs to run.
Try the following:
Firstly, copy & paste the following to project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
This loads the sbt-assembly plugin, which will create a fat JAR file, containing all of the dependencies.
Secondly, change your build.sbt file to the following:
name := "Scala"
organization := "scalafx.org"
version := "1.0.5"
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-Xcheckinit", "-encoding", "utf8")
libraryDependencies += "org.scalafx" %% "scalafx" % "8.0.102-R11"
fork := true
mainClass in assembly := Some("hello.HelloStage")
This simplifies what you originally had. The macro paradise compiler plugin is not required, and I also removed the slightly odd resourceDirectory setting.
To create the fat JAR, run the command:
sbt
sbt> assembly
The JAR file you're looking for is most likely located at target/scala-2.12/Scala-assembly-1.0.5.jar. You should now be good to go...
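You can also launch it from a terminal, which will show any stack trace that a double-click swallows:
java -jar target/scala-2.12/Scala-assembly-1.0.5.jar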
Alternatively, you can add all the necessary JAR files to your classpath. Another plugin that can help with that (you probably shouldn't use it together with sbt-assembly) is sbt-native-packager, which creates installers for you. You can then install your app and run it like a regular application.
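A minimal sketch of that alternative (the plugin version below is illustrative): add sbt-native-packager to project/plugins.sbt and enable the app packaging archetype in build.sbt:
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.4")

// build.sbt
enablePlugins(JavaAppPackaging)
Running universal:packageBin then produces an archive with start scripts that set up the classpath for you, so you don't need a fat JAR at all.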
I'm building a Docker image with a fat jar. I use the sbt-assembly plugin to build the jar, and the sbt-native-packager to build the Docker image. I'm not very familiar with SBT and am running into the following issues.
I'd like to declare a dependency on the assembly task from the docker:publish task, so that the fat jar is created before it's added to the image. I did as instructed in the docs, but it's not working: assembly doesn't run unless I invoke it explicitly.
publish := (publish dependsOn assembly).value
One of the steps in building the image is copying the fat jar. Since the assembly plugin creates the jar at target/scala_whatever/projectname-assembly-X.X.X.jar, I need to know the exact scala_whatever and the jar name. assembly seems to have a key assemblyJarName, but I'm not sure how to access it. I tried the following, which fails:
Cmd("COPY", "target/scala*/*.jar /app.jar")
Help!
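A quick hedged aside before the self-answer below: the snippet in the question rewires the plain publish task, while the Docker publish task lives in the Docker scope, so the dependency was probably attached to the wrong key. One way to express it directly (a sketch, not the accepted approach):
// Make docker:publish depend on assembly by scoping publish to Docker
publish in Docker := ((publish in Docker) dependsOn assembly).value

// The jar name and full output path are ordinary sbt-assembly keys:
//   (assemblyJarName in assembly).value
//   (assemblyOutputPath in assembly).value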
Answering my own questions, the following works:
enablePlugins(JavaAppPackaging, DockerPlugin)
assemblyMergeStrategy in assembly := {
  case x => {
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    val strategy = oldStrategy(x)
    if (strategy == MergeStrategy.deduplicate) MergeStrategy.first
    else strategy
  }
}

// Remove all jar mappings in universal and append the fat jar
mappings in Universal := {
  val universalMappings = (mappings in Universal).value
  val fatJar = (assembly in Compile).value
  val filtered = universalMappings.filter {
    case (file, name) => !name.endsWith(".jar")
  }
  filtered :+ (fatJar -> ("lib/" + fatJar.getName))
}

dockerRepository := Some("username")

import com.typesafe.sbt.packager.docker.{Cmd, ExecCmd}

dockerCommands := Seq(
  Cmd("FROM", "username/spark:2.1.0"),
  Cmd("WORKDIR", "/"),
  Cmd("COPY", "opt/docker/lib/*.jar", "/app.jar"),
  ExecCmd("ENTRYPOINT", "/opt/spark/bin/spark-submit", "/app.jar")
)
I completely overwrite the docker commands because the defaults add a couple of scripts that I don't need, since I overwrite the entrypoint as well. Also, the default workdir is /opt/docker, which is not where I want to put the fat jar.
Note that the default commands are shown by show dockerCommands in the sbt console.
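If you only need to add or tweak a command rather than replace everything, appending also works (a sketch; the ENV value is illustrative):
// Keep the generated commands and append to them instead of overwriting
dockerCommands ++= Seq(
  Cmd("ENV", "SPARK_HOME", "/opt/spark")
)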
I have a Scala project that is divided into several subprojects:
lazy val core: Seq[ProjectReference] = Seq(common, json_scalaz7, json_scalaz)
I'd like to make the core lazy val conditional on the Scala version I'm currently using, so I tried this:
lazy val core2: Seq[ProjectReference] = scalaVersion {
  case "2.11.0" => Seq(common, json_scalaz7)
  case _        => Seq(common, json_scalaz7, json_scalaz)
}
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0 (when the value of the scalaVersion setting is "2.11.0").
This however gives me the following compilation error:
[error] /home/diego/work/lift/framework/project/Build.scala:39: type mismatch;
[error] found : sbt.Project.Initialize[Seq[sbt.Project]]
[error] required: Seq[sbt.ProjectReference]
[error] lazy val core2: Seq[ProjectReference] = scalaVersion {
[error] ^
[error] one error found
Any idea how to solve this?
Update
I'm using sbt version 0.12.4
This project is the Lift project, which compiles against "2.10.0", "2.9.2", "2.9.1-1", "2.9.1" and now we are working on getting it to compile with 2.11.0. So creating a compile all task would not be practical, as it would take a really long time.
Update 2
I'm hoping there is something like this:
lazy val scala_xml = "org.scala-lang.modules" %% "scala-xml" % "1.0.1"
lazy val scala_parser = "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.1"
...
lazy val common =
  coreProject("common")
    .settings(
      description := "Common Libraties and Utilities",
      libraryDependencies ++= Seq(slf4j_api, logback, slf4j_log4j12),
      libraryDependencies <++= scalaVersion {
        case "2.11.0" => Seq(scala_xml, scala_parser)
        case _        => Seq()
      }
    )
but for the projects list
Note how, depending on the Scala version, I add the scala_xml and scala_parser (scala-parser-combinators) libraries.
You can see the complete build file here
Cross building a project
Simply speaking, I'd like to exclude json_scalaz for Scala 2.11.0
The built-in support in sbt for this is called cross building, which is described in Cross-Building a Project. Here's an excerpt from that section, with a bit of correction:
Define the versions of Scala to build against in the crossScalaVersions setting. For example, in a .sbt build definition:
crossScalaVersions := Seq("2.10.4", "2.11.0")
To build against all versions listed in crossScalaVersions, prefix the action to run with +. For example:
> +compile
Multiple-project builds
sbt also has built-in support to aggregate tasks across multiple projects, which is described in Aggregation. If what you eventually need is normal built-in tasks like compile and test, you could set up a dummy aggregate without json_scalaz.
lazy val withoutJsonScalaz = (project in file("without-json-scalaz"))
  .aggregate(liftProjects filterNot { _ == json_scalaz }: _*)
From the shell, you should be able to use this as:
> ++2.11.0
> project withoutJsonScalaz
> test
Getting values from multiple scopes
Another feature you might be interested in is ScopeFilter. This has the ability to traverse multiple projects beyond usual aggregation and cross building. You would need to create a setting whose type is ScopeFilter and set it based on scalaBinaryVersion.value. With scope filters, you can do:
val coreProjects = settingKey[ScopeFilter]("my core projects")
val compileAll = taskKey[Seq[sbt.inc.Analysis]]("compile all")

coreProjects := {
  scalaBinaryVersion.value match {
    case "2.10" => ScopeFilter(inProjects(common, json_scalaz7, json_scalaz))
  }
}

compileAll := compileAllTask.value

lazy val compileAllTask = Def.taskDyn {
  val f = coreProjects.value
  (compile in Compile) all f
}
In this case compileAll would have the same effect as +compile, but you could aggregate the result and do something interesting like sbt-unidoc.
I am trying to include the breeze-natives dependency only when packaging the app (universal:packageBin and debian:packageBin), while always including the breeze dependency. Here is what I came up with:
val breezeDependencySettings = {
  val breezeUniversalNativesDependency = libraryDependencies in Universal += D.breezeNatives
  val breezeDebianNativesDependency = libraryDependencies in Debian += D.breezeNatives
  val breezeDependency = libraryDependencies += D.breeze
  Seq(breezeUniversalNativesDependency, breezeDebianNativesDependency, breezeDependency)
}
And in the project that I want to package, I use
settings = (mySettings) ++ SbtNativePackager.packageArchetype.java_server ++
Dependencies.breezeDependencySettings
However, the breeze-natives dependency is not included in the final package created by universal:packageBin. (breeze is included correctly though)
What am I doing wrong?
Not 100% clear on your requirement, but have you tried exportJars := true?
See the excerpt from my build in my question here: https://stackoverflow.com/questions/23035100/how-to-remove-version-from-artifactid-generated-by-sbt-native-packager
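For reference, a minimal sketch of that suggestion. exportJars is a stock sbt setting that makes the project expose its packaged jar (rather than its class directory) on exported classpaths; whether it pulls breeze-natives into the archive in this particular setup is untested:
exportJars := true

// Inspect what will actually end up in the package:
//   sbt> show universal:mappings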