How to extract a dependency jar to a specific folder during compilation? - sbt

These are my project's dependencies:
libraryDependencies ++= Seq(
  javaJdbc,
  javaEbean,
  cache,
  javaWs,
  "com.company" % "common_2.11" % "2.3.3"
)
In com.company.common_2.11-2.3.3 there is a jar file common_2.11-2.3.3-adtnl.jar.
How can I tell SBT, in build.sbt, to extract its contents to a specific folder in my project during compilation?

Use the following in build.sbt:
def unpackjar(jar: File, to: File): File = {
  println(s"Processing $jar and saving to $to")
  IO.unzip(jar, to)
  jar
}
resourceGenerators in Compile += Def.task {
  val jar = (update in Compile).value
    .select(configurationFilter("compile"))
    .filter(_.name.contains("common"))
    .head
  val to = (target in Compile).value / "unjar"
  unpackjar(jar, to)
  Seq.empty[File]
}.taskValue
It assumes that "common" is unique among the names of all your dependencies; otherwise you'd need to adjust the filter.
It also assumes that you don't really need the files at compile time, but a bit later when package is called. If you do want them during compilation, move the code inside Def.task {...} into a compile in Compile block like this:
compile in Compile <<= (compile in Compile).dependsOn(Def.task {
  val jar = (update in Compile).value
    .select(configurationFilter("compile"))
    .filter(_.name.contains("common"))
    .head
  val to = (target in Compile).value / "unjar"
  unpackjar(jar, to)
  Seq.empty[File]
})
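On sbt 1.x, where the <<= operator no longer exists, roughly the same hook can be written with slash syntax (a sketch, under the same assumption that "common" uniquely identifies the dependency):
Compile / compile := ((Compile / compile) dependsOn Def.task {
  val jar = (Compile / update).value
    .select(configurationFilter("compile"))
    .filter(_.name.contains("common"))
    .head
  val to = (Compile / target).value / "unjar"
  unpackjar(jar, to)
}).value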

Related

When using a Scala compiler plugin in sbt, how do you set a library dependency for the plugin?

I'm using a compiler plugin I wrote that depends on the Kryo serialization library. When attempting to use my plugin, I set this up in build.sbt (top-level) like this:
lazy val dependencies =
  new {
    val munit = "org.scalameta" %% "munit" % "0.7.12" % Test
    val kyro = "com.esotericsoftware" % "kryo" % "5.0.0-RC9"
  }

lazy val commonDependencies = Seq(
  dependencies.kyro,
  dependencies.munit
)

lazy val root = (project in file("."))
  .settings(
    libraryDependencies ++= commonDependencies,
    Test / parallelExecution := false
  )

addCompilerPlugin("co.blocke" %% "dotty-reflection" % reflectionLibVersion)
But when I compile my target project, I get a java.lang.NoClassDefFoundError saying it can't find Kryo. I've added Kryo to my dependencies, but since this is for the compiler, not my app, it isn't being picked up.
How can I properly tell sbt about a dependency my plugin needs?

scalaFX: executing a standalone jar file

Good day! Please help me. I start up this example:
sbt> run
Everything runs fine. Then:
sbt> package
This builds a jar file, but when I double-click it I get the message:
Error: A JNI error has occurred, please check your installation and try again.
Scala version: 2.12.4. JVM: 1.8.0_152. ScalaFX: 8.0.102-R11
hello.scala:
package hello

import scalafx.Includes._
import scalafx.application.JFXApp
import scalafx.application.JFXApp.PrimaryStage
import scalafx.scene.Scene
import scalafx.scene.paint.Color._
import scalafx.scene.shape.Rectangle

object HelloStage extends JFXApp {
  stage = new JFXApp.PrimaryStage {
    title.value = "Hello Stage"
    width = 600
    height = 450
    scene = new Scene {
      fill = LightGreen
      content = new Rectangle {
        x = 25
        y = 40
        width = 100
        height = 100
        fill <== when(hover) choose Green otherwise Red
      }
    }
  }
}
build.sbt:
name := "Scala"
organization := "scalafx.org"
version := "1.0.5"
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-Xcheckinit", "-encoding", "utf8")
resourceDirectory in Compile := (scalaSource in Compile).value
libraryDependencies ++= Seq(
  "org.scalafx" %% "scalafx" % "8.0.102-R11"
)
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
fork := true
This is a Java classpath issue. When you try to execute the resulting JAR file, it cannot find the jar files that it needs to run.
Try the following:
Firstly, copy & paste the following to project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
This loads the sbt-assembly plugin, which will create a fat JAR file, containing all of the dependencies.
Secondly, change your build.sbt file to the following:
name := "Scala"
organization := "scalafx.org"
version := "1.0.5"
scalaVersion := "2.12.4"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-Xcheckinit", "-encoding", "utf8")
libraryDependencies += "org.scalafx" %% "scalafx" % "8.0.102-R11"
fork := true
mainClass in assembly := Some("hello.HelloStage")
This simplifies what you originally had. The macro paradise compiler plugin is not required, and I also removed the slightly odd resourceDirectory setting.
To create the fat JAR, run the command:
sbt
sbt> assembly
The JAR file you're looking for is most likely located at target/scala-2.12/Scala-assembly-1.0.5.jar. You should now be good to go...
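Assuming the assembly step succeeded, the fat JAR should also run directly from a terminal (path taken from the note above):
java -jar target/scala-2.12/Scala-assembly-1.0.5.jar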
Alternatively, you can add all the necessary JAR files to your classpath. Another plugin that can help with that (you probably shouldn't use it together with sbt-assembly) is sbt-native-packager, which creates installers for you. You can then install your app and run it like a regular application.
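A rough sketch of that alternative (the plugin version here is an assumption; check the sbt-native-packager documentation for a current one):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.4")

// build.sbt
enablePlugins(JavaAppPackaging)
Then sbt stage produces launch scripts under target/universal/stage/bin, and sbt universal:packageBin zips the staged application.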

Annotation processor output path in sbt?

Is there any way to set the Annotation processor output path in sbt?
Currently it generates the files into:
target/scala-2.11/classes
However I would prefer
target/scala-2.11/src_managed
Something like this works:
// in build.sbt:
// create the managed source directory before compile
compile in Compile <<= (compile in Compile) dependsOn Def.task {
  (managedSourceDirectories in Compile).value.head.mkdirs()
}
// tell the Java compiler to write generated source files to the managed source directory
javacOptions in Compile ++= Seq("-s", (managedSourceDirectories in Compile).value.head.getAbsolutePath)
It's slightly more ergonomic to configure sourceManaged instead of managedSourceDirectories.
Add this to an sbt module's settings in build.sbt:
Compile / javacOptions ++= Seq("-s", (Compile / sourceManaged).value.getAbsolutePath)
You can also drop this plugin into the project folder:
package custom.sbt

import sbt.{Def, _}
import sbt.Keys._

object Compiler extends AutoPlugin {
  override def trigger = allRequirements

  override def buildSettings: Seq[Def.Setting[_]] = Seq(
    Compile / javacOptions ++= Seq("-source", "11", "-target", "11"),
    scalacOptions ++= Seq(
      "-target:11" // target JRE 11
    )
  )

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    Compile / javacOptions ++= Seq("-s", (Compile / sourceManaged).value.getAbsolutePath)
  )
}
In sbt 0.13.15:
compile := ((compile in Compile) dependsOn Def.task {
  (sourceManaged in Compile).value.mkdirs()
}).value
javacOptions in Compile ++= Seq("-s", s"${sourceManaged.value}/main")
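On sbt 1.x, where <<= is gone, the same two settings can be written roughly like this (a sketch; scope it to your module as needed):
// make sure the managed source directory exists before javac runs
Compile / compile := ((Compile / compile) dependsOn Def.task {
  IO.createDirectory((Compile / sourceManaged).value)
}).value

// point the annotation processor's -s output at the managed source directory
Compile / javacOptions ++= Seq("-s", (Compile / sourceManaged).value.getAbsolutePath)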

Conditional libraries in build.sbt

Using build.sbt, I'm trying to do something like this:
if (condition) {
  libraryDependencies += ... // library from Maven
} else {
  unmanagedJars in Compile += ... // local library instead
}
However, build.sbt doesn't like this at all. I've been able to accomplish this using side effects, but that's obviously undesirable. Any advice would be appreciated. Thanks.
You can do the following:
val additionalLibraryDependencies = Seq(...)
val additionalUnmanagedJars = Seq(...)

libraryDependencies ++= (
  if (condition) additionalLibraryDependencies
  else Seq.empty
)

unmanagedJars in Compile ++= (
  if (!condition) additionalUnmanagedJars
  else Seq.empty
)
To set the condition from the command line, read a system property:
val someValueFromCommandLine = System.getProperty("key.of.the.value", "false")

if (someValueFromCommandLine.equals("true")) {
  ...
}
You can pass it like this: sbt -Dkey.of.the.value=true
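Putting the pieces together, a minimal build.sbt sketch might look like this (the property name, coordinates, and jar path are placeholders, not from the question):
// read a flag passed as: sbt -Duse.local.lib=true
val useLocalLib = sys.props.getOrElse("use.local.lib", "false").toBoolean

// managed dependency from Maven when the flag is off
libraryDependencies ++= (
  if (!useLocalLib) Seq("com.example" % "somelib" % "1.0")
  else Seq.empty
)

// local jar instead when the flag is on
unmanagedJars in Compile ++= (
  if (useLocalLib) Seq(Attributed.blank(file("lib/somelib.jar")))
  else Seq.empty
)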
This might be easier to do using a Build.scala build definition.
Here is an example build.sbt file (this is optional):
import sbt._
import sbt.Keys._

libraryDependencies ++= Seq(
  "org.postgresql" % "postgresql" % "9.3-1101-jdbc41",
  "redis.clients" % "jedis" % "2.5.2"
)
Then create another file your_project_home/project/Build.scala
import sbt._
import Keys._

object BuildSettings {
  val condition = false

  val buildSettings = Defaults.defaultSettings ++ Seq(
    version := "1.0.0",
    if (condition) libraryDependencies ++= Seq("commons-codec" % "commons-codec" % "1.9")
    else unmanagedJars in Compile += file("/tmp/nil")
  )
}

object MyBuild extends Build {
  import BuildSettings._
  lazy val root: Project = Project("root", file("."), settings = buildSettings)
}
Your project structure should look like this:
.
├── build.sbt
├── project
│   └── Build.scala
└── target
You can make the "condition" whatever you need (here I just set it to false). The libraryDependencies defined inside build.sbt will always be included; the ones defined in Build.scala depend on the "condition".
Verify that everything works as expected from the command line using:
sbt "inspect libraryDependencies"

Unzipping an artifact with SBT

As part of my project build, I'd like to unzip a zip artifact of a managed dependency into a specific directory of the project. Before starting to use SBT I was doing this via an ANT script that would fetch the zip artifact from a maven dependency and unzip it.
My questions are:
1. How do I specify that I want to depend on the zip dependency? I have defined it like so:
"eu.delving" % "sip-creator" % "0.4.6-SNAPSHOT"
but this doesn't fetch the zip artifact.
2. Where / how do I hook into the build process to run the unzip (and how do I refer to the artifact file in that context)?
If you want to extract a set of managed dependencies, the code below should work. I tested it in sbt 0.12.0, but it should also work with 0.11.x.
import sbt._
import Keys._
import Classpaths.managedJars

object TestBuild extends Build {
  lazy val jarsToExtract = TaskKey[Seq[File]]("jars-to-extract", "JAR files to be extracted")
  lazy val extractJarsTarget = SettingKey[File]("extract-jars-target", "Target directory for extracted JAR files")
  lazy val extractJars = TaskKey[Unit]("extract-jars", "Extracts JAR files")

  lazy val testSettings = Defaults.defaultSettings ++ Seq(
    // define dependencies
    libraryDependencies ++= Seq(
      "com.newrelic" % "newrelic-api" % "2.2.1"
    ),
    // collect jar files to be extracted from managed jar dependencies
    jarsToExtract <<= (classpathTypes, update) map { (ct, up) =>
      managedJars(Compile, ct, up) map { _.data } filter { _.getName.startsWith("newrelic-api") }
    },
    // define the target directory
    extractJarsTarget <<= (baseDirectory)(_ / "extracted"),
    // task to extract jar files
    extractJars <<= (jarsToExtract, extractJarsTarget, streams) map { (jars, target, streams) =>
      jars foreach { jar =>
        streams.log.info("Extracting " + jar.getName + " to " + target)
        IO.unzip(jar, target)
      }
    },
    // make it run before compile
    compile in Compile <<= extractJars map { _ => sbt.inc.Analysis.Empty }
  )

  lazy val test: Project = Project("test", file(".")) settings (testSettings: _*)
}
If you simply have jar files to extract, you can add them as unmanaged dependencies, i.e. put them into the lib folder. See: https://github.com/harrah/xsbt/wiki/Getting-Started-Library-Dependencies
If you really have zip files (or want to extract the unmanaged dependencies), you can change the above code to list them:
// list jar files to be extracted
jarsToExtract <<= (baseDirectory) map { dir => Seq(dir / "lib" / "newrelic-api-2.2.1.zip") },
You should now be able to manually extract them from sbt and they should automatically be extracted before compile:
> clean
[success] Total time: 0 s, completed Oct 12, 2012 5:39:16 PM
> extract-jars
[info] Extracting newrelic-api-2.2.1.zip to /Users/balagez/Sites/test/extracted
[success] Total time: 0 s, completed Oct 12, 2012 5:39:22 PM
> compile
[info] Extracting newrelic-api-2.2.1.zip to /Users/balagez/Sites/test/extracted
[success] Total time: 0 s, completed Oct 12, 2012 5:39:24 PM
Now you can add a new task or extend the existing one which extracts the zip file from the extracted dependency. If you don't need the contents of the dependency, you can use the task-temporary-directory setting which gives you a temporary directory writable by sbt:
// define the target directory
extractJarsTarget <<= taskTemporaryDirectory,
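As for the first question (getting sbt to fetch the zip artifact at all), one way is to declare the artifact's type and extension explicitly on the dependency. This is a sketch using the coordinates from the question; adjust the artifact name if the published zip uses a different one:
libraryDependencies +=
  "eu.delving" % "sip-creator" % "0.4.6-SNAPSHOT" artifacts Artifact("sip-creator", "zip", "zip")
// if the zip should also show up on the managed classpath, allow the "zip" type:
// classpathTypes += "zip"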
