Exclude plugin's libraryDependencies from released application when "package" in SBT - sbt

In my SBT project I use the sbt-scoverage plugin. I did what the documentation says and added ScoverageSbtPlugin.instrumentSettings to build.sbt. Everything works great so far.
When I package my app I can see in pom.xml that there is a dependency that should not be there:
<dependency>
  <groupId>com.sksamuel.scoverage</groupId>
  <artifactId>scalac-scoverage-plugin</artifactId>
  <version>0.95.4</version>
</dependency>
This is a library dependency of the sbt-scoverage plugin that I don't want to have as a dependency in my released app.
I believe that this dependency is created by the following code in ScoverageSbtPlugin.scala:
libraryDependencies += "com.sksamuel.scoverage" %% "scalac-scoverage-plugin" %
ScalacScoveragePluginVersion % scoverage.name
Can anyone tell me how to make this dependency get added only when I run sbt scoverage:test?

The way that scoverage seems to be written forces this dependency to be added to libraryDependencies in Compile, as you've noticed. However, one workaround is to use the makePomConfiguration setting in sbt: you can perform transformations on the constructed POM to remove the added dependency without affecting how scoverage works. Below I've written a build that will filter the scoverage dependency out of your POM. I've used a .scala file, as you can't define objects in build.sbt pre 0.13, so this file would be located at project/Build.scala.
import sbt.Keys._
import sbt._
import scala.xml.{Elem, Node}
import scala.xml.transform.{RuleTransformer, RewriteRule}

object theBuild extends Build {

  object FilterBadDependency extends RewriteRule {
    override def transform(n: Node): Seq[Node] = n match {
      /**
       * When we find the dependencies node we want to rewrite it, removing any of
       * the scoverage dependencies.
       */
      case dependencies @ Elem(_, "dependencies", _, _, _*) =>
        <dependencies>
          {
            dependencies.child filter { dep =>
              (dep \ "groupId").text != "com.sksamuel.scoverage"
            }
          }
        </dependencies>
      /**
       * Otherwise we just skip over the node and do nothing
       */
      case other => other
    }
  }

  object TransformFilterBadDependencies extends RuleTransformer(FilterBadDependency)

  val project = Project(
    id = "test-build",
    base = file(".")
  ).settings(
    ScoverageSbtPlugin.instrumentSettings: _*
  ).settings(
    /**
     * Here we alter our makePom configuration so that our transformation is
     * applied to the constructed POM.
     */
    makePomConfiguration ~= { config =>
      config.copy(process = TransformFilterBadDependencies)
    })
}
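With this in place, running sbt makePom (or publish) should produce a POM under the target directory that no longer contains the scalac-scoverage-plugin dependency, while scoverage:test keeps working as before.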

I came to the following solution. I have replaced this:
ivyConfigurations ++= Seq(scoverage, scoverageTest)
with this:
ivyConfigurations ++= Seq(scoverage hide, scoverageTest hide)
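(As I understand it, hide marks these Ivy configurations as non-public, so dependencies scoped to them should no longer show up in the published POM.)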
Here's the changeset:
https://github.com/scoverage/sbt-scoverage/commit/6d7ebe07482933f588e9feb23f80eeed2aa14f62
I would appreciate anybody's view on that. It works "on my machine".

Related

In SBT, Is there a way of just downloading the top-level dependencies?

I have an SBT project which pulls in dependencies. I only want to pull in the direct dependencies - not any transitive dependencies. I'd like to find the filename of the dependency that's pulled in, so that I can copy it somewhere.
e.g. given a build.sbt file with the following contents:
libraryDependencies += "org.eclipse.jetty" % "jetty-server" % "9.4.28.v20200408"
I would like to know where the jetty-server jar is on the file system.
I have tried adding the following to my build.sbt file:
lazy val mytaskKey: TaskKey[Unit] = TaskKey[Unit]("mytask")

def mytask: Def.Setting[Task[Unit]] = mytaskKey := {
  val updateReport = update.value
  updateReport.allFiles foreach { f =>
    println(f)
  }
}

mytask
When I run this, I get a full list of dependencies:
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-library.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/eclipse/jetty/jetty-server/9.4.28.v20200408/jetty-server-9.4.28.v20200408.jar
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-compiler.jar
/Users/dylan/.sbt/boot/scala-2.12.10/lib/scala-reflect.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/scala-lang/modules/scala-xml_2.12/1.0.6/scala-xml_2.12-1.0.6.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/jline/jline/2.14.6/jline-2.14.6.jar
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/fusesource/jansi/jansi/1.12/jansi-1.12.jar
I don't want that full list - I just want the jetty jar. i.e.
/Users/dylan/.coursier/cache/v1/https/repo1.maven.org/maven2/org/eclipse/jetty/jetty-server/9.4.28.v20200408/jetty-server-9.4.28.v20200408.jar
How might I get this list?
Yes, there is: use either the intransitive() or notTransitive() modifier on the dependency. It's documented here.
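For example, here is a minimal sketch of how that could look with the task from the question (the dependency and the mytask key are the ones shown above; the only change is marking the dependency intransitive()):
libraryDependencies += ("org.eclipse.jetty" % "jetty-server" % "9.4.28.v20200408").intransitive()

lazy val mytaskKey: TaskKey[Unit] = TaskKey[Unit]("mytask")

mytaskKey := {
  // With the dependency marked intransitive, jetty-server's transitive
  // dependencies are not resolved, so allFiles should only list the
  // jetty-server jar plus the Scala jars sbt itself needs.
  update.value.allFiles foreach { f => println(f) }
}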

Additional resource generator with sbt native packager

I have a submodule that is compiled by invoking an external command. I would like to include the generated files in the jar, so I wrote a task:
npmBuildTask := {
  import sys.process.stringSeqToProcess
  Seq("my", "command") !
}

unmanagedResourceDirectories in Compile += baseDirectory.value / "dist"

cleanFiles <+= baseDirectory { base => base / "dist" }

Keys.`package` <<= (Keys.`package` in Compile) dependsOn npmBuildTask.toTask
and when I invoke the mySubmodule/package task it works well. But when I invoke the stage task from sbt-native-packager, my task is ignored (it is not executed).
There are a couple of options to solve this issue. I assume you want to add the dist folder to your resulting application jar.
Your configuration doesn't work because stage doesn't depend on package. This results in npmBuildTask not being called.
1. Add dependency to stage
The easiest way to fix this is by simply adding the npmBuildTask as a dependency to stage
stage <<= stage dependsOn npmBuildTask.toTask
I wouldn't recommend this approach.
2. Resource generators
sbt's resource generators are defined for exactly this purpose. An inline version could look like this:
resourceGenerators in Compile += Def.task {
  streams.value.log.info("running npm generator")
  val base = (resourceManaged in Compile).value / "dist"
  // A resource generator returns a Seq[File]. This is just an example
  List("index.js", "test.js").map { file =>
    IO.writeLines(base / file, List("var x = 1"))
    base / file
  }
}.taskValue
Or you could extract this into an AutoPlugin to separate the "what" from the "how".
3. AutoPlugin and resource generators
Create project/NpmPlugin.scala and add the following content
import sbt._
import sbt.Keys._
import sbt.plugins.JvmPlugin

object NpmPlugin extends AutoPlugin {

  override val requires = JvmPlugin
  override val trigger = AllRequirements

  object autoImport {
    val npmBuildTask = TaskKey[Seq[File]]("npm-build-task", "Runs npm and builds the application")
  }
  import autoImport._

  override def projectSettings: Seq[Setting[_]] = Seq(
    // define a custom target directory for npm
    target in npmBuildTask := target.value / "npm",
    // the actual build task
    npmBuildTask := {
      val npmSource = (target in npmBuildTask).value
      val npmTarget = (resourceManaged in Compile).value / "dist"
      // run npm here, which generates the necessary values
      streams.value.log.info("running npm generator")
      // move generated sources to target folder
      IO.copyDirectory(npmSource, npmTarget)
      // recursively get all files in the npmTarget
      (npmTarget ***).get
    },
    resourceGenerators in Compile += npmBuildTask.taskValue
  )
}
The build.sbt will then look like this
name := "resource-gen-test"
version := "1.0"
enablePlugins(JavaAppPackaging)
Pretty clean :)
4. Use mappings
Last but not least, you could use mappings. They are the low-level detail that drives a lot of the package generation in sbt. The main idea of this solution is to:
Create a task that returns a mapping definition (Seq[(File, String)])
Append this to the appropriate mappings
The advantage of this approach is that you are more flexible where you want to put your mappings.
import sbt._
import sbt.Keys._
import sbt.plugins.JvmPlugin
import com.typesafe.sbt.SbtNativePackager.Universal
import com.typesafe.sbt.SbtNativePackager.autoImport.NativePackagerHelper._

object NpmMappingsPlugin extends AutoPlugin {

  override val requires = JvmPlugin
  override val trigger = AllRequirements

  object autoImport {
    val npmBuildTask = TaskKey[Seq[(File, String)]]("npm-build-task", "Runs npm and builds the application")
  }
  import autoImport._

  override def projectSettings: Seq[Setting[_]] = Seq(
    // define a custom target directory for npm
    target in npmBuildTask := target.value / "npm" / "dist",
    // the actual build task
    npmBuildTask := {
      val npmTarget = (target in npmBuildTask).value
      // run npm here, which generates the necessary values
      streams.value.log.info("running npm generator")
      // recursively get all files in the npmTarget
      // contentOf(npmTarget) would skip the top-level directory
      directory(npmTarget)
    },
    // add npm resources to the generated jar
    mappings in (Compile, packageBin) ++= npmBuildTask.value,
    // add npm resources to resulting package
    mappings in Universal ++= npmBuildTask.value
  )
}
As you can see, with this approach we can easily add the resulting files to different mappings.
However, I only recommend this approach if you need this kind of flexibility, as it requires a bit more knowledge of native-packager.

Adding sbt native packager plugin in SBT

I have a very organized build file that is composed of the following scala files:
Build.scala - the main Build file
Dependencies.scala - where I define the dependencies and the versions
BuildSettings.scala - where I define the build settings
plugins.sbt
A snippet of the Build.scala is as below:
import sbt._
import Keys._

object MyBuild extends Build {

  import Dependencies._
  import BuildSettings._
  import NativePackagerHelper._

  // Configure prompt to show current project
  override lazy val settings = super.settings :+ {
    shellPrompt := { s => Project.extract(s).currentProject.id + " > " }
  }

  // Define our project, with basic project information and library dependencies
  lazy val project = Project("my-project", file("."))
    .settings(buildSettings: _*)
    .settings(
      libraryDependencies ++= Seq(
        Libraries.scalaAsync
        // Add your additional libraries here (comma-separated)...
      )
    )
    .enablePlugins(JavaAppPackaging, DockerPlugin)
}
All 4 files that I mentioned above are in the same directory, which is inside the project directory. But when I run this build file, I get the following error:
not found: value NativePackagerHelper
Any clues why this is?
I figured out what the problem was. I had to use the following in my project/build.properties:
sbt.version=0.13.11
I originally had 0.13.6 and it was causing the import statements to fail!
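For reference, sbt-native-packager itself also has to be declared in project/plugins.sbt; a minimal sketch (the version shown here is only an example, pick whichever release matches your sbt version):
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.4")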

Include a simple val in sbt build files from global.sbt

I wish to set my version numbers externally across several build.sbt files through a single include file.
Within build.sbt I can do this
val base = "1.1"
version := base + ".8-SNAPSHOT"
This works fine as a first step.
According to the online help, I should be able to create a file global.sbt in my ~/.sbt/0.13 folder.
I created the file global.sbt with the single line
val base = "1.1"
and removed the corresponding line from build.sbt
But when I start up my sbt I get "error: not found: value base"
So either it's not finding the global sbt or this form of global setting doesn't work.
Any suggestions as to how I can resolve this?
Can I make an explicit include command in my build.sbt files?
It seems from your test that vals in global ~/.sbt/0.13/*.sbt files don't propagate to local *.sbt files.
Here's a setup that works:
~/.sbt/0.13/plugins/VersionBasePlugin.scala
import sbt._, Keys._

object VersionBasePlugin extends AutoPlugin {

  override def requires = plugins.CorePlugin
  override def trigger = allRequirements

  object autoImport {
    val versionBase = settingKey[String]("version base")
  }
  import autoImport._

  override def projectSettings = Seq(versionBase := "1.1")
}
and then in your build.sbt:
version := (versionBase.value + ".8-SNAPSHOT")
Does that work for you?

Trouble building a simple SparkSQL application

This is a pretty noob question.
I'm trying to learn about SparkSQL. I've been following the example described here:
http://spark.apache.org/docs/1.0.0/sql-programming-guide.html
Everything works fine in the Spark-shell, but when I try to use sbt to build a batch version, I get the following error message:
object sql is not a member of package org.apache.spark
Unfortunately, I'm rather new to sbt, so I don't know how to correct this problem. I suspect that I need to include additional dependencies, but I can't figure out how.
Here is the code I'm trying to compile:
/* TestApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

case class Record(k: Int, v: String)

object TestApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext._

    val data = sc.parallelize(1 to 100000)
    val records = data.map(i => new Record(i, "value = " + i))
    val table = createSchemaRDD(records, Record)
    println(">>> " + table.count)
  }
}
The error is flagged on the line where I try to create a SQLContext.
Here is the content of the sbt file:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
Thanks for the help.
As is often the case, the act of asking the question helped me figure out the answer. The answer is to add the following line to the sbt file:
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"
I also realized there is an additional problem in the little program above. There are too many arguments in the call to createSchemaRDD. That line should read as follows:
val table = createSchemaRDD(records)
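Putting both pieces together, the whole sbt file from the question then looks like this (same versions as above):
name := "Test Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"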
Thanks! I ran into a similar problem while building a Scala app in Maven. Based on what you did with SBT, I added the corresponding Maven dependency as follows, and now I am able to compile and generate the jar file:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>1.2.1</version>
</dependency>
I got a similar issue. In my case I had copy-pasted the sbt setup below from an online example with scalaVersion := "2.10.4", but my environment actually has Scala 2.11.8.
After updating scalaVersion to match and running sbt package again, the issue was fixed. The original setup was:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
