I'm using a compiler plugin I wrote that depends on the Kryo serialization library. When attempting to use my plugin, I set this up in build.sbt (top-level) like this:
lazy val dependencies =
  new {
    val munit = "org.scalameta" %% "munit" % "0.7.12" % Test
    val kryo = "com.esotericsoftware" % "kryo" % "5.0.0-RC9"
  }

lazy val commonDependencies = Seq(
  dependencies.kryo,
  dependencies.munit
)

lazy val root = (project in file("."))
  .settings(
    libraryDependencies ++= commonDependencies,
    Test / parallelExecution := false
  )

addCompilerPlugin("co.blocke" %% "dotty-reflection" % reflectionLibVersion) // reflectionLibVersion is defined elsewhere
But when I compile my target project, I get a java.lang.NoClassDefFoundError saying it can't find Kryo. I've added Kryo to my dependencies, but since this dependency is for the compiler, not my app, it isn't being picked up.
How can I properly tell sbt about a dependency my plugin needs?
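One workaround I'm considering (a sketch only, assuming sbt-assembly is available in the plugin's own build): since scalac is handed just the plugin jar via -Xplugin: and won't resolve its transitive dependencies, the plugin could be published with Kryo bundled inside it:

// In the plugin's own build.sbt (sketch; assumes the sbt-assembly plugin is installed):
// publish the assembly jar as the main artifact so Kryo ships inside the plugin jar
Compile / packageBin := assembly.value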
I can build a fat jar with sbt assembly without an issue using the build.sbt file below. However, when I try to publish this fat jar, sbt publish dumps only 1 kB .jar files into the S3 bucket.
Unzipping the .jar file shows that it only contains a manifest file.
How do I get the fat jar into my repo?
Update: this question has been altered since it was initially posed. I removed the name override, and it now publishes the compiled code, but without the external libraries.
Below is my build.sbt file:
name := "util_myutil"
version := "1.0.1"
scalaVersion := "2.10.4"
scalacOptions += "-target:jvm-1.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.5.0-cdh5.5.2" % "provided"
unmanagedJars in Compile += file(".lib/my.jar")
unmanagedJars in Compile += file(".lib/some_other.jar")
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyJarName in assembly := s"${name.value}-${version.value}.jar"
ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
resolvers ++= Seq(
  "Cloudera repos" at "https://repository.cloudera.com/artifactory/cloudera-repos",
  "Cloudera releases" at "https://repository.cloudera.com/artifactory/libs-release",
  "Era7 maven releases" at "https://s3-eu-west-1.amazonaws.com/releases.era7.com"
)
s3sse := true
s3region := com.amazonaws.services.s3.model.Region.US_Standard
s3acl := com.amazonaws.services.s3.model.CannedAccessControlList.Private
s3overwrite := true
publishMavenStyle := true
publishTo := {
  val suffix = if (isSnapshot.value) "snapshots" else "releases"
  Some(s3resolver.value(s"IT Insights Artifacts $suffix", s3("my-mvn-repo." + suffix)))
}
From https://github.com/sbt/sbt-assembly, add this to your build.sbt:
artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.copy(`classifier` = Some("assembly"))
}

addArtifact(artifact in (Compile, assembly), assembly)
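With that in place, sbt publish uploads the assembly jar (under the "assembly" classifier) alongside the regular one. If only the fat jar should be published, the thin jar can also be switched off with a standard sbt setting (a sketch, not from the README):

// don't publish the plain (thin) jar; only the assembly artifact remains
publishArtifact in (Compile, packageBin) := false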
How can I modify the output of the final packaged zip to move the contents of the "lib" directory up one level? Basically, I output a zip whose contents look like this:
ZIP FILE CONTENT:
-- my-plugin-1.0.jar
-- /lib
-- /lib/mydependency1.jar
-- /lib/mydependency2.jar
ZIP FILE CONTENT I WISH TO HAVE:
-- my-plugin-1.0.jar
-- mydependency1.jar
-- mydependency2.jar
I want to move everything in "lib" up one level to the root output.
sbt version 0.13.0
Here is my build.sbt:
import NativePackagerHelper._
organization := "com.company.product"
name := "my-plugin"
version := "1.0"
enablePlugins(UniversalPlugin)
packageName in Universal := "deployment"
publishArtifact in (Compile, packageDoc) := false
artifactName := { (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
  artifact.name + "-" + module.revision + "." + artifact.extension
}
javacOptions ++= Seq("-source", "1.8")
mappings in Universal <+= packageBin in Compile map { jar => jar -> (jar.getName()) }
topLevelDirectory := None
plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
command line:
sbt universal:packageBin
Looks like your requirement is a first-class citizen in sbt-native-packager:
mappings in Universal ++= contentOf("src/main/resources/cache")
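If the goal is specifically to lift everything under lib/ to the zip root, another option (a rough sketch of my own, untested, using the same sbt 0.13 keys as above) is to rewrite the Universal mappings:

mappings in Universal := {
  // strip the lib/ prefix so the dependency jars land at the root of the zip
  (mappings in Universal).value.map {
    case (file, path) if path.startsWith("lib/") => file -> path.stripPrefix("lib/")
    case mapping => mapping
  }
}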
This is a pretty noob question.
I'm trying to learn about SparkSQL. I've been following the example described here:
http://spark.apache.org/docs/1.0.0/sql-programming-guide.html
Everything works fine in the Spark-shell, but when I try to use sbt to build a batch version, I get the following error message:
object sql is not a member of package org.apache.spark
Unfortunately, I'm rather new to sbt, so I don't know how to correct this problem. I suspect that I need to include additional dependencies, but I can't figure out how.
Here is the code I'm trying to compile:
/* TestApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

case class Record(k: Int, v: String)

object TestApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext._
    val data = sc.parallelize(1 to 100000)
    val records = data.map(i => new Record(i, "value = " + i))
    val table = createSchemaRDD(records, Record)
    println(">>> " + table.count)
  }
}
The error is flagged on the line where I try to create a SQLContext.
Here is the content of the sbt file:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
Thanks for the help.
As is often the case, the act of asking the question helped me figure out the answer. The answer is to add the following line in the sbt file.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"
I also realized there is an additional problem in the little program above. There are too many arguments in the call to createSchemaRDD. That line should read as follows:
val table = createSchemaRDD(records)
Thanks! I ran into a similar problem while building a Scala app in Maven. Based on what you did with SBT, I added the corresponding Maven dependencies as follows and now I am able to compile and generate the jar file.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.2.1</version>
</dependency>
I got a similar issue. In my case, I had copy-pasted the sbt setup below from online with scalaVersion := "2.10.4", but my environment actually has Scala 2.11.8.
After updating the version and running sbt package again, the issue was fixed.
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
Has anyone published an sbt-native-packager produced artifact (a tgz in my case) to a Nexus repo using sbt-aether-deploy? (I need this for timestamped snapshots, specifically the "correct" version tag in Nexus' artifact-resolution REST resource.)
I can do one or the other, but I can't figure out how to add the packagedArtifacts in Universal to the artifacts that sbt-aether-deploy deploys, so that both get published.
I suspect the path to pursue is either to addArtifact() the packagedArtifacts in Universal, or to create another AetherArtifact and then override/replace the deploy task to use that AetherArtifact.
Any help much appreciated.
I am the author of the sbt-aether-deploy plugin, and I just came across this post.
import aether.AetherKeys._

crossPaths := false // needed if you want to remove the Scala version from the artifact name

enablePlugins(JavaAppPackaging)

aetherArtifact := {
  val artifact = aetherArtifact.value
  artifact.attach((packageBin in Universal).value, "dist", "zip")
}
This will also publish the other main artifact.
If you want to disable publishing of the main artifact, then you will need to rewrite the artifact coordinates. Maven requires a main artifact.
I have added a way to replace the main artifact for this purpose, but I can now see that way is kind of flawed. It will still assume that the artifact is published as a jar file. The main artifact type is locked down to that, since the POM packaging is set to jar by default by SBT.
If this is an app, then that limitation is probably OK, since Maven will never resolve that into an artifact.
The "proper" way in Maven terms is to add a classifier to the artifact and change the "packaging" in the POM file to "pom". We will see if I get around to changing that particular part.
OK, I think I got it, amazingly enough. If there's a better way to do it, I'd love to hear it. Not loving that blind Option.get there...
val tgzCoordinates = SettingKey[MavenCoordinates]("the maven coordinates for the tgz")

lazy val myPackagerSettings = packageArchetype.java_application ++ deploymentSettings ++ Seq(
  publish <<= publish.dependsOn(publish in Universal),
  publishLocal <<= publishLocal.dependsOn(publishLocal in Universal)
)

lazy val defaultSettings = buildSettings ++ Publish.settings ++ Seq(
  scalacOptions in Compile ++= Seq("-encoding", "UTF-8", "-target:jvm-1.7", "-deprecation", "-feature", "-unchecked", "-Xlog-reflective-calls"),
  testOptions in Test += Tests.Argument("-oDF")
)

lazy val myAetherSettings = aetherSettings ++ aetherPublishBothSettings

lazy val toastyphoenixProject = Project(
  id = "toastyphoenix",
  base = file("."),
  settings = defaultSettings ++ myPackagerSettings ++ myAetherSettings ++ Seq(
    name in Universal := name.value + "_" + scalaBinaryVersion.value,
    packagedArtifacts in Universal ~= { _.filterNot { case (artifact, file) => artifact.`type`.contains("zip") } },
    libraryDependencies ++= Dependencies.phoenix,
    tgzCoordinates := MavenCoordinates(organization.value + ":" + (name in Universal).value + ":tgz:" + version.value).get,
    aetherArtifact <<= (tgzCoordinates, packageZipTarball in Universal, makePom in Compile, packagedArtifacts in Universal) map {
      (coords: MavenCoordinates, mainArtifact: File, pom: File, artifacts: Map[Artifact, File]) =>
        createArtifact(artifacts, pom, coords, mainArtifact)
    }
  )
)
I took Peter's solution and reworked it slightly, avoiding the naked Option.get by creating the MavenCoordinates directly:
import aether.MavenCoordinates
import aether.Aether.createArtifact
name := "mrb-test"
organization := "me.mbarton"
version := "1.0"
crossPaths := false
packageArchetype.java_application
publish <<= (publish) dependsOn (publish in Universal)
publishLocal <<= (publishLocal) dependsOn (publishLocal in Universal)
aetherPublishBothSettings
aetherArtifact <<= (organization, name in Universal, version, packageBin in Universal, makePom in Compile, packagedArtifacts in Universal) map {
  (organization, name, version, binary, pom, artifacts) =>
    val nameWithoutVersion = name.replace(s"-$version", "")
    createArtifact(artifacts, pom, MavenCoordinates(organization, nameWithoutVersion, version, None, "zip"), binary)
}
The nameWithoutVersion replace works around SBT native packager including the version in the artifact name:
Before: me/mbarton/mrb-test-1.0/1.0/mrb-test-1.0.zip
After: me/mbarton/mrb-test/1.0/mrb-test-1.0.zip
crossPaths := false avoids the Scala version suffix on the artifact name.
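For what it's worth, the <<= operator was deprecated in later sbt releases; the same wiring can be written with := and .value (a sketch, untested, reusing the exact calls from above):

publish := publish.dependsOn(publish in Universal).value
publishLocal := publishLocal.dependsOn(publishLocal in Universal).value

aetherArtifact := {
  // same workaround as above: strip the version that native packager bakes into the name
  val nameWithoutVersion = (name in Universal).value.replace(s"-${version.value}", "")
  createArtifact(
    (packagedArtifacts in Universal).value,
    (makePom in Compile).value,
    MavenCoordinates(organization.value, nameWithoutVersion, version.value, None, "zip"),
    (packageBin in Universal).value
  )
}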