%% doesn't work but % does in library dependencies - sbt

I noticed that if I use %% then the library I want doesn't get downloaded, but using only % works. Why?
"org.mockito" %% "mockito-core" % "2.18.3" % "test" doesn't work; I get the error sbt.ResolveException: unresolved dependency: org.mockito#mockito-core_2.12;2.18.3: not found,
but "org.mockito" % "mockito-core" % "2.18.3" % "test" works.

That's because Mockito is a Java library. Use %% only when depending on a Scala library: %% is a shortcut that tells sbt to append your project's Scala binary version to the artifact name, so with Scala 2.12 it looks for mockito-core_2.12, which does not exist.
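In other words, the two forms below ask the resolver for different artifacts; the first is what produced the error above, the second matches what a Java library like Mockito actually publishes:

// %% appends the Scala binary version: resolves org.mockito#mockito-core_2.12
libraryDependencies += "org.mockito" %% "mockito-core" % "2.18.3" % "test"

// % uses the artifact name verbatim: resolves org.mockito#mockito-core
libraryDependencies += "org.mockito" % "mockito-core" % "2.18.3" % "test"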

Related

Migration to sbt 1: how to run Scalastyle at compilation time

I had Scalastyle running at compilation time in my project. Since I updated from sbt 0.13 to sbt 1.0.1 I can't get it to work again.
I followed the documentation from here and added this to my build.sbt:
lazy val compileScalaStyle: TaskKey[Unit] = taskKey[Unit]("scalastyle")
compileScalastyle := scalastyle.in(Compile).toTask("").value,
(compile in Compile) := ((compile in Compile) dependsOn compileScalastyle).value,
But I get this error:
not found: value scalastyle
Do I need an import? If yes, I didn't manage to find it.
You should not need a special import. There is a typo: you declare compileScalaStyle (capital S) but the settings reference compileScalastyle. Try
lazy val compileScalastyle = taskKey[Unit]("compileScalastyle")
instead of
lazy val compileScalaStyle: TaskKey[Unit] = taskKey[Unit]("scalastyle")
Here is a working example project using Scalastyle 1.0.0 with SBT 1.0.4.
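For completeness, the corrected fragment from the question then reads (assuming the scalastyle-sbt-plugin is enabled, which provides the scalastyle task):

lazy val compileScalastyle = taskKey[Unit]("compileScalastyle")

compileScalastyle := scalastyle.in(Compile).toTask("").value

(compile in Compile) := ((compile in Compile) dependsOn compileScalastyle).value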

sbt-proguard issue with Java 1.8

I'm trying to get a smaller Scala executable jar file with sbt-proguard.
I added these two lines to project/plugin.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-proguard" % "0.2.2")
The first one is for building an uberjar, and sbt assembly works fine.
Then I executed sbt proguard:proguard and got this error message:
[error] Error: Can't read [/Library/Java/JavaVirtualMachines/jdk1.8.0_25.jdk/Contents/Home/jre/lib/rt.jar] (Can't process class [apple/applescript/AppleScriptEngine.class] (Unsupported class version number [52.0] (maximum 51.0, Java 1.7)))
java.lang.RuntimeException: Proguard failed with exit code [1]
at scala.sys.package$.error(package.scala:27)
...
at java.lang.Thread.run(Thread.java:745)
[error] (proguard:proguard) Proguard failed with exit code [1]
Following the hint from this post: ProGuard says Unsupported class version number [52.0] (maximum 51.0, Java 1.7) with sbt-proguard,
I switched to Java 1.7 and then Java 1.6 (export JAVA_HOME=$(/usr/libexec/java_home -v '1.6*')) and ran proguard again to get the slimmed-down jar file, but the result doesn't run:
Invalid or corrupt jarfile target/scala-2.11/proguard/myproject_2.11-1.0.jar
What might be wrong? These are the lines I added to build.sbt:
proguardSettings
ProguardKeys.options in Proguard ++= Seq("-dontnote", "-dontwarn", "-ignorewarnings")
ProguardKeys.options in Proguard += ProguardOptions.keepMain("core.HelloWorld")
I believe this is documented in the ProGuard docs.
Running your application with java -classpath <jarpath> <main-class> <program-arguments> should work.
This happens because ProGuard by default strips the manifest files from the jar, so the Java runtime cannot find the jar's class entries. Another way would be to keep the MANIFEST.MF file and run the jar with the java -jar option, but I have never tried that.
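For example, with the jar produced above and the main class kept via keepMain, that would be:
java -classpath target/scala-2.11/proguard/myproject_2.11-1.0.jar core.HelloWorld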
Define a recent ProGuard version that supports Java 1.8:
ProguardKeys.proguardVersion in Proguard := "5.3.3"
Also, a couple of useful options if you run out of memory:
javaOptions in (Proguard, ProguardKeys.proguard) := Seq("-Xmx2G")
javaOptions in (Proguard, ProguardKeys.proguard) += "-Xss1G"
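Taken together, the ProGuard-related part of build.sbt would then look like this (sbt 0.13 syntax, as in the question):

proguardSettings

// Use a ProGuard release that understands Java 8 class files (version 52.0)
ProguardKeys.proguardVersion in Proguard := "5.3.3"

ProguardKeys.options in Proguard ++= Seq("-dontnote", "-dontwarn", "-ignorewarnings")
ProguardKeys.options in Proguard += ProguardOptions.keepMain("core.HelloWorld")

// Give the forked ProGuard JVM more headroom if it runs out of memory
javaOptions in (Proguard, ProguardKeys.proguard) := Seq("-Xmx2G")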

Spark: A signature in package.class refers to type compileTimeOnly

When trying to build the MLlib example with Spark 1.2.1 using SBT I get a whole bunch of strange compilation errors. The same code builds fine with Spark 1.1.0. For Spark 1.2.1 I use the following SBT build file:
name := "Test"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "1.2.1" % "provided"
As a result I get the following set of strange errors:
[info] Compiling 1 Scala source to /home/test/target/scala-2.10/classes...
[error] bad symbolic reference. A signature in package.class refers to type compileTimeOnly
[error] in package scala.annotation which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling package.class.
[error] /home/test/src/main/scala/Test.scala:16: Reference to method augmentString in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
[error] /home/test/src/main/scala/Test.scala:16: Reference to method augmentString in object Predef should not have survived past type checking,
[error] it should have been processed and eliminated during expansion of an enclosing macro.
[error] val parsedData = data.map(s => Vectors.dense(s.split(' ').map(_.toDouble))).cache()
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 21 s, completed 26.02.2015 17:47:29
How can I fix this? It would be great if somebody could post a working SBT build file for Spark 1.2.1 + MLlib code. Thanks!
Try changing the libraryDependencies line to the following:
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"
You are using Scala 2.10.4 but trying to pull in the Spark library built for Scala 2.11.x; %% automatically selects the artifact matching your project's Scala binary version.
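For reference, the question's complete build file with that one-line change applied:

name := "Test"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"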
I was using IntelliJ to compile Spark 1.6.0 code and faced the same error: [ERROR] error: bad symbolic reference. A signature in package.class refers to type compileTimeOnly.
I solved this by adding the Scala language dependencies to the project. Maybe Maven can't pick up the Scala configuration from IntelliJ, so the Scala dependencies have to be specified explicitly:
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-reflect</artifactId>
  <version>2.10.6</version>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.6</version>
</dependency>
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-compiler</artifactId>
  <version>2.10.6</version>
</dependency>

Setting up Visualization Toolkit (VTK) as project dependency in sbt?

I'm on Ubuntu 13.10.
I'd like to run Visualization Toolkit (VTK) in my sbt project.
I've installed libvtk via the Synaptic package manager, and I can run my class with the VTK example without problems. But when I try to run it from sbt via the run command, I get an UnsatisfiedLinkError.
I have tried several things: adding vtk.jar to the project's lib folder, and several sbt dependencies for VTK from Artenum.
I'd like to run everything in sbt, because I rely on it for testing and other tasks; compiling only the class with the VTK dependency outside sbt is not an option for me.
I also tried setting paths, but plenty of other dependencies install fine without touching OS-level paths.
An alternative would be a comparable graphics toolkit that works well as a plain Java/Scala dependency.
This is my build.sbt:
name := "selfoo"
version := "1.0"
scalaVersion := "2.10.3"
scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature")
resolvers ++= Seq(
  "Sonatype OSS" at "https://oss.sonatype.org/content/repositories/releases",
  "Typesafe" at "http://repo.typesafe.com/typesafe/releases",
  "artenum" at "http://maven.artenum.com/content/groups/public"
)
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.2.1",
  "com.typesafe.akka" %% "akka-remote" % "2.2.1",
  "com.typesafe.slick" %% "slick" % "1.0.1",
  "net.liftweb" %% "lift-json" % "2.5.1",
  // "com.h2database" %% "h2" % "1.3.166",
  // "com.googlecode.lanterna" % "lanterna" % "2.1.6",
  "commons-net" % "commons-net" % "3.3",
  "jline" % "jline" % "2.11",
  "org.apache.mina" % "mina-core" % "2.0.4",
  "org.apache.ftpserver" % "ftplet-api" % "1.0.6",
  "org.apache.ftpserver" % "ftpserver-core" % "1.0.6",
  "org.slf4j" % "slf4j-api" % "1.6.4",
  "org.scala-lang" % "scala-swing" % "2.10+",
  "vtk" % "vtk" % "5.8.0",
  "org.slf4j" % "slf4j-simple" % "1.6.4")
And this is the stack trace I get:
[info] Compiling 1 Scala source to /root/IdeaProjects/selfoo/target/scala-2.10/classes...
[info] Running ScalaCone
[error] (run-main) java.lang.UnsatisfiedLinkError: vtk.vtkPoints.VTKInit()J
java.lang.UnsatisfiedLinkError: vtk.vtkPoints.VTKInit()J
at vtk.vtkPoints.VTKInit(Native Method)
at vtk.vtkObject.<init>(vtkObject.java:97)
at vtk.vtkPoints.<init>(vtkPoints.java:166)
at ScalaCone$.<init>(ScalaCone.scala:12)
at ScalaCone$.<clinit>(ScalaCone.scala)
at ScalaCone.main(ScalaCone.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run 'last compile:run' for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run 'last compile:run' for the full output.
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 3 s, completed 21.03.2014 13:13:02
I try to run the example ScalaCone from http://ij-plugins.sourceforge.net/vtk-examples/ScalaCone.html
As of today, VTK is not listed on Maven Central. Given its machine-dependent parts I doubt it ever will be, since shipping compiled object code in a jar file somewhat violates the idea of platform-independent code (yes, I am aware of things like SWT).
That said, the simplest solution would be to include the VTK Java interface directly in your project as a third-party element.
If you look into the wrapping/java folder of the source distribution, you will find some *.in files containing macros that cmake expands. You would have to adapt these to use your system-installed VTK library. The rest is plain Java, and you should be able to reuse it.
I am aware that this is not the best solution, but it should keep you fairly platform independent (you only have to load the shared library). On the other hand, if platform independence is not your concern, I humbly suggest dropping Java in the first place ;).
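If you stay with the system-installed library, a minimal sbt-side sketch is to fork the run and point java.library.path at the directory holding the native VTK libraries (the /usr/lib/jni path is an assumption for the Ubuntu package; adjust it to wherever the libvtk JNI libraries actually live):

// Fork `run` into a separate JVM so the options below take effect
fork in run := true

// Assumed location of the VTK JNI shared libraries installed via Synaptic
javaOptions in run += "-Djava.library.path=/usr/lib/jni"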

error: eof expected?! How to use idea and eclipse plugins together in sbt?

I use sbt 0.13.
Both https://github.com/typesafehub/sbteclipse and https://github.com/typesafehub/sbt-idea suggest adding a line for each to ~/.sbt/plugins/build.sbt.
Thus my plugins/build.sbt looks like:
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.5.1")
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.3.0")
With that, sbt keeps failing with the error:
.sbt/0.13/plugins/build.sbt:2: error: eof expected but ';' found.
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.3.0")
^
[error] Error parsing expression. Ensure that settings are separated by blank lines.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? zsh: exit 130 sbt
Interestingly, both lines work separately.
Is it possible to use both plugins?
According to How build.sbt defines settings, you need to put an actual blank line between the two Scala expressions:

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.5.1")

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.3.0")
Note that sbteclipse 2.3.0 requires sbt 0.13.0, while sbt-idea currently targets sbt 0.12.x.
