AspectJ weaving an external jar provided by sbt throws "can't determine superclass of missing type"

I'm trying to weave the testing library ScalaTest (https://mvnrepository.com/artifact/org.scalatest/scalatest_2.12/3.2.0-SNAP10). Here is the relevant part of my build.sbt:
enablePlugins(SbtAspectj)
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
aspectjInputs in Aspectj ++= update.value.matching(moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*"))
aspectjInputs in Aspectj ++= update.value.matching(moduleFilter(organization = "org.scalatest", name = "scalatest*"))
fullClasspath in Runtime := aspectjUseInstrumentedClasses(Runtime).value ++ aspectjUseInstrumentedClasses(Test).value
Looking at the Maven page, it lists several other optional dependencies, such as org.jmock.
The problem is that sbt only downloads scalatest.jar and not jmock.jar (along with the other optional dependencies). Printing out aspectjInputs does indeed show scalatest.jar, but not jmock.jar.
Presumably because of that, it gives me the following errors:
[error] error at (no source information available)
[error] /Users/jonas/.ivy2/cache/org.scalatest/scalatest_2.12/bundles/scalatest_2.12-3.0.5.jar:0::0 can't determine superclass of missing type org.jmock.Expectations
[error] when weaving type org.scalatest.jmock.JMockExpectations
[error] when weaving classes
[error] when weaving
[error] when batch building BuildConfig[null] #Files=1 AopXmls=#0
[error] [Xlint:cantFindType]
I'm assuming that I somehow need the .jar files of ScalaTest's optional dependencies, but as they are not downloaded by sbt, I'm lost on how to solve this.
So, how can I resolve them or add them to the classpath when the weaving happens?
I'm using the sbt-aspectj plugin.

You can include the optional dependencies with a configuration mapping like this:
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test->compile,optional"

Related

Minimal sbt-web pipeline without Play

I'm using spray to create a single-page app, and cannot get sbt-web to process any of my inputs. I started with WebJars because https://github.com/sbt/sbt-web/blob/master/README.md says:
One last thing regarding the public and public-test folders... any WebJar depended on by the project will be automatically extracted into these folders e.g. target/web/public/lib/jquery/jquery.js.
However, I see no such "web" folder in the target folder. I thought maybe WebJars was too complicated an example to start with, so I instead added a jquery.js file to the root of the asset folder and set up sbt-uglify to do some processing on it. Yet I still see no evidence that SbtWeb is working. I've run sbt --debug clean run and grepped the output for anything from SbtWeb or Uglify, but there are no errors or warnings, and I can't find anything related to SbtWeb or Uglify. Just acknowledgement that it seems to "deduce" the plugins:
[debug] deducing auto plugins based on known facts
[debug] :: sorting:: found:
...
[debug] :: sorted deduced result: List(sbt.plugins.CorePlugin, com.typesafe.sbt.web.SbtWeb, com.typesafe.sbt.jse.SbtJsEngine, net.ground5hark.sbt.concat.SbtConcat, sbt.plugins.IvyPlugin, com.typesafe.sbt.jse.SbtJsTask, sbt.plugins.JvmPlugin, com.typesafe.sbt.uglify.SbtUglify, sbt.plugins.JUnitXmlReportPlugin)
Here is my directory structure:
./build.sbt
./project/plugins.sbt
./src/main/assets/js/jquery.js
./src/main/resources/html/uikit/login.html
./src/main/scala/Boot.scala
./src/main/scala/main.scala
Here is my project/plugins.sbt:
resolvers += Resolver.sonatypeRepo("releases")
addSbtPlugin("com.typesafe.sbt" % "sbt-web" % "1.0.2")
addSbtPlugin("net.ground5hark.sbt" % "sbt-concat" % "0.1.8")
addSbtPlugin("com.typesafe.sbt" % "sbt-uglify" % "1.0.3")
Here is my ./build.sbt:
organization := "com.test123.spray"
version := "0.1"
scalaVersion := "2.11.6"
scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")
libraryDependencies ++= {
  val akkaV = "2.3.9"
  val sprayV = "1.3.3"
  Seq(
    "io.spray" %% "spray-can" % sprayV,
    "io.spray" %% "spray-routing" % sprayV,
    "io.spray" %% "spray-testkit" % sprayV % "test",
    "com.typesafe.akka" %% "akka-actor" % akkaV,
    "com.typesafe.akka" %% "akka-testkit" % akkaV % "test",
    // client side dependencies
    "org.webjars" % "jquery" % "2.1.4",
    "org.webjars" % "uikit" % "2.24.2"
  )
}
lazy val root = (project.in(file("."))).enablePlugins(SbtWeb)
pipelineStages := Seq(uglify)
includeFilter in uglify := GlobFilter("js/*.js")
Here is what the root of my ./target folder looks like:
resolution-cache/
scala-2.11/
streams/
No ./target/web folder. Any ideas why?
References:
https://github.com/sbt/sbt-web/blob/master/README.md
http://mariussoutier.com/blog/2014/12/07/understanding-sbt-sbt-web-settings/
That'll learn me. I was using an old version of sbt-web. When I updated it to the latest, it works as expected.
The lesson is not to copy/paste snippets like this:
addSbtPlugin("com.typesafe.sbt" % "sbt-web" % "1.0.2")
from the web, but rather to find the latest version manually, by one of the following methods:
If the GitHub (et al) page has a "Build Passing" badge, click on it to navigate to the build server where the latest versions are listed.
Look at the branches in GitHub
See if you can navigate to the repository for the dependency, such as Maven Central, and browse there. I didn't have any luck with this one. The problem I had was that I knew it wasn't on Maven Central, and I didn't know where else to look.
Plug in some bogus version in SBT, and look at the output for where SBT tried to look and failed:
[warn] module not found: com.typesafe.sbt#sbt-web;3.1.2
[warn] ==== typesafe-ivy-releases: tried
[warn] http://repo.typesafe.com/typesafe/ivy-releases/com.typesafe.sbt/sbt-web/scala_2.10/sbt_0.13/3.1.2/ivys/ivy.xml
[warn] ==== sbt-plugin-releases: tried
[warn] http://repo.scala-sbt.org/scalasbt/sbt-plugin-releases/com.typesafe.sbt/sbt-web/scala_2.10/sbt_0.13/3.1.2/ivys/ivy.xml
I'm positive there are better ways to find the latest versions of things that I'm just unaware of. If you're more experienced than me, please comment with a better way.
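One plugin-based way to do this is sbt-updates, which adds a dependencyUpdates task that lists newer releases of your library dependencies (the version below is only illustrative; check the plugin's page for the current release):
addSbtPlugin("com.timushev.sbt" % "sbt-updates" % "0.6.3")
Then, from the sbt shell:
> dependencyUpdates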

sbt assembly command not found

I'm trying to run sbt assembly. According to https://github.com/sbt/sbt-assembly, for sbt 0.13.6+ (I'm on 0.13.7) this should be included automatically for anything with the JvmPlugin. According to sbt plugins, I have the JvmPlugin enabled in root. When I run sbt assembly I get "Not a valid command: assembly". I've tried the old methods of including sbt-assembly with all the different types of sbt configurations, but none seem to work. Here's what my build files look like (note that sbt package works fine).
assembly.sbt
addSbtPlugin("com.eed3si8n" % "sbt-assembly" % "0.13.0")
build.sbt
lazy val commonSettings = Seq(
  organization := "com.test",
  version := "1.0",
  scalaVersion := "2.10.4"
)
lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "test",
    resolvers ++= Seq(
      ...
    ),
    libraryDependencies ++= Seq(
      ...
    )
  )
Here is the error:
[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: assembly
[error] assembly
[error]
Any ideas? Running on Linux. Thanks
Did you create an assembly.sbt at the root of your project, alongside your build.sbt?
If so, then that's the problem. You want to have it inside the project/ directory.
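In other words, the expected layout looks roughly like this (assembly.sbt is just a conventionally named .sbt file; its contents could equally live in project/plugins.sbt):
./build.sbt
./project/build.properties
./project/assembly.sbt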
Having done that, it worked out of the box as expected with the rest of your setup:
> assembly
[info] Including: scala-library.jar
[info] Checking every *.class/*.jar file's SHA-1.
[info] Merging files...
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard'
[warn] Strategy 'discard' was applied to a file
[info] SHA-1: 1ae0d7a9c433e439e81ce947659bf05aa62a2892
[info] Packaging /Users/dnw/Desktop/t-2015-04-08.2340/target/scala-2.10/test-assembly-1.0.jar ...
[info] Done packaging.
[success] Total time: 2 s, completed 08-Apr-2015 23:45:59
Since the introduction of auto plugins in 0.13.5, adding explicit .sbt files for plugins (except for specific cases where the plugin does not implement the auto-plugin trait) is not recommended per the sbt documentation.
Add addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0") back to plugins.sbt under the project directory and remove assembly.sbt. If you still see the error, explicitly enable the plugin in build.sbt:
lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    name := "test"
  ).
  enablePlugins(AssemblyPlugin)
Or, if you also want the explicit assembly settings in there:
lazy val root = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    assemblySettings ++ Seq(
      jarName in assembly := "roobricks-spark.jar",
      test in assembly := {}
    ): _*
  ).
  enablePlugins(AssemblyPlugin)
Can you try it once with this?
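Either way, you can verify that the plugin is actually loaded by running the plugins command in the sbt shell and checking that sbtassembly.AssemblyPlugin is listed as enabled for root:
> plugins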
Same thing happened to me. Move assembly.sbt from the root to inside your project/ directory.
Came across the same error. The reason was that I was executing it from the wrong place: inside the target folder.
You should normally have a plugins.sbt file in the project directory, alongside your build.properties, where you should have the following:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")
From sparkour:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.4") in project/assembly.sbt does work.

Missing classes from the assembly file created by sbt assembly

I have a project that uses sbt 0.13.6 with the assembly 0.12.0 plugin to create the fatJar. My build.sbt is:
name := "test"
version := "0.0.1"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  ("org.apache.kafka" % "kafka_2.10" % "0.8.0" % "provided").
    exclude("javax.jms", "jms").
    exclude("com.sun.jdmk", "jmxtools").
    exclude("com.sun.jmx", "jmxri").
    exclude("org.slf4j", "slf4j-simple")
)
When I run sbt assembly I get a file called target/scala-2.10/test-assembly-0.0.1.jar, but it is missing some Kafka classes, including the one that I need at runtime:
> diff <(jar -tf /home/rief/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.0.jar) <(jar -tf target/scala-2.10/test-assembly-0.0.1.jar) | grep "^<"
...
kafka/consumer/ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$closeFetchersForQueues$1.class
...
Is this correct behaviour? How can I include Kafka in my fat jar?
That's the intended behavior. Anything marked % "provided" is skipped, since the intent is that those classes will be provided at runtime by the container (Apache Spark, Kafka, etc.).
If you want everything in the jar, here's what you can do:
fullClasspath in assembly := (fullClasspath in Compile).value
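Alternatively, if you always want Kafka bundled by this particular build, a simpler sketch is to drop the "provided" qualifier from the dependency shown in the question, so sbt-assembly treats it as an ordinary compile dependency:
libraryDependencies ++= Seq(
  ("org.apache.kafka" % "kafka_2.10" % "0.8.0"). // no "provided" here
    exclude("javax.jms", "jms").
    exclude("com.sun.jdmk", "jmxtools").
    exclude("com.sun.jmx", "jmxri").
    exclude("org.slf4j", "slf4j-simple")
)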

Excluding dependency transitively for assembly only? [duplicate]

Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
  configs(FunnyRuntime).
  settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
    libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
  )): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope will ultimately be helpful, but I'm stymied on how to actually cause the default/global run/test tasks to use the custom libraryDependencies and hopefully override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels essentially a generalization of the web case, where the servlet-api is in "provided" scope, and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.
For a similar case I used the following in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
@rob's solution seems to be the only one working on the latest sbt version; just add to your settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
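For newer sbt versions that prefer the slash syntax, the same two settings can be written like this (a direct translation of the lines above):
Compile / run := Defaults.runTask(Compile / fullClasspath, Compile / run / mainClass, Compile / run / runner).evaluated,
Compile / runMain := Defaults.runMainTask(Compile / fullClasspath, Compile / run / runner).evaluated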
Adding to @douglaz's answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.
Another option is to create separate sbt projects for assembly vs run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run for running directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project in order to e.g. specify the spark master in code when running with sbt.
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion
val commonSettings = Seq(
  name := "Project",
  libraryDependencies ++= Seq(...) // Common deps
)
// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
  .settings(
    commonSettings,
    assembly / mainClass := Some("com.example.Main"),
    libraryDependencies += sparkDep % "provided"
  )
// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
  .settings(
    // Projects' target dirs can't overlap
    target := target.value.toPath.resolveSibling("target-runtest").toFile,
    commonSettings,
    // If separate main file needed, e.g. for specifying spark master in code
    Compile / run / mainClass := Some("com.example.RunMain"),
    libraryDependencies += sparkDep
  )
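With that layout, the two workflows described above look like this in the sbt shell (using the project IDs defined in the snippet):
> assemblyProj/assembly
> runTestProj/run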
If you use sbt-revolver plugin, here is a solution for its "reStart" task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
Update: for sbt 1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value

Unresolved dependency: net.sourceforge.htmlunit in SBT

My build.sbt has the following content:
name := "hello-world"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "net.sourceforge.htmlunit" %% "htmlunit" % "2.13"
When I perform update in sbt console it says:
[error] (*:update) sbt.ResolveException: unresolved dependency: net.sourceforge.htmlunit#htmlunit_2.10;2.13: not found
What should I do to make sbt find this library?
Just use a single % instead of the double %% in the dependency:
libraryDependencies += "net.sourceforge.htmlunit" % "htmlunit" % "2.13"
%% is only required when the path of the jar contains the Scala version, which is not the case for this dependency. I figured it out by consulting mvnrepository - http://mvnrepository.com/artifact/net.sourceforge.htmlunit/htmlunit/2.13. Just hover over the 'Download (JAR)' link and you can see the full path.
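To make the difference concrete (the expansion below follows from scalaVersion := "2.10.3" in the question and from the error message):
"net.sourceforge.htmlunit" %% "htmlunit" % "2.13" // resolves net.sourceforge.htmlunit#htmlunit_2.10;2.13, which does not exist
"net.sourceforge.htmlunit" %  "htmlunit" % "2.13" // resolves net.sourceforge.htmlunit#htmlunit;2.13, which is on Maven Central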
Note: By default sbt uses the standard Maven2 repository. In case you have dependencies that cannot be found in the default repo, you need to add custom resolvers like this:
resolvers += "custom_repo" at "url"
For this particular example, resolvers are not required since htmlunit is present in the default repo.
