How does sbt filter dependencies

I'm trying to add appengine-maven-plugin to my plugin as a compile-time dependency.
If I run
show update
in the sbt console the dependency is there, but if I run
show managedClasspath
it is not there anymore and I cannot figure out why.
Here is a minimal build.sbt file which shows this behaviour.
sbtPlugin := true
name := "test1"
organization := "com.cornim"
version := "0.1"
libraryDependencies += "com.google.appengine" % "appengine-maven-plugin" % "1.9.51"
Any ideas would be much appreciated.
P.S.: I'm on sbt v0.13.15
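For what it's worth, the filtering happens in managedClasspath rather than in update: managedClasspath only keeps resolved artifacts whose type is listed in classpathTypes (by default jar-like types), and appengine-maven-plugin is published with the artifact type maven-plugin, so it gets dropped. A minimal sketch of the workaround, assuming that is indeed the cause here:
// build.sbt (sketch): let maven-plugin artifacts onto the managed classpath
classpathTypes += "maven-plugin"
After a reload, show managedClasspath should list the jar again.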

Related

sbt aspectj with native packager

I'm attempting to use the sbt-aspectj plugin with the sbt native packager and am running into an issue where the associated -javaagent path to the AspectJ load-time weaver jar references an Ivy cache location rather than something packaged.
That is, after running sbt stage, executing the staged application via bash -x target/universal/stage/bin/myapp/ results in this javaagent:
exec java -javaagent:/home/myuser/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.10.jar -cp /home/myuser/myproject/target/universal/stage/lib/org.aspectj.aspectjweaver-1.8.10.jar:/home/myuser/myproject/target/universal/stage/lib/otherlibs.jar myorg.MyMainApp args
My target platform is Heroku, where the artifacts are built before being effectively 'pushed' out to individual 'dynos' (very analogous to a Docker setup). The issue here is that the resulting -javaagent path was valid on the machine on which the 'staged' deployable was built, but will not exist where it's ultimately run.
How can one configure the sbt-aspectj plugin to reference a packaged lib rather than one from the ivy cache?
Current configuration:
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.10.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.5")
build.sbt (selected parts):
import com.typesafe.sbt.SbtAspectj._
lazy val root = (project in file(".")).settings(
  aspectjSettings,
  javaOptions in Runtime ++= { AspectjKeys.weaverOptions in Aspectj }.value,
  // see: https://github.com/sbt/sbt-native-packager/issues/598#issuecomment-111584866
  javaOptions in Universal ++= { AspectjKeys.weaverOptions in Aspectj }.value
    .map { "-J" + _ },
  fork in run := true
)
Update
I've tried several approaches including pulling the relevant output for javaOptions from existing mappings, but the result is a cyclical dependency error thrown by sbt.
I have something that technically solves my problem but feels unsatisfactory. As of now, I'm including an aspectjweaver dependency directly and using the sbt-native-packager concept of bashScriptExtraDefines to append an appropriate javaagent:
updated build.sbt:
import com.typesafe.sbt.SbtAspectj._
lazy val root = (project in file(".")).settings(
  aspectjSettings,
  bashScriptExtraDefines += scriptClasspath.value
    .filter(_.contains("aspectjweaver"))
    .headOption
    .map("addJava -javaagent:${lib_dir}/" + _)
    .getOrElse(""),
  fork in run := true
)
You can add the following settings in your sbt config:
.settings(
  retrieveManaged := true,
  libraryDependencies += "org.aspectj" % "aspectjweaver" % aspectJWeaverV)
The AspectJ weaver JAR will be copied to ./lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-[aspectJWeaverV].jar in your project root.
I actually solved this by using the sbt-javaagent plugin to add agents to the runtime.
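For reference, a minimal sketch of that sbt-javaagent approach (the plugin version and weaver version here are assumptions, not taken from the post above):
// project/plugins.sbt (sketch)
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.6")
// build.sbt (sketch): JavaAgent cooperates with JavaAppPackaging from sbt-native-packager
lazy val root = (project in file("."))
  .enablePlugins(JavaAppPackaging, JavaAgent)
  .settings(
    // the agent jar is resolved, packaged, and wired into the start script as -javaagent
    javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10"
  )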

How to successfully import a CrossProject sbt build into eclipse using sbt-eclipse

I've used sbt-eclipse in the past to successfully import a simple sbt project into Eclipse. I'm now trying to leverage the CrossProject mechanism of sbt to use the Scala.js environment (it creates 2 subprojects in sbt, one for JavaScript and one for JVM code). The recommendation (see the SBT docs link here) is to add the setting 'EclipseKeys.useProjectId := true' in the build.sbt file to support importing (now) 2 projects into one Eclipse project. I then give the 'eclipse' command in a running SBT session to create my Eclipse project, then launch Eclipse and attempt to import this new project. When I do this, the import dialog wizard in Eclipse does show me two sub-projects, but when I try to finish the import, Eclipse complains that the project already exists, and I get two strange-looking links in my Eclipse project that seem to do nothing.
What is the correct procedure for getting a CrossProject sbt build into Eclipse?
OK, so it seems Eclipse did not like that I had only one 'name' for the project, which was in the shared settings area of the build.sbt. I had this:
lazy val sp = crossProject.in(file(".")).
  settings(
    version := "0.1",
    name := "SJSTut",
    scalaVersion := "2.11.7"
  ).
  jvmSettings(
    // Add JVM-specific settings here
    libraryDependencies ++= Seq(...)
  ).
  jsSettings(
    // Add JS-specific settings here
    libraryDependencies ++= Seq(...)
  )
and what I should have done was this:
lazy val sp = crossProject.in(file(".")).
  settings(
    version := "0.1",
    scalaVersion := "2.11.7"
  ).
  jvmSettings(
    // Add JVM-specific settings here
    name := "SJSTutJVM",
    libraryDependencies ++= Seq(...)
  ).
  jsSettings(
    // Add JS-specific settings here
    name := "SJSTutJS",
    libraryDependencies ++= Seq(...)
  )
Note that the 'name' assignment is removed from the shared settings and instead placed into both the jvmSettings and jsSettings areas, with uniquely different names.
Now I'm able to pull this into Eclipse (as 2 separate projects). If anyone else has a better setup, I'd love to hear about it.
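For completeness, the plugin wiring this setup assumes looks roughly like the following (the sbteclipse version is an assumption):
// project/plugins.sbt (sketch)
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
// build.sbt (sketch): the setting recommended by the sbt docs for cross builds
EclipseKeys.useProjectId := true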

Missing classes from the assembly file created by sbt assembly

I have a project that uses sbt 0.13.6 with the sbt-assembly 0.12.0 plugin to create the fat jar. My build.sbt is:
name := "test"
version := "0.0.1"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  ("org.apache.kafka" % "kafka_2.10" % "0.8.0" % "provided").
    exclude("javax.jms", "jms").
    exclude("com.sun.jdmk", "jmxtools").
    exclude("com.sun.jmx", "jmxri").
    exclude("org.slf4j", "slf4j-simple")
)
When I run sbt assembly I get a file called target/scala-2.10/test-assembly-0.0.1.jar, but it is missing some Kafka classes, including the one that I need at runtime:
> diff <(jar -tf /home/rief/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.0.jar) <(jar -tf target/scala-2.10/test-assembly-0.0.1.jar) | grep "^<"
...
kafka/consumer/ZookeeperConsumerConnector$ZKRebalancerListener$$anonfun$kafka$consumer$ZookeeperConsumerConnector$ZKRebalancerListener$$closeFetchersForQueues$1.class
...
Is this the correct behaviour? How can I include Kafka in my fat jar?
That's the intended behavior. % "provided" is skipped, since its intention is that those classes will be provided by the container, such as Apache Spark, Kafka, etc.
If you want everything in it here's what you can do:
fullClasspath in assembly := (fullClasspath in Compile).value
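Alternatively, if the jar really does need to ship Kafka (i.e. no container provides it at runtime), a sketch of simply dropping the provided qualifier from the original dependency:
libraryDependencies ++= Seq(
  ("org.apache.kafka" % "kafka_2.10" % "0.8.0").
    exclude("javax.jms", "jms").
    exclude("com.sun.jdmk", "jmxtools").
    exclude("com.sun.jmx", "jmxri").
    exclude("org.slf4j", "slf4j-simple")
)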

Why does Play build give "No trace dependencies for Activator" after adding plugins?

I started with the latest Typesafe Activator download using the play-scala template.
Activator 1.2.10
Akka 2.3.4
Play 2.3.4
Scala 2.11.1
Then I modified the build.sbt file to add play2-reactivemongo with
"org.reactivemongo" % "play2-reactivemongo_2.11" % "0.10.5.akka23-SNAPSHOT"
, but it failed with No trace dependencies for Activator.
I removed play2-reactivemongo and tried play-silhouette and received the same error.
"com.mohiva" % "play-silhouette_2.11" % "1.1-SNAPSHOT"
The app builds fine when neither plugin is added.
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.1"
resolvers += "Sonatype Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
libraryDependencies ++= Seq(
  // "com.mohiva" % "play-silhouette_2.11" % "1.1-SNAPSHOT",
  // "org.reactivemongo" % "play2-reactivemongo_2.11" % "0.10.5.akka23-SNAPSHOT",
  jdbc,
  anorm,
  cache,
  ws
)
The output from Play Framework tells me nothing beyond that one line as best I can tell. Maybe there is better information leading to a solution, but I have not been able to find it. Any ideas?
CORRECTION: the same error message now appears even when I disable the resolver line, the play-silhouette line, and the reactivemongo line. Yet it once compiled successfully.
The error is from the sbt-echo plugin here:
https://github.com/typesafehub/sbt-echo/blob/3f431a9748a45fcb328efe4d5f989a99b5c8f7f2/akka/src/main/scala/com/typesafe/sbt/echo/EchoRun.scala#L95
I improved this error message just the other day, incidentally, but you don't have it yet:
https://github.com/typesafehub/sbt-echo/blob/master/akka/src/main/scala/com/typesafe/sbt/echo/EchoRun.scala#L118
Activator's UI mode (activator ui) adds the sbt-echo plugin in order to power the Inspect tab. If you are not currently using UI mode, you can fix this by removing the plugin again (delete the .sbt file for it in project/).
If you are using UI mode, the fix is to use Akka and Play versions that sbt-echo understands. This may mean downgrading to 2.3.3 for now; we are a little behind on updating the tracing.
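If you go the downgrade route, the Play version is set in project/plugins.sbt; a sketch, assuming the standard Play sbt plugin coordinates:
// project/plugins.sbt (sketch): pin Play to a version sbt-echo understands
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.3")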

Unresolved dependency: net.sourceforge.htmlunit in SBT

My build.sbt has the following content:
name := "hello-world"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "net.sourceforge.htmlunit" %% "htmlunit" % "2.13"
When I run update in the sbt console it says:
[error] (*:update) sbt.ResolveException: unresolved dependency: net.sourceforge.htmlunit#htmlunit_2.10;2.13: not found
What should I do to make sbt find this library?
Just use a single % instead of double %% in the dependency.
libraryDependencies += "net.sourceforge.htmlunit" % "htmlunit" % "2.13"
%% is only required when the path of the jar contains the Scala version, which is not the case for this dependency. I figured it out by consulting mvnrepository: http://mvnrepository.com/artifact/net.sourceforge.htmlunit/htmlunit/2.13. Just hover over the 'Download (JAR)' link and you can see the full path.
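To make the %% convention concrete, here is a sketch using the kafka_2.10 artifact from the sbt-assembly question above purely as an illustration; on a 2.10.x build these two lines resolve to the same artifact:
// %% appends the Scala binary version to the artifact name
libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.0"
// which, for scalaVersion 2.10.x, is equivalent to
libraryDependencies += "org.apache.kafka" % "kafka_2.10" % "0.8.0"
htmlunit is a plain Java library, so its artifact name carries no Scala suffix and a single % is the right choice.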
Note: By default sbt uses the standard Maven2 repository. In case you have dependent jars that cannot be found in the default repo, you need to add custom resolvers like this:
resolvers += "custom_repo" at "url"
For this particular example resolvers are not required, since htmlunit is present in the default repo.
