It seems sbt is ignoring my Java test in my Akka application, probably for a good reason I wouldn't know of, as I am not well versed in JUnit, which Akka TestKit relies on. Is my code missing some necessary ceremony?
import akka.testkit.javadsl.TestKit;
import org.scalatest.junit.*;
import VowpalWrapper.Actors.*;
import VowpalWrapper.DecisionPoint.*;
import java.util.*;

public class JavaInitializationTest extends JUnitSuite {
    public void test() {
        List<String> actions = Arrays.asList("action A", "action B", "action C");
        DecisionPointDef dp = new DecisionPointDef("some-decision-point", "0.0", actions);
    }
}
Initially, I am writing these tests just to validate my Java API for actor classes written (and tested elsewhere) in Scala.
My build.sbt is very straightforward, but I'll include it just in case:
scalaVersion := "2.12.4"

libraryDependencies ++= Seq(
  "org.scalactic" %% "scalactic" % "3.0.5",
  "org.scalatest" %% "scalatest" % "3.0.5" % "test",
  "org.apache.commons" % "commons-math3" % "3.6.1",
  "com.typesafe.akka" %% "akka-actor" % "2.5.1",
  "com.typesafe.akka" %% "akka-testkit" % "2.5.1" % Test
)
When I try to add import org.junit.Test for the @Test annotation, I am not sure which version of JUnit to include to match the Akka TestKit stack, or whether that's even a good idea.
Many thanks in advance!!
Adding junit to build.sbt solves it.
For my case:
"junit" % "junit" % "4.12" % Test
which hopefully matches the JUnit version that the Akka TestKit version in my build.sbt expects. Someone should really update https://doc.akka.io/docs/akka/2.5/testing.html, I guess.
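For reference, here is roughly what the test looks like once the missing ceremony is in place (a sketch based on the question's code; the @Test annotation from org.junit is what lets JUnitSuite pick the method up):

import akka.testkit.javadsl.TestKit;
import org.junit.Test;
import org.scalatest.junit.JUnitSuite;
import VowpalWrapper.Actors.*;
import VowpalWrapper.DecisionPoint.*;
import java.util.*;

public class JavaInitializationTest extends JUnitSuite {
    // Without @Test, the JUnit runner silently skips the method.
    @Test
    public void test() {
        List<String> actions = Arrays.asList("action A", "action B", "action C");
        DecisionPointDef dp = new DecisionPointDef("some-decision-point", "0.0", actions);
    }
}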
How can I add process parameters using the sbt-native-packager configuration? I want to add options that redirect the process stderr to a file, so the result looks like this:
sudo -u app bash -c "app >>/var/log/app/stderr.log 2>&1"
I use sbt-native-packager 1.2.0-M5 to build a deb package with JavaServerAppPackaging, JDebPackaging, SystemdPlugin, and UpstartPlugin. Exceptions show up only on stderr, not in the logs. Also, I must delete the app pid file manually after a crash; if it still exists, I get an error on stderr.
My plugins.sbt:
resolvers += Resolver.bintrayRepo("sbt", "sbt-plugin-releases")
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.8-netty-4.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.2.0-M5")
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.8.2")
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.1")
libraryDependencies += "org.vafer" % "jdeb" % "1.3" artifacts (Artifact("jdeb", "jar", "jar"))
My build.sbt:
...
debianPackageDependencies in Debian ++= Seq("postgresql-9.5 (>= 9.5.1)")
lazy val root = (project in file(".")).enablePlugins(PlayScala, JavaAgent)
scalaVersion := "2.11.8"
val akkaVersion = "2.4.10"
libraryDependencies ++= Seq(
  "org.postgresql" % "postgresql" % "9.4.1208",
  "org.scalikejdbc" %% "scalikejdbc" % "2.4.0",
  "org.scalikejdbc" %% "scalikejdbc-config" % "2.4.0",
  "org.scalikejdbc" %% "scalikejdbc-play-initializer" % "2.5.1",
  "org.flywaydb" %% "flyway-play" % "3.0.1",
  "com.typesafe.akka" %% "akka-contrib" % akkaVersion,
  "com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
  "io.dropwizard.metrics" % "metrics-core" % "3.1.2",
  "io.dropwizard.metrics" % "metrics-jvm" % "3.1.2",
  "org.coursera" % "dropwizard-metrics-datadog" % "1.1.4",
  "com.typesafe.akka" %% "akka-testkit" % akkaVersion % Test,
  "com.relayrides" % "pushy" % "0.8",
  "com.relayrides" % "pushy-dropwizard-metrics-listener" % "0.8",
  "org.eclipse.jetty.alpn" % "alpn-api" % "1.1.3.v20160715" % "runtime",
  ws,
  specs2 % Test
)
resolvers += "Typesafe Releases" at "http://repo.typesafe.com/typesafe/maven-releases/"
resolvers += Resolver.mavenLocal
routesGenerator := InjectedRoutesGenerator
javaOptions in Test ++= Seq("-Dlogger.resource=logback-test.xml")
scalacOptions in Universal ++= Seq("-unchecked", "-deprecation", "-notailcalls")
javaOptions in Universal ++= Seq(
  "-J-server",
  ...
)
...
import com.typesafe.sbt.packager.archetypes.systemloader._
// UpstartPlugin for ubuntu 14.04, SystemdPlugin for ubuntu 16.04
enablePlugins(JavaServerAppPackaging, JDebPackaging, SystemdPlugin, UpstartPlugin)
requiredStartFacilities := Some("datadog-agent.service, systemd-journald.service, postgresql.service")
javaAgents += "org.mortbay.jetty.alpn" % "jetty-alpn-agent" % "2.0.4" % "dist"
P.S. I found a workaround: on Ubuntu 16.04 I can use journald to collect all the logs in the system.
Thanks for updating the question with all relevant information. There are a couple of things here.
Only one Systemloader plugin
You enable both SystemdPlugin and UpstartPlugin. If this works, it only works by accident. No version of native-packager was designed to support multiple systemloaders for a single package type in a single build module.
The solution is to create sub-modules, each with the relevant systemloader enabled, as sketched below.
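A minimal sketch of that layout (module names are illustrative, not from your build):

// build.sbt -- one packaging module per target init system
lazy val app = (project in file("app"))
  .enablePlugins(JavaServerAppPackaging)

lazy val appSystemd = (project in file("app-systemd"))
  .enablePlugins(JavaServerAppPackaging, JDebPackaging, SystemdPlugin)
  .dependsOn(app)

lazy val appUpstart = (project in file("app-upstart"))
  .enablePlugins(JavaServerAppPackaging, JDebPackaging, UpstartPlugin)
  .dependsOn(app)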
Logging to stderr
You are right regarding systemd. It provides facilities to capture the log output of your process. If you like, you can add your findings to the native-packager documentation (there is a systemd plugin section).
The upstart support in native-packager is rather simple. There weren't a lot of requests for it, as Ubuntu is switching to systemd, and you can always fall back to systemv. Which brings me to the solution to your problem.
You can use the SystemVPlugin, which supports a daemon_log_file. The systemv documentation provides you with the necessary details.
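A minimal sketch, assuming daemonStdoutLogFile is the setting the systemv docs refer to (verify the exact key name for your native-packager version):

// build.sbt -- assumed setting name; check the systemv plugin docs
enablePlugins(JavaServerAppPackaging, JDebPackaging, SystemVPlugin)

// Redirect the daemon's stdout/stderr into /var/log/<package>/app.log
daemonStdoutLogFile := Some("app.log")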
cheers,
Muki
I'm using the one-jar plugin to generate a fat jar file. Here is what my Build.scala looks like:
import com.github.retronym.SbtOneJar
import sbt._
import Keys._

object build extends Build {
  def standardSettings = Seq(
    exportJars := true
  ) ++ Defaults.defaultSettings

  lazy val metricsProducer = Project("metricsProducer",
    file("beta"),
    settings = standardSettings ++ SbtOneJar.oneJarSettings
  )

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.6.0",
    "org.apache.kafka" %% "kafka" % "0.9.0.0",
    "joda-time" % "joda-time" % "2.7",
    "io.spray" %% "spray-json" % "1.3.2"
  )
}
When I tried to run this using:
sbt run one-jar
I got the following error:
unresolved dependency: org.scala-sbt.plugins#sbt-onejar;0.8: not found
I have the plugin dependency added in plugins.sbt. Any clues?
I'm not sure whether sbt-onejar is still supported. I managed to get this working using the sbt-assembly plugin instead:
https://github.com/sbt/sbt-assembly
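A minimal sketch of the sbt-assembly setup, in case it helps (the version is illustrative; pick one matching your sbt release):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt -- resolve duplicate files when merging dependency jars,
// then build the fat jar with `sbt assembly`
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}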
I'm using sbt-aspectj plugin with Play Framework 2.1.5.
When I hit refresh, none of the resources, including JavaScript files, are reloaded automatically; I need to restart the server to get the expected result.
It seems that I'm missing something in the build, but can't really find what it could be and hence the question.
Here's the plugins.sbt file:
// Used to weave Logger around controller methods
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.9.4")
Build.scala file:
import com.typesafe.sbt.SbtAspectj.AspectjKeys.inputs
import com.typesafe.sbt.SbtAspectj.{Aspectj, aspectjSettings, compiledClasses}
import play.Project._
import sbt.Keys._
import sbt._
object Build extends Build {
  val appName = "frontend"
  val appVersion = "1.0-SNAPSHOT"

  val frontEndAppDependencies = Seq(
    javaCore,
    "org.slf4j" % "slf4j-api" % "1.6.6",
    "be.objectify" %% "deadbolt-java" % "2.1-RC2",
    "com.typesafe.akka" %% "akka-quartz-scheduler" % "1.2.0-akka-2.1.x",
    "com.fasterxml.jackson.core" % "jackson-core" % "2.2.0",
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.2.0",
    "org.apache.directory.studio" % "org.apache.commons.io" % "2.4",
    "org.apache.poi" % "poi-ooxml" % "3.9"
  )

  val main = play.Project(appName, appVersion, frontEndAppDependencies).settings(
    resolvers += Resolver.mavenLocal,
    lessEntryPoints <<= baseDirectory(_ / "app" / "assets" / "stylesheets" ** "main.less"),
    coffeescriptOptions := Seq("bare")
  )
  // todo: activate aspectj before release to enable log filters; this configuration
  // is deactivated because of the resources auto-reloading bug
  .settings(aspectjSettings: _*).settings(
    inputs in Aspectj <+= compiledClasses,
    products in Compile <<= products in Aspectj,
    products in Runtime <<= products in Compile
  )
}
This is a pretty noob question.
I'm trying to learn about SparkSQL. I've been following the example described here:
http://spark.apache.org/docs/1.0.0/sql-programming-guide.html
Everything works fine in the Spark-shell, but when I try to use sbt to build a batch version, I get the following error message:
object sql is not a member of package org.apache.spark
Unfortunately, I'm rather new to sbt, so I don't know how to correct this problem. I suspect that I need to include additional dependencies, but I can't figure out how.
Here is the code I'm trying to compile:
/* TestApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

case class Record(k: Int, v: String)

object TestApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext._
    val data = sc.parallelize(1 to 100000)
    val records = data.map(i => new Record(i, "value = " + i))
    val table = createSchemaRDD(records, Record)
    println(">>> " + table.count)
  }
}
The error is flagged on the line where I try to create a SQLContext.
Here is the content of the sbt file:
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
Thanks for the help.
As is often the case, the act of asking the question helped me figure out the answer. The answer is to add the following line in the sbt file.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"
I also realized there is an additional problem in the little program above. There are too many arguments in the call to createSchemaRDD. That line should read as follows:
val table = createSchemaRDD(records)
Thanks! I ran into a similar problem while building a Scala app in Maven. Based on what you did with SBT, I added the corresponding Maven dependencies as follows and now I am able to compile and generate the jar file.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>1.2.1</version>
</dependency>
I ran into a similar issue. In my case, I had copy-pasted the sbt setup below from an online example, which had scalaVersion := "2.10.4", but my environment actually uses Scala 2.11.8.
After updating the version and running sbt package again, the issue was fixed.
name := "Test Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
I'm trying to create a plugin for sbt 0.12.1 that will generate java files from WSDL, compile them, and then publish the jar.
My project layout is like:
./build.sbt
./project/build.sbt
./project/WsdlBuild.scala
./src/main/wsdl/...many wsdl files...
I'm using Axis to generate the Java files, and build.sbt looks like:
name := "zxtm-api"
organization := "com.giltgroupe.zeus"
unmanagedBase <<= baseDirectory { base => base / "wsdl-lib" }
libraryDependencies ++= Seq(
  "axis" % "axis-wsdl4j" % "1.2.1",
  "commons-logging" % "commons-logging" % "1.0.4",
  "commons-discovery" % "commons-discovery" % "0.2",
  "log4j" % "log4j" % "1.2.8",
  "org.apache.axis" % "axis" % "1.4",
  "org.apache.axis" % "axis-ant" % "1.4",
  "org.apache.axis" % "axis-jaxrpc" % "1.4",
  "org.apache.axis" % "axis-saaj" % "1.4"
)
gilt.zxtm.WsdlBuild.wsdlSettings
(There was one jar we couldn't find in any Maven repo; it's in wsdl-lib.)
project/build.sbt is very similar:
libraryDependencies ++= Seq(
  "axis" % "axis-wsdl4j" % "1.2.1",
  "commons-logging" % "commons-logging" % "1.0.4",
  "commons-discovery" % "commons-discovery" % "0.2",
  "log4j" % "log4j" % "1.2.8",
  "org.apache.axis" % "axis" % "1.4",
  "org.apache.axis" % "axis-ant" % "1.4",
  "org.apache.axis" % "axis-jaxrpc" % "1.4",
  "org.apache.axis" % "axis-saaj" % "1.4"
)
unmanagedBase <<= baseDirectory { base => base / "wsdl-lib" }
So I wrote the code in WsdlBuild.scala to generate the Java files, and ended up with something like:
object WsdlBuild extends Plugin {
  lazy val wsdlSourceDir = SettingKey[File]("wsdl-source-dir")
  lazy val wsdlToJava = TaskKey[Unit]("wsdl-to-java")
  lazy val managedSrcDir = file("target/src_managed/wsdl")

  val wsdlSettings = inConfig(Compile)(Seq(
    compile <<= compile dependsOn wsdlToJava,
    javaSource := managedSrcDir,
    managedSourceDirectories := Seq(managedSrcDir)
  )) ++ Seq(
    wsdlToJava <<= (wsdlSourceDir, managedSourceDirectories in Compile, state) map {
      (wsdlDir, managedDirs, s) =>
        // by convention use the first one; not obvious why there is ever more than one
        createJavaFromWsdl(wsdlDir, managedDirs.head, s.log)
    },
    wsdlSourceDir := file("src/main/wsdl")
  )

  def createJavaFromWsdl(wsdlDir: File, outputDir: File, log: Logger): File = { ... }
}
So this sort of works. If I run compile, it generates the Java from the WSDL correctly. But if I run publish-local, it doesn't compile. So in order to publish or publish-local, I have to manually compile first.
Any suggestions?
Generating sources and resources is described in this howto of the sbt docs.
In your case, wsdlSettings might look like:
val wsdlSettings = inConfig(Compile)(Seq(
  sourceGenerators <+= wsdlToJava,
  wsdlSourceDir <<= baseDirectory(_ / "src/main/wsdl"),
  wsdlToJava <<= (wsdlSourceDir, sourceManaged, streams) map {
    (wsdlDir, managedDir, s) =>
      createJavaFromWsdl(wsdlDir, managedDir, s.log)
  }
))
Some changes unrelated to your question:
Get the logger from streams. This sends output to a task-specific logger so that you can retrieve it individually. See this howto for more information on this.
Always use absolute paths, often by basing a file on baseDirectory. See Use absolute paths.
The question is quite old, but the problem might still be relevant to someone.
In my case, I approached a very similar problem with an sh script that does all the dirty work of WSDL generation with wsimport (which comes with Java out of the box). A dedicated sbt subproject wraps the script as a task and executes it on compilation. Such a subproject can easily be inserted into any other, bigger sbt setup, where you just add a dependency on it.
Enough talking, here's a template on GitHub that demonstrates exactly that: https://github.com/sainnr/sbt-scala-wsdl-template. Hope it saves someone a good couple of hours messing around with WSDL and build tools. Feel free to fork or improve it if you find it helpful!
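If it helps, here is a minimal sketch of the same idea inlined in a build.sbt (sbt 1.x-style syntax; the wsimport flags are the standard ones, but treat paths and names as an outline rather than the template's actual code):

import scala.sys.process._

sourceGenerators in Compile += Def.task {
  val wsdlDir = baseDirectory.value / "src" / "main" / "wsdl"
  val outDir  = sourceManaged.value / "wsdl"
  outDir.mkdirs()
  // wsimport ships with the JDK (up to Java 8); -s sets the generated-source dir
  (wsdlDir ** "*.wsdl").get.foreach { wsdl =>
    Seq("wsimport", "-Xnocompile", "-s", outDir.getAbsolutePath, wsdl.getAbsolutePath).!
  }
  (outDir ** "*.java").get
}.taskValue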