Is there a way to use Kamon logging only in running the code, not in tests?
I'm using kamon-log-reporter and kamon-scala on Scala 2.12. My code under test uses the Kamon API, so I want kamon-core on both the Runtime and Test classpaths. However, I'm not interested in console logging in tests, and the sbt-aspectj-runner plugin doesn't seem to launch AspectJ for sbt test (a separate issue).
My setup:
/project/plugins.sbt
resolvers += Resolver.bintrayIvyRepo("kamon-io", "sbt-plugins")
addSbtPlugin("io.kamon" % "sbt-aspectj-runner" % "1.0.1")
build.sbt
val kamonVer = "0.6.5"
val kamon = "io.kamon" %% "kamon-core" % kamonVer
val kamonLogging = "io.kamon" %% "kamon-log-reporter" % kamonVer
val kamonAspectJ = "io.kamon" %% "kamon-scala" % kamonVer
libraryDependencies ++= Seq(
kamon, akkaHttp, typesafeConfig, akkaHttpTestkit, scalaTest)
libraryDependencies ++= Seq(kamonLogging, kamonAspectJ)
I've tried the following, but it makes the logging disappear in sbt run as well:
libraryDependencies in Runtime ++= Seq(kamonLogging, kamonAspectJ)
I would recommend trying the sbt-javaagent plugin:
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.2")
In https://github.com/Workday/prometheus-akka/blob/master/build.sbt, I use this to enable aspectjweaver in tests.
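For example, you can scope the weaver agent to tests only. A minimal sketch, assuming an AspectJ weaver version compatible with Kamon 0.6.x (the 1.8.10 below is an assumption):
enablePlugins(JavaAgent)

// Scoping the agent to "test" makes sbt test fork with -javaagent:aspectjweaver,
// so weaving happens in sbt test even without the aspectj-runner plugin.
javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10" % "test"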
Related
I'm trying to weave the testing library scalatest (https://mvnrepository.com/artifact/org.scalatest/scalatest_2.12/3.2.0-SNAP10). This library dependency is in my build.sbt:
enablePlugins(SbtAspectj)
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.5"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test"
aspectjInputs in Aspectj ++= update.value.matching(moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*"))
aspectjInputs in Aspectj ++= update.value.matching(moduleFilter(organization = "org.scalatest", name = "scalatest*"))
fullClasspath in Runtime := aspectjUseInstrumentedClasses(Runtime).value ++ aspectjUseInstrumentedClasses(Test).value
Looking at the maven website it lists several other optional dependencies such as org.jmock, etc.
The problem is that sbt only downloads scalatest.jar and not jmock.jar (along with the other optional dependencies). Printing out aspectjInputs indeed shows scalatest.jar, but not jmock.jar.
Presumably for that reason, it gives me the following errors:
[error] error at (no source information available)
[error] /Users/jonas/.ivy2/cache/org.scalatest/scalatest_2.12/bundles/scalatest_2.12-3.0.5.jar:0::0 can't determine superclass of missing type org.jmock.Expectations
[error] when weaving type org.scalatest.jmock.JMockExpectations
[error] when weaving classes
[error] when weaving
[error] when batch building BuildConfig[null] #Files=1 AopXmls=#0
[error] [Xlint:cantFindType]
I'm assuming I somehow need the .jar files of ScalaTest's optional dependencies, but as sbt does not download them, I'm lost on how to solve this.
So, how can I resolve them or add them to the classpath when the weaving happens?
I'm using the sbt-aspectj plugin.
You can include the optional dependencies with a configuration mapping like this:
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % "test->compile,optional"
I'm attempting to use the sbt-aspectj plugin with sbt-native-packager and am running into an issue where the -javaagent path to the AspectJ load-time weaver jar references an ivy cache location rather than something packaged.
That is, after running sbt stage, executing the staged application via bash -x target/universal/stage/bin/myapp results in this javaagent:
exec java -javaagent:/home/myuser/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.10.jar -cp /home/myuser/myproject/target/universal/stage/lib/org.aspectj.aspectjweaver-1.8.10.jar:/home/myuser/myproject/target/universal/stage/lib/otherlibs.jar myorg.MyMainApp args
My target platform is Heroku, where the artifacts are built before being effectively 'pushed' out to individual 'dynos' (very analogous to a Docker setup). The issue here is that the resulting -javaagent path was valid on the machine where the 'staged' deployable was built, but will not exist where it's ultimately run.
How can one configure the sbt-aspectj plugin to reference a packaged lib rather than one from the ivy cache?
Current configuration:
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.10.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.5")
build.sbt (selected parts):
import com.typesafe.sbt.SbtAspectj._
lazy val root = (project in file(".")).settings(
aspectjSettings,
javaOptions in Runtime ++= (AspectjKeys.weaverOptions in Aspectj).value,
// see: https://github.com/sbt/sbt-native-packager/issues/598#issuecomment-111584866
javaOptions in Universal ++= (AspectjKeys.weaverOptions in Aspectj).value
.map("-J" + _),
fork in run := true
)
Update
I've tried several approaches, including pulling the relevant output for javaOptions from existing mappings, but the result is a cyclic dependency error thrown by sbt.
I have something that technically solves my problem but feels unsatisfactory. For now, I'm including an aspectjweaver dependency directly and using the sbt-native-packager concept of bashScriptExtraDefines to append an appropriate -javaagent flag:
updated build.sbt:
import com.typesafe.sbt.SbtAspectj._
lazy val root = (project in file(".")).settings(
aspectjSettings,
bashScriptExtraDefines += scriptClasspath.value
.find(_.contains("aspectjweaver"))
.map("addJava -javaagent:${lib_dir}/" + _)
.getOrElse(""),
fork in run := true
)
You can add the following settings to your sbt config:
.settings(
retrieveManaged := true,
libraryDependencies += "org.aspectj" % "aspectjweaver" % aspectJWeaverV)
The AspectJ weaver JAR will be copied to ./lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-[aspectJWeaverV].jar in your project root.
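You can then point the -javaagent flag at that project-relative copy instead of the ivy cache. A sketch for a forked local run (the path simply mirrors the lib_managed layout above and may vary across sbt versions; adapt it for your start script):
fork in run := true

// Hypothetical wiring: reference the retrieved weaver jar from the project root.
javaOptions in run += s"-javaagent:${baseDirectory.value}/lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-$aspectJWeaverV.jar"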
I actually solved this by using the sbt-javaagent plugin to add the agent at runtime.
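For reference, a minimal sketch of that approach (the plugin version and weaver coordinates are assumptions). sbt-javaagent integrates with sbt-native-packager, so the generated start script references the packaged agent jar rather than the ivy cache:
// project/plugins.sbt
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.4")

// build.sbt -- the "dist" scope packages the agent and adds the
// -javaagent flag to the native-packager start script.
enablePlugins(JavaAppPackaging, JavaAgent)
javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10" % "dist"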
I'm new to SBT, and I'm trying to convert a Gradle protobuf/grpc configuration to SBT.
I wonder if anyone in the Scala community has done this before me?
I've tried this plugin https://github.com/sbt/sbt-protobuf, but it does not provide any configuration to enable grpc compilation...
Any help appreciated.
You can use ScalaPB to generate the gRPC stubs for Scala. First, add the plugin to your project/plugins.sbt:
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.1")
libraryDependencies += "com.trueaccord.scalapb" %% "compilerplugin" % "0.5.43"
Then, add this to your build.sbt:
libraryDependencies ++= Seq(
"io.grpc" % "grpc-netty" % "1.0.1",
"io.grpc" % "grpc-stub" % "1.0.1",
"io.grpc" % "grpc-auth" % "1.0.1",
"com.trueaccord.scalapb" %% "scalapb-runtime-grpc" % "0.5.43",
"io.netty" % "netty-tcnative-boringssl-static" % "1.1.33.Fork19", // SSL support
"javassist" % "javassist" % "3.12.1.GA" // Improves Netty performance
)
PB.targets in Compile := Seq(
scalapb.gen(grpc = true, flatPackage = true) -> (sourceManaged in Compile).value
)
Now you can put your .proto files in src/main/protobuf and they will be picked up by ScalaPB.
I have an example Scala gRPC project here. It shows how to configure mutual TLS authentication, user sessions using JSON Web Tokens, a JSON gateway via grpc-gateway, and deployment to Kubernetes via Helm.
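Once the stubs are generated, wiring a service into a Netty-backed server looks roughly like this. A sketch: GreeterGrpc and GreeterImpl are hypothetical names ScalaPB would generate from a greeter.proto defining a Greeter service:
import io.grpc.ServerBuilder
import scala.concurrent.ExecutionContext

// Bind the (hypothetical) generated service definition and start listening.
val server = ServerBuilder
  .forPort(50051)
  .addService(GreeterGrpc.bindService(new GreeterImpl, ExecutionContext.global))
  .build()
  .start()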
I actually faced a couple of problems myself trying to migrate from Gradle to SBT.
Like you said, the sbt-protobuf plugin doesn't have any grpc-specific settings, yet it's possible. Here are a couple of settings you should double-check:
Set the path and version of your protoc:
version in PB.protobufConfig := "3.0.0"
protoc in PB.protobufConfig := PATH_PROTOC
If needed, set the location of your .proto files (the default is src/main/protobuf):
sourceDirectory in PB.protobufConfig := baseDirectory.value / "src" / "main" / "proto"
Finally, like Eric Anderson said, set the extra protoc options used by grpc-java. The first option sets the path of your protoc-gen-grpc-java plugin binary; the second sets the grpc-java output path to the same one sbt-protobuf uses:
protocOptions in PB.protobufConfig ++= Seq(
"--plugin=protoc-gen-grpc-java=" + PATH_GRPC_JAVA_PLUGIN,
"--grpc-java_out=" + baseDirectory.value + "/target/src_managed/main/compiled_protobuf")
I ended up putting together a repository with all of this sorted out. Here it is; hope it helps!
I'm not familiar with sbt, but it seems sbt-protobuf does not natively support protoc plugins or using the prebuilt protoc or protoc-gen-grpc-java binaries. You will need to pass the necessary flags manually.
Something like this (untested):
protocOptions in PB.protobufConfig ++= Seq(
"--plugin=protoc-gen-grpc-java=path/to/protoc-gen-grpc-java", "--grpc-java_out=path/to/output/folder")
You would need to change the "path/to" parts to fit your system.
My Play! Java project uses another Play project (module) as a dependency.
After moving from Play 2.2 to Play 2.3, assets from the subproject are no longer seen.
In build.sbt I added the jar with assets to the dependencies:
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
javaWs,
"com.company" % "project-sub-module_2.11" % "2.3.3"
"com.company" % "project-sub-module_2.11" % "2.3.3" artifacts(Artifact("project-sub-module_2.11","asset", "jar", "assets"))
)
I can see this jar in the dependencies, but its contents do not seem to appear in the public directory when launching the "run" command.
I think I need to add something like
packagedArtifacts in publish := {
val artifacts: Map[sbt.Artifact, java.io.File] = (packagedArtifacts in publish).value
val assets: java.io.File = (playPackageAssets in Compile).value
artifacts + (Artifact("project-sub-module_2.11", "asset", "jar", "assets") -> assets)
}
but for the compilation process.
Thanks in advance!
@mount_ash is right about using
"com.company" % "project-sub-module_2.11" % "2.3.3" classifier "assets"
in your build.sbt to import assets into your project.
On the other hand, to publish your assets when compiling, you need to add the following to the build.sbt of your modules:
packagedArtifacts += ((artifact in playPackageAssets).value -> playPackageAssets.value)
By following these two things I was able to use/discover the assets of my module X in my application host Y.
To solve the above problem, the code should be like this:
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
javaWs,
"com.company" % "project-sub-module_2.11" % "2.3.3"
"com.company" % "project-sub-module_2.11" % "2.3.3" classifier "assets"
)
Here's an example build.sbt:
import AssemblyKeys._
assemblySettings
buildInfoSettings
net.virtualvoid.sbt.graph.Plugin.graphSettings
name := "scala-app-template"
version := "0.1"
scalaVersion := "2.9.3"
val FunnyRuntime = config("funnyruntime") extend(Compile)
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "provided"
sourceGenerators in Compile <+= buildInfo
buildInfoPackage := "com.psnively"
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, target)
assembleArtifact in packageScala := false
val root = project.in(file(".")).
configs(FunnyRuntime).
settings(inConfig(FunnyRuntime)(Classpaths.configSettings ++ baseAssemblySettings ++ Seq(
libraryDependencies += "org.spark-project" %% "spark-core" % "0.7.3" % "funnyruntime"
)): _*)
The goal is to have spark-core "provided" so it and its dependencies are not included in the assembly artifact, but to reinclude them on the runtime classpath for the run- and test-related tasks.
It seems that using a custom scope should ultimately help, but I'm stymied on how to actually make the default/global run/test tasks use the custom libraryDependencies and, ideally, override the default. I've tried things including:
(run in Global) := (run in FunnyRuntime)
and the like to no avail.
To summarize: this feels essentially like a generalization of the web case, where the servlet-api is in "provided" scope and run/test tasks generally fork a servlet container that really does provide the servlet-api to the running code. The only difference here is that I'm not forking off a separate JVM/environment; I just want to manually augment those tasks' classpaths, effectively "undoing" the "provided" scope, but in a way that continues to exclude the dependency from the assembly artifact.
For a similar case, I used the following in assembly.sbt:
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
and now the 'run' task uses all the libraries, including the ones marked with "provided". No further change was necessary.
Update:
@rob's solution seems to be the only one working on the latest SBT version; just add to the settings in build.sbt:
run in Compile := Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run)).evaluated,
runMain in Compile := Defaults.runMainTask(fullClasspath in Compile, runner in(Compile, run)).evaluated
Adding to @douglaz' answer,
runMain in Compile <<= Defaults.runMainTask(fullClasspath in Compile, runner in (Compile, run))
is the corresponding fix for the runMain task.
Another option is to create separate sbt projects for assembly vs. run/test. This allows you to run sbt assemblyProj/assembly to build a fat jar for deploying with spark-submit, as well as sbt runTestProj/run to run directly via sbt with Spark embedded. As added benefits, runTestProj will work without modification in IntelliJ, and a separate main class can be defined for each project, e.g. to specify the Spark master in code when running with sbt.
val sparkDep = "org.apache.spark" %% "spark-core" % sparkVersion
val commonSettings = Seq(
name := "Project",
libraryDependencies ++= Seq(...) // Common deps
)
// Project for running via spark-submit
lazy val assemblyProj = (project in file("proj-dir"))
.settings(
commonSettings,
assembly / mainClass := Some("com.example.Main"),
libraryDependencies += sparkDep % "provided"
)
// Project for running via sbt with embedded spark
lazy val runTestProj = (project in file("proj-dir"))
.settings(
// Projects' target dirs can't overlap
target := target.value.toPath.resolveSibling("target-runtest").toFile,
commonSettings,
// If separate main file needed, e.g. for specifying spark master in code
Compile / run / mainClass := Some("com.example.RunMain"),
libraryDependencies += sparkDep
)
If you use the sbt-revolver plugin, here is a solution for its "reStart" task:
fullClasspath in Revolver.reStart <<= fullClasspath in Compile
Update: for sbt 1.0 you may use the new assignment form:
fullClasspath in Revolver.reStart := (fullClasspath in Compile).value