scalapb how to generate code from protobuf files in test directory? - sbt

I'd like to generate code from the protobuf files in my test directory:
project/test/protobuf/myproto.proto
This doesn't work:
PB.targets in Test := Seq(
  scalapb.gen() -> (sourceManaged in Test).value
)
It looks like ScalaPB only generates files for protos in the src/main/protobuf directory.

You need to enable the ScalaPB code generator for your Test configuration. Add this to build.sbt:
Project.inConfig(Test)(sbtprotoc.ProtocPlugin.protobufConfigSettings)
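Putting the two pieces together, a minimal build.sbt sketch might look like the following (untested; adapt paths and the proto file name to your project):

```scala
// Enable the protoc pipeline for the Test configuration in addition to Compile.
Project.inConfig(Test)(sbtprotoc.ProtocPlugin.protobufConfigSettings)

// Generate Scala sources from the .proto files under the Test configuration
// into Test's managed-sources directory, so they compile with the test code.
PB.targets in Test := Seq(
  scalapb.gen() -> (sourceManaged in Test).value
)
```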

Related

Get scrooge to generate source files in test phase?

I have a multi module build that looks kind of like:
lazy val root = (project in file(".")).
  settings(common).
  aggregate(finagle_core, finagle_thrift)

lazy val finagle_core =
  project.
    settings(common).
    settings(Seq(
      name := "finagle-core",
      libraryDependencies ++= Dependencies.finagle
    ))

lazy val finagle_thrift =
  project.
    settings(common).
    settings(Seq(
      name := "finagle-thrift",
      libraryDependencies ++= Dependencies.finagleThrift,
      scroogeThriftSourceFolder in Test <<= baseDirectory { base =>
        base / "target/thrift_external/"
      },
      scroogeThriftDependencies in Test := Seq(
        "external-client"
      ),
      scroogeBuildOptions in Test := Seq(
        WithFinagle
      )
    )).dependsOn(finagle_core)
Here finagle_thrift depends on a jar file, external-client, that contains Thrift files. I want the build to extract those Thrift files to target/thrift_external and compile them into a client.
This does work; however, I have to run sbt twice for it to succeed. The first time I run sbt, it doesn't extract the files; the second time it does. I am at a loss as to why that is happening.
==
EDIT:
I see what's happening. It does unpack the dependencies on test; however, because the settings are evaluated before the unpack, the build doesn't see the list of generated files. The second time it runs, everything is already extracted, so it picks up the Thrift files.
==
EDIT 2:
I solved this in a super janky way:
addCommandAlias("build", "; test:scroogeUnpackDeps; compile")
Now the Thrift files get unpacked first, then compiled.
SBT resolves the scroogeThriftSourceFolder directory when it loads the build (before running any tasks), at which point the external files are not there yet.
Performing a reload will make it discover the downloaded files:
sbt scroogeUnpackDeps reload compile
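If you want that three-step sequence as a single command, the reload can be folded into an alias as well (a sketch in the spirit of the workaround above; the alias name thriftBuild is just illustrative):

```scala
// Unpack the external Thrift files, reload so the source folder is
// re-scanned, then compile the test configuration.
addCommandAlias("thriftBuild", "; test:scroogeUnpackDeps; reload; test:compile")
```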

SBT protobuf grpc configuration

I'm new to SBT, and I'm trying to convert a Gradle protobuf/grpc configuration to SBT.
I wonder if the Scala community has done this before me?
I've tried this plugin https://github.com/sbt/sbt-protobuf, but it does not provide any configuration to enable gRPC compilation...
Any help appreciated.
You can use ScalaPB to generate the gRPC stubs for Scala. First, add the plugin to your project/plugins.sbt:
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.1")
libraryDependencies += "com.trueaccord.scalapb" %% "compilerplugin" % "0.5.43"
Then, add this to your build.sbt:
libraryDependencies ++= Seq(
  "io.grpc" % "grpc-netty" % "1.0.1",
  "io.grpc" % "grpc-stub" % "1.0.1",
  "io.grpc" % "grpc-auth" % "1.0.1",
  "com.trueaccord.scalapb" %% "scalapb-runtime-grpc" % "0.5.43",
  "io.netty" % "netty-tcnative-boringssl-static" % "1.1.33.Fork19", // SSL support
  "javassist" % "javassist" % "3.12.1.GA" // Improves Netty performance
)

PB.targets in Compile := Seq(
  scalapb.gen(grpc = true, flatPackage = true) -> (sourceManaged in Compile).value
)
Now you can put your .proto files in src/main/protobuf and they will be picked up by ScalaPB.
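For example, a minimal service definition dropped into src/main/protobuf would produce both the message classes and the gRPC stubs (the file name, package, and all message/service names below are purely illustrative):

```protobuf
syntax = "proto3";

package example;

message HelloRequest {
  string name = 1;
}

message HelloReply {
  string message = 1;
}

service Greeter {
  // With scalapb.gen(grpc = true), a client stub and a service trait
  // are generated for this service alongside the message classes.
  rpc SayHello (HelloRequest) returns (HelloReply);
}
```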
I have an example Scala gRPC project here. It shows how to configure mutual TLS authentication, user sessions using JSON Web Tokens, a JSON gateway via grpc-gateway, and deployment to Kubernetes via Helm.
I actually faced a couple of problems myself trying to migrate from Gradle to SBT.
Like you said, sbt-protobuf plugin doesn't have any grpc specific settings, yet it's possible, here are couple of settings you should double check:
Set the path and version of your protoc:
version in PB.protobufConfig := "3.0.0"
protoc in PB.protobufConfig := PATH_PROTOC
If needed set the location of your .proto files (default is src/main/protobuf):
sourceDirectory in PB.protobufConfig := baseDirectory.value / "src" / "main" / "proto"
Finally, like Eric Anderson said, set the extra protoc options used by grpc-java. The first option sets the path to your protoc-gen-grpc-java plugin binary; the second sets the grpc-java output path to the same directory sbt-protobuf uses:
protocOptions in PB.protobufConfig ++= Seq(
  "--plugin=protoc-gen-grpc-java=" + PATH_GRPC_JAVA_PLUGIN,
  "--grpc-java_out=" + baseDirectory.value + "/target/src_managed/main/compiled_protobuf"
)
I ended up putting a repository with all of this sorted out. Here it is, hope it helps!
I'm not familiar with sbt, but it seems sbt-protobuf does not natively support protoc plugins or using the prebuilt protoc or protoc-gen-grpc-java binaries. You will need to pass the necessary flags manually.
Something like this (untested):
protocOptions in PB.protobufConfig ++= Seq(
  "--plugin=protoc-gen-grpc-java=path/to/protoc-gen-grpc-java",
  "--grpc-java_out=path/to/output/folder"
)
You would need to change the "path/to" parts to fit your system.

Is it possible to skip code generation for included thrift files in Scrooge?

The Scrooge SBT plugin has the option to include Thrift IDL files from library dependencies (jar files). Often these jar files already contain the generated sources. If I include a Thrift IDL, I don't want to generate these sources again. Otherwise they will be duplicated.
shared.thrift
namespace java me.shared

struct Foo {
  1: string id
}
shared.jar
  me/
    shared/
      Foo.scala
  shared.thrift
So when my project depends on shared.jar and I include shared.thrift in another Thrift IDL file, I don't want Scrooge to generate Foo.scala again. What's the most straightforward way to achieve this?
It was actually straightforward:
scroogeThriftSources in Compile ~= { (sources: Seq[File]) =>
  sources.filterNot(_.getName.contains("shared.thrift"))
}

Adding /etc/<application> to the classpath in sbt-native-packager for debian:package-bin

So I'm using packageArchetype.java_server and have set up my mappings so the files from src/main/resources go into the /etc/<application> folder in the Debian package. I'm using sbt debian:package-bin to create the package.
The trouble is that when I use sbt run, it picks up src/main/resources from the classpath. What's the right way to get sbt-native-packager to put /etc/<application> on the classpath for my configuration and logging files?
plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-M2")
build.sbt
...
packageArchetype.java_server
packageDescription := "Some Description"
packageSummary := "My App Daemon"
maintainer := "Me <me@example.org>"
mappings in Universal ++= Seq(
  file("src/main/resources/application.conf") -> "conf/application.conf",
  file("src/main/resources/logback.xml") -> "conf/logback.xml"
)
....
I took a slightly different approach. Since sbt-native-packager keeps those two files (application.conf and logback.xml) in my package distribution jar file, I really just wanted a way to overwrite (or merge) these files from /etc. I kept the two mappings above and just added the following:
src/main/templates/etc-default:
-Dmyapplication.config=/etc/${{app_name}}/application.conf
-Dlogback.configurationFile=/etc/${{app_name}}/logback.xml
Then within my code (using Typesafe Config Libraries):
lazy val baseConfig = ConfigFactory.load() // defaults from src/main/resources

// For use in the Debian packaging script (see etc-default).
val systemConfig = Option(System.getProperty("myapplication.config")) match {
  case Some(cfile) => ConfigFactory.parseFile(new File(cfile)).withFallback(baseConfig)
  case None        => baseConfig
}
And of course -Dlogback.configurationFile is a system property used by Logback itself.
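The lookup pattern above can be illustrated without the Typesafe Config dependency. This is just a sketch: it resolves the external config path from the JVM property set by the etc-default template, falling back to a bundled default (the property name is taken from the answer above; everything else is illustrative):

```scala
// Sketch of the override pattern: pick the external config path if the
// JVM property is set (as it is when the Debian package's start script
// runs), otherwise fall back to the bundled default (as under `sbt run`).
object ConfigPathResolver {
  val PropertyName = "myapplication.config" // matches the etc-default entry above

  def resolve(default: String): String =
    Option(System.getProperty(PropertyName)).getOrElse(default)
}
```

Under the installed package the daemon starts with -Dmyapplication.config=/etc/<app>/application.conf, so the /etc path wins; during development the property is absent and the classpath default is used.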

How to create a basic project setup using sbt-native-packager

I have a project setup working with SBT to the point of creating sub-project artifacts.
I have been searching for a way to create a JAR file that contains sub-project JAR files along with some meta information. Based on suggestions, I looked at sbt-native-packager and it seems to have the capabilities I need.
I am wondering if someone would be willing to help me along this path by providing tips on creating a skeleton package specification for the plugin.
I think my configuration is pretty simple.
What I want to end up with is a JAR file with the following contents:
/manifest.xml
/module.xml
/modules/
  sub-project-one.jar
  sub-project-two.jar
  sub-project-three.jar
Both the manifest.xml and module.xml files will be generated from project information. The name of the resulting JAR file will be based on the name of the root project, its version, and a suffix "nkp.jar" (e.g. overlay-1.0.1.nkp.jar).
Thanks in advance for any help getting me going with this.
-- Randy
Here's the basics of what you want:
val createManifestXml = taskKey[File]("Creates the packaged manifest.xml")
val createModuleXml = taskKey[File]("Creates the module.xml file.")

// TODO - Define createManifestXml + createModuleXml

mappings in Universal ++= Seq(
  createManifestXml.value -> "manifest.xml",
  createModuleXml.value -> "module.xml"
)

mappings in Universal ++= {
  val moduleJars = Seq(
    (packageBin in subProjectOne).value,
    (packageBin in subProjectTwo).value,
    (packageBin in subProjectThree).value
  )
  moduleJars map { jar =>
    jar -> s"modules/${jar.getName}"
  }
}
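To fill in the TODO, one of the two tasks could be sketched as follows. This is illustrative only, not the answer's actual implementation: the XML layout and field names are assumptions, and you would adapt them to whatever your manifest format requires:

```scala
// Illustrative sketch: write a manifest.xml into the managed-resources
// area and return the File so the mapping above can package it.
createManifestXml := {
  val out = (resourceManaged in Compile).value / "manifest.xml"
  val xml =
    s"""<manifest>
       |  <name>${name.value}</name>
       |  <version>${version.value}</version>
       |</manifest>""".stripMargin
  IO.write(out, xml)
  out
}
```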
This will ensure that the tgz/txz/zip can be generated. You can then either use the generic universal-to-msi and universal-to-rpm/deb mappings or create that mapping by hand if you desire.
Hope that helps!