I'm new to SBT, and I'm trying to convert a Gradle protobuf/gRPC configuration to SBT.
I wonder if the Scala community has done this before me?
I've tried this plugin https://github.com/sbt/sbt-protobuf, but it does not provide any configuration to enable gRPC compilation...
Any help appreciated.
You can use ScalaPB to generate the gRPC stubs for Scala. First, add the plugin to your project/plugins.sbt:
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.1")
libraryDependencies += "com.trueaccord.scalapb" %% "compilerplugin" % "0.5.43"
Then, add this to your build.sbt:
libraryDependencies ++= Seq(
"io.grpc" % "grpc-netty" % "1.0.1",
"io.grpc" % "grpc-stub" % "1.0.1",
"io.grpc" % "grpc-auth" % "1.0.1",
"com.trueaccord.scalapb" %% "scalapb-runtime-grpc" % "0.5.43",
"io.netty" % "netty-tcnative-boringssl-static" % "1.1.33.Fork19", // SSL support
"javassist" % "javassist" % "3.12.1.GA" // Improves Netty performance
)
PB.targets in Compile := Seq(
scalapb.gen(grpc = true, flatPackage = true) -> (sourceManaged in Compile).value
)
Now you can put your .proto files in src/main/protobuf and they will be picked up by ScalaPB.
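With those settings, a minimal server sketch looks roughly like this, assuming a hypothetical greeter.proto that defines a Greeter service with a SayHello RPC (the generated names depend on your proto file):
import scala.concurrent.{ExecutionContext, Future}
import io.grpc.ServerBuilder

// GreeterGrpc, HelloRequest and HelloReply are generated by ScalaPB
// from the hypothetical greeter.proto.
class GreeterImpl extends GreeterGrpc.Greeter {
  override def sayHello(req: HelloRequest): Future[HelloReply] =
    Future.successful(HelloReply(message = s"Hello, ${req.name}"))
}

object Server extends App {
  val server = ServerBuilder
    .forPort(50051)
    .addService(GreeterGrpc.bindService(new GreeterImpl, ExecutionContext.global))
    .build()
    .start()
  server.awaitTermination()
}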
I have an example Scala gRPC project here. It shows how to configure mutual TLS authentication, user sessions using JSON Web Tokens, a JSON gateway via grpc-gateway, and deployment to Kubernetes via Helm.
I actually faced a couple of problems myself trying to migrate from Gradle to SBT.
Like you said, the sbt-protobuf plugin doesn't have any gRPC-specific settings, yet it's possible; here are a couple of settings you should double-check:
Set the path and version of your protoc:
version in PB.protobufConfig := "3.0.0"
protoc in PB.protobufConfig := PATH_PROTOC
If needed, set the location of your .proto files (the default is src/main/protobuf):
sourceDirectory in PB.protobufConfig := baseDirectory.value / "src" / "main" / "proto"
Finally, like Eric Anderson said, set the extra protoc options used by grpc-java. The first option sets the path to your protoc-gen-grpc-java plugin binary; the second sets the grpc-java output path to the same one sbt-protobuf uses:
protocOptions in PB.protobufConfig ++= Seq(
"--plugin=protoc-gen-grpc-java=" + PATH_GRPC_JAVA_PLUGIN,
"--grpc-java_out=" + baseDirectory.value + "/target/src_managed/main/compiled_protobuf")
I ended up putting together a repository with all of this sorted out. Here it is, hope it helps!
I'm not familiar with sbt, but it seems sbt-protobuf does not natively support protoc plugins or using the prebuilt protoc or protoc-gen-grpc-java binaries. You will need to pass the necessary flags manually.
Something like this (untested):
protocOptions in PB.protobufConfig ++= Seq(
"--plugin=protoc-gen-grpc-java=path/to/protoc-gen-grpc-java", "--grpc-java_out=path/to/output/folder")
You would need to change the "path/to" parts to fit your system.
Related
I'm attempting to use the sbt-aspectj plugin with the sbt native packager and am running into an issue where the associated -javaagent path to the AspectJ load-time weaver jar references an ivy cache location rather than something packaged.
That is, after running sbt stage, executing the staged application via bash -x target/universal/stage/bin/myapp/ results in this javaagent:
exec java -javaagent:/home/myuser/.ivy2/cache/org.aspectj/aspectjweaver/jars/aspectjweaver-1.8.10.jar -cp /home/myuser/myproject/target/universal/stage/lib/org.aspectj.aspectjweaver-1.8.10.jar:/home/myuser/myproject/target/universal/stage/lib/otherlibs.jar myorg.MyMainApp args
My target platform is Heroku, where the artifacts are built before being effectively 'pushed' out to individual 'dynos' (very analogous to a Docker setup). The issue here is that the resulting -javaagent path was valid on the machine on which the 'staged' deployable was built, but will not exist where it's ultimately run.
How can one configure the sbt-aspectj plugin to reference a packaged lib rather than one from the ivy cache?
Current configuration:
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.10.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.1.5")
build.sbt (selected parts):
import com.typesafe.sbt.SbtAspectj._
lazy val root = (project in file(".")).settings(
aspectjSettings,
javaOptions in Runtime ++= { AspectjKeys.weaverOptions in Aspectj }.value,
// see: https://github.com/sbt/sbt-native-packager/issues/598#issuecomment-111584866
javaOptions in Universal ++= { AspectjKeys.weaverOptions in Aspectj }.value
.map { "-J" + _ },
fork in run := true
)
Update
I've tried several approaches including pulling the relevant output for javaOptions from existing mappings, but the result is a cyclical dependency error thrown by sbt.
I have something that technically solves my problem but feels unsatisfactory. As of now, I'm including an aspectjweaver dependency directly and using the sbt-native-packager concept of bashScriptExtraDefines to append an appropriate javaagent:
updated build.sbt:
import com.typesafe.sbt.SbtAspectj._
lazy val root = (project in file(".")).settings(
aspectjSettings,
bashScriptExtraDefines += scriptClasspath.value
.filter(_.contains("aspectjweaver"))
.headOption
.map("addJava -javaagent:${lib_dir}/" + _)
.getOrElse(""),
fork in run := true
)
You can add the following settings in your sbt config:
.settings(
retrieveManaged := true,
libraryDependencies += "org.aspectj" % "aspectjweaver" % aspectJWeaverV)
AspectJ weaver JAR will be copied to ./lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-[aspectJWeaverV].jar in your project root.
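One way to use that copy is to point the agent path at it instead of the ivy cache, something like this (a sketch, untested; assumes aspectJWeaverV is defined in your build):
javaOptions in run += s"-javaagent:${baseDirectory.value}/lib_managed/jars/org.aspectj/aspectjweaver/aspectjweaver-$aspectJWeaverV.jar"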
I actually solved this by using the sbt-javaagent plugin to add agents to the runtime.
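For reference, a minimal sketch of that approach (the plugin version here is an assumption; check the sbt-javaagent README for the current one):
// project/plugins.sbt
addSbtPlugin("com.lightbend.sbt" % "sbt-javaagent" % "0.1.4")

// build.sbt: the plugin resolves the agent and wires the -javaagent flag
// into the native-packager start script against the packaged jar.
lazy val root = (project in file("."))
  .enablePlugins(JavaAppPackaging, JavaAgent)
  .settings(
    javaAgents += "org.aspectj" % "aspectjweaver" % "1.8.10"
  )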
I've used sbt-eclipse in the past to successfully import a simple sbt project into Eclipse. I'm now trying to leverage the CrossProject mechanism of sbt to use the Scala.js environment (it makes 2 subprojects in sbt, one for JavaScript and one for JVM code). The recommendation (see SBT docs link here) is to add the setting 'EclipseKeys.useProjectId := true' in the build.sbt file to support importing (now) 2 projects into one Eclipse project.
I then give the 'eclipse' command in a running SBT session to create my Eclipse project, then launch Eclipse and attempt to import this new project. When I do this, the import dialog wizard in Eclipse does show me two subprojects, but when I try to finish the import, Eclipse complains that the project already exists, and I get two strange-looking links in my Eclipse project that seem to do nothing.
What is the correct procedure for getting a CrossProject sbt build into eclipse?
OK, so it seems Eclipse did not like that I had only one 'name' for the project in the shared settings area of the build.sbt. I had this:
lazy val sp = crossProject.in(file(".")).
settings(
version := "0.1",
name := "SJSTut",
scalaVersion := "2.11.7"
).
jvmSettings(
// Add JVM-specific settings here
libraryDependencies ++= Seq(...)
).
jsSettings(
// Add JS-specific settings here
libraryDependencies ++= Seq(...)
)
and what I should have done was this:
lazy val sp = crossProject.in(file(".")).
settings(
version := "0.1",
scalaVersion := "2.11.7"
).
jvmSettings(
// Add JVM-specific settings here
name := "SJSTutJVM",
libraryDependencies ++= Seq(...)
).
jsSettings(
// Add JS-specific settings here
name := "SJSTutJS",
libraryDependencies ++= Seq(...)
)
Note the removal of the 'name' assignment from the shared settings and, instead, its placement in both the jvmSettings and jsSettings areas with unique names.
Now I'm able to pull this into eclipse (as 2 separate projects). If anyone else has a better setup, I'd love to hear about it.
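For completeness, the sbt-eclipse pieces that go with this (the plugin version is an assumption; use whatever you already have):
// project/plugins.sbt
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")

// build.sbt, per the SBT docs recommendation for cross projects
EclipseKeys.useProjectId := true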
Environment: Play framework; activator-1.3.2; Play-Java Web Application
build.sbt -
name := """ProjectDemoNew"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.1"
resolvers +="Local Maven Repository" at "file:///home/shiva/.m2/repository"
libraryDependencies ++= Seq(
javaJdbc,
javaEbean,
cache,
javaWs,
"org.springframework" % "spring-context" % "3.2.3.RELEASE",
"org.springframework" % "spring-aop" % "3.2.3.RELEASE",
"org.springframework" % "spring-expression" % "3.2.3.RELEASE",
"org.springframework" % "spring-test" % "3.2.3.RELEASE",
"com.mycomp.config"%"platform-config"%"0.0.1-SNAPSHOT"
)
$ activator run
gives the following error(s) when the internet is down:
--
--
[info] You probably access the destination server through a proxy server that is not well configured.
[warn] Host repo.typesafe.com not found. url=https://repo.typesafe.com/typesafe/releases/com/mycomp/conf/i/platform-config/0.0.1-SNAPSHOT/...-SNAPSHOT.pom
--
--
I am not seeing any errors when the internet is up.
There are lots of posts, but the answers seem to vary a lot.
All jars (Spring, application-specific, third party, etc.) are in my local repository, but the build always connects to the internet to refresh dependencies, and it is slow when the internet connection is poor.
How do I make Play resolve from the local repository without going to the internet, i.e. build offline? This would let me build quickly with no or weak internet connectivity.
It automatically checks the internet for SNAPSHOT dependencies.
If you don't want it to, add:
offline := true
to your build.sbt file.
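If you'd rather apply this to every build on your machine instead of one project, a global settings file works too (path shown for sbt 0.13; adjust for your sbt version):
// ~/.sbt/0.13/global.sbt
offline := true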
So I'm using packageArchetype.java_server and set up my mappings so the files from "src/main/resources" go into the "/etc/" folder in the Debian package. I'm using "sbt debian:package-bin" to create the package.
The trouble is that when I use "sbt run" it picks up src/main/resources from the classpath. What's the right way to get sbt-native-packager to expose /etc/ as a classpath location for my configuration and logging files?
plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "0.7.0-M2")
build.sbt
...
packageArchetype.java_server
packageDescription := "Some Description"
packageSummary := "My App Daemon"
maintainer := "Me<me#example.org>"
mappings in Universal ++= Seq(
file("src/main/resources/application.conf") -> "conf/application.conf",
file("src/main/resources/logback.xml") -> "conf/logback.xml"
)
....
I took a slightly different approach. Since sbt-native-packager keeps those two files (application.conf and logback.xml) in my package distribution jar file, I really just wanted a way to overwrite (or merge) these files from /etc. I kept the two mappings above and just added the following:
src/main/templates/etc-default:
-Dmyapplication.config=/etc/${{app_name}}/application.conf
-Dlogback.configurationFile=/etc/${{app_name}}/logback.xml
Then within my code (using Typesafe Config Libraries):
import java.io.File
import com.typesafe.config.ConfigFactory

lazy val baseConfig = ConfigFactory.load() // defaults from src/main/resources

// For use in the Debian packaging script (see etc-default).
val systemConfig = Option(System.getProperty("myapplication.config")) match {
  case Some(cfile) => ConfigFactory.parseFile(new File(cfile)).withFallback(baseConfig)
  case None => baseConfig
}
And of course -Dlogback.configurationFile is a system property used by Logback.
I have the following project build:
import sbt._
import Keys._
object ProjectBuild extends Build {
val buildVersion = "0.1-SNAPSHOT"
val delvingReleases = "Delving Releases Repository" at "http://development.delving.org:8081/nexus/content/repositories/releases"
val delvingSnapshots = "Delving Snapshot Repository" at "http://development.delving.org:8081/nexus/content/repositories/snapshots"
val delvingRepository = if (buildVersion.endsWith("SNAPSHOT")) delvingSnapshots else delvingReleases
lazy val root = Project(
id = "basex-scala-client",
base = file(".")
).settings(
organization := "eu.delving",
version := buildVersion,
resolvers += "BaseX Repository" at "http://files.basex.org/maven",
libraryDependencies += "org.basex" % "basex" % "7.2.1",
libraryDependencies += "org.specs2" %% "specs2" % "1.7.1" % "test",
publishTo := Some(delvingRepository),
credentials += Credentials(Path.userHome / ".ivy2" / ".credentials"),
publishMavenStyle := true
)
}
When I include the resulting library in another project, like so:
"eu.delving" %% "basex-scala-client" % "0.1-SNAPSHOT"
and I try to build that project, I get an error telling me that the "org.basex" % "basex" % "7.2.1" library referenced by that project cannot be found.
I have to go and add the resolver in the "client" project in order for the library to be found. Is there a way to avoid this?
There are no transitive resolvers, so the build user needs to know the resolvers of all the transitive library dependencies. The benefit of this approach is that, for open-source projects, it encourages publishing to one of the known repositories connected to the known resolvers.
For corporate usage, it lets you prevent your traffic from going to unknown places introduced by dependencies further down the graph.
To share resolver settings within the organization, you can create an org-wide plugin.
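Such a plugin is just an AutoPlugin published as its own artifact and pulled in via addSbtPlugin in each build. A minimal sketch, with hypothetical names and URL:
import sbt._
import Keys._

// Adds the organization's resolvers to every project that has this
// plugin on its classpath, so individual builds don't have to repeat them.
object OrgResolversPlugin extends AutoPlugin {
  override def trigger = allRequirements
  override def projectSettings = Seq(
    resolvers += "Org Releases" at "https://nexus.example.com/content/repositories/releases"
  )
}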
Has this situation changed in the last 7 years, 10 months? I have a transitive library dependency at a custom repository. For its immediate client, I specify a resolver and the repository is written to the client's pom file when published. The client's client does not seem to use that information to find the transitive library. I have to "add the resolvers upstream by hand".