Different scalac options for different scopes or tasks? - sbt

I am trying to use a compiler plugin with sbt (I'm on 0.13.5), passed along in my build.sbt as:
autoCompilerPlugins := true
scalacOptions += "-Xplugin:myCompilerPluginJar.jar"
This works, the plugin runs, however I would really like to only run the plugin on some explicit compiles (perhaps with a scoped compile task or a custom task).
If I try something like:
val PluginConfig = config("plugin-config") extend(Compile)
autoCompilerPlugins := true
scalacOptions in PluginConfig += "-Xplugin:myCompilerPluginJar.jar"
The plugin does not run on "plugin-config:compile". In fact if I have
scalacOptions in Compile += "-Xplugin:myCompilerPluginJar.jar"
The plugin still runs on "test:compile" and on compile in any other configuration. I'm probably not understanding something about configs/scopes correctly.
I also tried:
lazy val pluginCommand = Command.command("plugincompile") { state =>
  runTask(
    compile in Compile,
    append(Seq(scalacOptions in Compile += "-Xplugin:myCompilerPluginJar.jar"), state)
  )
  state
}
commands += pluginCommand
But the plugin doesn't actually run on that command, so again I am probably not understanding something there.
Any and all help welcome.

So I have come to a hacky solution; I thought I would share it here in case anyone else stumbles upon this question.
val safeCompile = TaskKey[Unit]("safeCompile", "Compiles, catching errors.")
safeCompile := (compile in Compile).result.value.toEither.fold(
  l => {
    println("Compilation failed.")
  },
  r => {
    println("Compilation success. " + r)
  }
)
//Hack to allow "-deprecation" and "-unchecked" in scalacOptions by default
scalacOptions <<= scalacOptions map { current: Seq[String] =>
  val default = "-deprecation" :: "-unchecked" :: Nil
  if (current.contains("-Xplugin:myCompilerPluginJar.jar")) current else default
}
addCommandAlias("depcheck", "; set scalacOptions := Seq(\"-deprecation\", \"-unchecked\", \"-Xplugin:myCompilerPluginJar.jar\"); safeCompile; set scalacOptions := Seq(\"-deprecation\", \"-unchecked\")")
As a quick guide, this code:
Defines a custom task "safeCompile" that runs the "Compile:compile" task, but succeeds even on errors (this is needed so that the sequence of commands defined later on doesn't break on compilation failure).
Declares "scalacOptions" to depend on a function that checks whether the plugin is turned on (leaving the options untouched if it is) and otherwise sets the options to the defaults I want for the project (Seq("-deprecation", "-unchecked")). This is a hack so that these defaults are applied out of the box and so that a bare "scalacOptions :=" definition doesn't override the settings made in the aliased command sequence. (Appending and then calling Seq.distinct might be a nicer way to do this hacky part; a sketch of that follows this list.)
Defines an aliased command sequence that: turns the plugin on, safeCompiles, turns the plugin off.
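One possible shape of that nicer variant (an untested sketch; self-referencing scalacOptions inside := refers to its previously defined value, so the defaults get appended and de-duplicated instead of replacing whatever is already set):
scalacOptions := (scalacOptions.value ++ Seq("-deprecation", "-unchecked")).distinct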
Comments are welcome, and if you get something cleaner to work, please share!

Related

How can I make a task depend on another task?

I'm new to sbt and I'm trying to create a script that either deploys my application or deploys and runs it.
What already works for me is
sbt deploy
which will successfully deploy the final .jar file to the remote location.
However, I don't know how to make deployAndRunTask dependent on deployTask. I've tried several things but none of them worked so far.
My last hope was
deployAndRunTask := {
  val d = deployTask.value
}
However, this does not seem to work.
This is the build script I currently have, but sbt deploy-run only executes the deployAndRunTask task, not the deployTask.
// DEPLOYMENT
val deployTask = TaskKey[Unit]("deploy", "Copies assembly jar to remote location")
deployTask <<= assembly map { (asm) =>
  val account = "user#example.com"
  val local = asm.getPath
  val remote = account + ":" + "/home/user/" + asm.getName
  println(s"Copying: $local -> $remote")
  Seq("scp", local, remote) !!
}
val deployAndRunTask = TaskKey[Unit]("deploy-run", "Deploy and run application.")
deployAndRunTask := {
  val d = deployTask.value
}
deployAndRunTask <<= assembly map { (asm) =>
  println(s"Running the script ..")
}
What is the problem here?
The problem is that you define your task and then redefine it, so only the latter definition is taken into account. You cannot separate a task's definition from its dependency on another task. Also, you're using a couple of outdated things in sbt:
use the taskKey macro, and you don't need to spell out the task name, because it's the same as the key name:
val deploy = taskKey[Unit]("Copies assembly jar to remote location")
val deployAndRun = taskKey[Unit]("Deploy and run application.")
Then you can refer to them as deploy and deployAndRun, both in build.sbt and in the sbt shell.
replace <<= with := and keyname map { keyvalue => ... } with just keyname.value. Things become more concise and easier to write.
You can read more about Migrating from sbt 0.13.x.
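Applied to the deploy task from your build (reusing the deploy key defined above), the rewrite could look roughly like this — a sketch only; the scp invocation is copied unchanged, and depending on your sbt version you may need import scala.sys.process._ for !!:
deploy := {
  val asm = assembly.value
  val account = "user#example.com"
  val remote = account + ":" + "/home/user/" + asm.getName
  println(s"Copying: ${asm.getPath} -> $remote")
  Seq("scp", asm.getPath, remote).!!
}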
So here's your deployAndRun task definition with these changes:
deployAndRun := {
  val d = deploy.value
  val asm = assembly.value
  println(s"Running the script ..")
}
It depends on both the deploy and assembly tasks and will run them before doing anything else. You can also use dependsOn, but I think it's unnecessary here.
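If you did want to express the dependency with dependsOn, a rough sketch (redefining the task so it depends on the others) could be:
deployAndRun := {
  println(s"Running the script ..")
}
deployAndRun := deployAndRun.dependsOn(deploy, assembly).value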
You may also be interested in looking into Defining a sequential task with Def.sequential and Defining a dynamic task with Def.taskDyn.
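For reference, a Def.sequential variant might look roughly like this (a sketch; Def.sequential is available from sbt 0.13.8 and runs deploy to completion before the println):
deployAndRun := Def.sequential(
  deploy,
  Def.task { println(s"Running the script ..") }
).value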

Scaldi: couldn't find bindings defined in typesafe config

Here is the issue. Let's assume I have two mutable modules:
class DbModule extends Module {
  bind[JdbcBackend#Database] toProvider inject[JdbcDriver].backend.Database.forURL(
    inject[String]("db.url"),
    inject[String]("db.username"),
    inject[String]("db.password"),
    null,
    inject[String]("db.driver")
  )
}
and here is the corresponding config:
resources/application.conf:
db {
  url = "postgres url"
  username = "db_user"
  password = "db_password"
  driver = "cc"
}
Somewhere in the code I do:
implicit val inj = TypesafeConfigInjector() :: new AppModule
However this injector gives the following exception:
scaldi.InjectException: No binding found with following identifiers:
 * TypeTagIdentifier(String)
 * StringIdentifier(db.url)
The order in Scaldi is important: the binding is resolved from left to right.
The :: operator, as stated in the docs, composes two injectors by inverting the operands. Thus, in your case, AppModule is resolved first, so the config parameters it tries to inject cannot be found.
To solve your problem, use the ++ operator to keep your injectors in order.
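Concretely, the composition from the question should then look something like:
implicit val inj = TypesafeConfigInjector() ++ new AppModule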
I hope this is helpful.

Setting a key in local.sbt so that it is applied in a multi-project build

How do I set a key in a local.sbt in such a way that every subproject finds it?
I'm trying to use the Coursier plugin in a multi-project build, but since I'm only testing it, I don't want to check it in to our git repo.
So I put the plugin in my project/local.sbt and tried to set coursierUseSbtCredentials := true in a local.sbt.
This has no visible effect.
The authenticated Nexus repository is defined in the commonSettings val in my build.sbt:
val commonSettings = Seq(
...
resolvers += "my-nexus" at "http://blah",
credentials += ...
)
which every subproject uses with .settings(commonSettings) (as per the best-practices guide).
If I put coursierUseSbtCredentials := true in commonSettings it does work, but then I'd have to add it to my build.sbt, which I would rather not do.
How do I set this key so that every subproject can see it, in such a way that it is external to the build.sbt file (e.g. in a local.sbt)?
Create a local plugin at project/SetupCoursierPlugin.scala:
import sbt._
import coursier.CoursierPlugin, CoursierPlugin.autoImport._

object SetupCoursierPlugin extends AutoPlugin {
  override def requires = CoursierPlugin
  override def trigger = allRequirements
  override def projectSettings = Seq(
    coursierUseSbtCredentials := true
  )
}
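Because the plugin is triggered automatically (requires = CoursierPlugin, trigger = allRequirements), its projectSettings are added to every project that has CoursierPlugin enabled, so nothing needs to change in build.sbt, and since the file lives under project/ it can be kept out of version control just like a local.sbt.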

Sbt Plugin, Default File SettingKey

I have a settingKey that I have defined in project/build.scala
val databasePropertiesFile = settingKey[File]("The file we use to grab the database login configuration.")
And I want to assign it a default value based on sourceDirectory, something like this, but it doesn't compile:
databasePropertiesFile := {
  sourceDirectory / "db/devel.properties"
}
What is the magic I must perform to set a default File?
The magic is ".value": sourceDirectory is a SettingKey[File], and calling .value inside the := macro gives you the underlying File, which the / operator then extends:
databasePropertiesFile := {
  sourceDirectory.value / "db" / "devel.properties"
}

Cleanest way in Gradle to get the path to a jar file in the gradle dependency cache

I'm using Gradle to help automate Hadoop tasks. When calling Hadoop, I need to be able to pass it the paths of some jars that my code depends on, so that Hadoop can ship those dependencies along during the map/reduce phase.
I've figured out something that works, but it feels messy and I'm wondering if there's a feature I'm missing somewhere.
This is a simplified version of my gradle script that has a dependency on the solr 3.5.0 jar, and a findSolrJar task that iterates through all of the jar files in the configuration to find the right one:
apply plugin: 'groovy'
repositories {
    mavenCentral()
}
dependencies {
    compile 'org.apache.solr:solr-solrj:3.5.0'
}
task findSolrJar() {
    println project.configurations.compile*.toURI().find { URI uri -> new File(uri).name == 'solr-solrj-3.5.0.jar' }
}
running this gives me output like this:
gradle findSolrJar
file:/Users/tnaleid/.gradle/caches/artifacts-8/filestore/org.apache.solr/solr-solrj/3.5.0/jar/74cd28347239b64fcfc8c67c540d7a7179c926de/solr-solrj-3.5.0.jar
:findSolrJar UP-TO-DATE
BUILD SUCCESSFUL
Total time: 2.248 secs
Is there a better way to do this?
Your code can be simplified a bit, for example project.configurations.compile.find { it.name.startsWith("solr-solrj-") }.
You can also create a dedicated configuration for the artifact, to keep things clean, and use asPath if it is acceptable that it can potentially return several locations (which happens if the same jar is resolved in several places):
configurations {
    solr
}
dependencies {
    solr 'org.apache.solr:solr-solrj:3.5.0'
}
task findSolrJars() {
    println configurations.solr.asPath
}
To avoid copy-pasting, in case you also need that jar in the compile configuration, you can add the dedicated configuration to it, like:
dependencies {
    solr 'org.apache.solr:solr-solrj:3.5.0'
    compile configurations.solr.dependencies
}
I needed lombok.jar as a Java build flag for GWT builds, and this worked great!
configurations {
    lombok
}
dependencies {
    lombok 'org.projectlombok:lombok:+'
}
ext {
    lombok = configurations.lombok.asPath
}
compileGwt {
    jvmArgs "-javaagent:${lombok}=ECJ"
}
I was surprised that the resolution worked early enough in the configuration phase, but it does.
Here is how I did it:
project.buildscript.configurations.classpath.each {
String jarName = it.getName();
print jarName + ":"
}
I recently had this problem as well. If you are building a Java app, the problem at hand is normally that you want to get the group:module (groupId:artifactId) to path-to-jar mapping (i.e. the version is not a search criterion, since in one app there is normally only one version of each specific jar).
In my Gradle 5.1.1 (Kotlin-based) build I solved this problem with:
var spec2File: Map<String, File> = emptyMap()
configurations.compileClasspath {
    val s2f: MutableMap<ResolvedModuleVersion, File> = mutableMapOf()
    // https://discuss.gradle.org/t/map-dependency-instances-to-file-s-when-iterating-through-a-configuration/7158
    resolvedConfiguration.resolvedArtifacts.forEach({ ra: ResolvedArtifact ->
        s2f.put(ra.moduleVersion, ra.file)
    })
    spec2File = s2f.mapKeys({ "${it.key.id.group}:${it.key.id.name}" })
    spec2File.keys.sorted().forEach({ it -> println(it.toString() + " -> " + spec2File.get(it)) })
}
The output would be some like:
:jing -> /home/tpasch/scm/db-toolchain/submodules/jing-trang/build/jing.jar
:prince -> /home/tpasch/scm/db-toolchain/lib/prince-java/lib/prince.jar
com.github.jnr:jffi -> /home/tpasch/.gradle/caches/modules-2/files-2.1/com.github.jnr/jffi/1.2.18/fb54851e631ff91651762587bc3c61a407d328df/jffi-1.2.18-native.jar
com.github.jnr:jnr-constants -> /home/tpasch/.gradle/caches/modules-2/files-2.1/com.github.jnr/jnr-constants/0.9.12/cb3bcb39040951bc78a540a019573eaedfc8fb81/jnr-constants-0.9.12.jar
com.github.jnr:jnr-enxio -> /home/tpasch/.gradle/caches/modules-2/files-2.1/com.github.jnr/jnr-enxio/0.19/c7664aa74f424748b513619d71141a249fb74e3e/jnr-enxio-0.19.jar
After that, it is up to you to do something useful with this Map. In my case I add some --patch-module options to my Java 11 build like this:
val patchModule = listOf(
    "--patch-module", "commons.logging=" +
        spec2File["org.slf4j:jcl-over-slf4j"].toString(),
    "--patch-module", "org.apache.commons.logging=" +
        spec2File["org.slf4j:jcl-over-slf4j"].toString()
)
patchModule.forEach({ it -> println(it) })
tasks {
    withType<JavaCompile> {
        doFirst {
            options.compilerArgs.addAll(listOf(
                "--release", "11",
                "--module-path", classpath.asPath
            ) + patchModule)
            // println("Args for ${name} are ${options.allCompilerArgs}")
        }
    }
}
