How can I disable jar compression in sbt using sbt-assembly?

I'm interested in building uncompressed jar files to make my rsync faster when only a few classes change, and so far I can't figure out how to tell sbt-assembly to disable compression.
server > inspect assembly
[info] Task: java.io.File
[info] Description:
[info] Builds a single-file deployable jar.
[info] Provided by:
[info] {file:/.../}server/*:assembly
[info] Dependencies:
[info] server/*:assembly-merge-strategy(for assembly)
[info] server/*:assembly-output-path(for assembly)
[info] server/*:package-options(for assembly)
[info] server/*:assembly-assembled-mappings(for assembly)
[info] server/*:cache-directory
[info] server/*:test(for assembly)
[info] server/*:streams(for assembly)
[info] Delegates:
[info] server/*:assembly
[info] {.}/*:assembly
[info] */*:assembly
...
server > inspect assembly-option(for assembly)
[info] Setting: sbtassembly.AssemblyOption = AssemblyOption(true,true,true,<function1>)
[info] Description:
[info]
[info] Provided by:
[info] {file:/.../}server/*:assembly-option(for assembly)
[info] Dependencies:
[info] server/*:assembly-assemble-artifact(for package-bin)
[info] server/*:assembly-assemble-artifact(for assembly-package-scala)
[info] server/*:assembly-assemble-artifact(for assembly-package-dependency)
[info] server/*:assembly-excluded-files(for assembly)
...
AssemblyOption doesn't say anything about packaging, however, and the plugin seems to use sbt's own Package for that, so maybe there's a way to configure that? Package, in turn, calls IO.jar(...) to write the file. That uses withZipOutput to make a ZipOutputStream (or a JarOutputStream), on which I'd want to call setMethod(ZipOutputStream.STORED), but I can't.
Any ideas other than an sbt feature request?

There is no way to do this directly via sbt configuration, since sbt assumes that any files within zip and jar artifacts should be compressed.
One workaround is to unzip and re-zip (without compression) the jar file. You can do this by adding the following setting to your project (e.g. in build.sbt):
packageBin in Compile <<= packageBin in Compile map { file =>
  println("(Re)packaging with zero compression...")
  import java.io.{FileInputStream, FileOutputStream, ByteArrayOutputStream}
  import java.util.zip.{CRC32, ZipEntry, ZipInputStream, ZipOutputStream}
  val zis = new ZipInputStream(new FileInputStream(file))
  val tmp = new File(file.getAbsolutePath + "_decompressed")
  val zos = new ZipOutputStream(new FileOutputStream(tmp))
  zos.setMethod(ZipOutputStream.STORED)
  Iterator.continually(zis.getNextEntry).
    takeWhile(_ != null).
    foreach { ze =>
      // Read the entry fully into memory: a STORED entry needs its size,
      // compressed size and CRC set before it is written.
      val baos = new ByteArrayOutputStream
      Iterator.continually(zis.read()).
        takeWhile(_ != -1).
        foreach(baos.write)
      val bytes = baos.toByteArray
      ze.setMethod(ZipEntry.STORED)
      ze.setSize(baos.size)
      ze.setCompressedSize(baos.size)
      val crc = new CRC32
      crc.update(bytes)
      ze.setCrc(crc.getValue)
      zos.putNextEntry(ze)
      zos.write(bytes)
      zos.closeEntry()
      zis.closeEntry()
    }
  zos.close()
  zis.close()
  // Replace the original jar with the uncompressed copy.
  tmp.renameTo(file)
  file
}
Now when you run package in sbt, the final jar file will be uncompressed, which you can verify with unzip -vl path/to/package.jar.
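Since the question was specifically about the sbt-assembly jar, the same trick can be wired onto the assembly task instead of (or in addition to) packageBin. A minimal sketch along the lines of the snippet above, assuming sbt-assembly's assembly key is in scope via import AssemblyKeys._:
assembly <<= assembly map { jar =>
  // Same idea as above: rewrite every entry of the assembly jar as STORED.
  import java.io.{FileInputStream, FileOutputStream, ByteArrayOutputStream}
  import java.util.zip.{CRC32, ZipEntry, ZipInputStream, ZipOutputStream}
  val zis = new ZipInputStream(new FileInputStream(jar))
  val tmp = new File(jar.getAbsolutePath + "_decompressed")
  val zos = new ZipOutputStream(new FileOutputStream(tmp))
  zos.setMethod(ZipOutputStream.STORED)
  Iterator.continually(zis.getNextEntry).takeWhile(_ != null).foreach { ze =>
    val baos = new ByteArrayOutputStream
    Iterator.continually(zis.read()).takeWhile(_ != -1).foreach(baos.write)
    val bytes = baos.toByteArray
    ze.setMethod(ZipEntry.STORED)
    ze.setSize(bytes.length)
    ze.setCompressedSize(bytes.length)
    val crc = new CRC32
    crc.update(bytes)
    ze.setCrc(crc.getValue)
    zos.putNextEntry(ze)
    zos.write(bytes)
    zos.closeEntry()
    zis.closeEntry()
  }
  zos.close()
  zis.close()
  tmp.renameTo(jar)
  jar
}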

Related

How to publish webjar assets with publish/publishLocal in Play 2.3?

Since Play Framework 2.3, assets are packaged into a single jar archive file. I would like to publish this jar automatically with the project, i.e. upon publish or publishLocal I want the assets jar to be published as well.
How can I achieve that?
After inspect tree dist I managed to find the task playPackageAssets that generates the assets file:
[play-publish-webjar] $ inspect playPackageAssets
[info] Task: java.io.File
[info] Description:
[info]
[info] Provided by:
[info] {file:/Users/jacek/sandbox/play-publish-webjar/}root/*:playPackageAssets
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:641
[info] Dependencies:
[info] *:playPackageAssets::packageConfiguration
[info] *:playPackageAssets::streams
[info] Reverse dependencies:
[info] *:scriptClasspath
[info] universal:mappings
[info] Delegates:
[info] *:playPackageAssets
[info] {.}/*:playPackageAssets
[info] */*:playPackageAssets
A naive solution might be to attach the assets webjar, as generated by playPackageAssets, to the publishLocal task's artifacts. Add the following to build.sbt (the type annotations are there to show what you're working with):
import play.PlayImport.PlayKeys._

packagedArtifacts in publishLocal := {
  val artifacts: Map[sbt.Artifact, java.io.File] = (packagedArtifacts in publishLocal).value
  val assets: java.io.File = (playPackageAssets in Compile).value
  artifacts + (Artifact(moduleName.value, "asset", "jar", "assets") -> assets)
}
Repeat this for any other tasks that should exhibit similar behaviour.
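For example, a minimal sketch of the same wiring for publish, mirroring the publishLocal snippet above:
packagedArtifacts in publish := {
  val artifacts: Map[sbt.Artifact, java.io.File] = (packagedArtifacts in publish).value
  val assets: java.io.File = (playPackageAssets in Compile).value
  artifacts + (Artifact(moduleName.value, "asset", "jar", "assets") -> assets)
}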
However, I'm quite doubtful it's the best solution.

SBT Config extend vs DefaultSettings

If I define an SBT config with
val MyConfig = config("my") extend Test
is that basically the same as doing
val MyConfig = config("my")
val mySettings = inConfig(MyConfig)(Defaults.testSettings)
and then importing mySettings inside a build definition?
No, calling the extend method is not the same thing as calling inConfig. extend just returns a new configuration with the passed-in configurations prepended to its extendsConfigs list, and it will not introduce any new settings.
When you add MyConfig into the project, it becomes part of the scoped key resolution path:
val MyConfig = config("my") extend Test
val root = (project in file(".")).
  configs(MyConfig)
Suppose you type my:test in the sbt shell. Since the test task is not defined in the my configuration, sbt will traverse extendsConfigs and check whether the task is available in any of them. The first one it hits is Test, since we prepended it. You can check this by running inspect my:test:
root> inspect my:test
[info] Task: Unit
[info] Description:
[info] Executes all tests.
[info] Provided by:
[info] {file:/Users/eugene/work/quick-test/sbt-so/}root/test:test
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:365
[info] Delegates:
[info] my:test
[info] test:test
[info] runtime:test
[info] compile:test
[info] *:test
[info] {.}/my:test
[info] {.}/test:test
[info] {.}/runtime:test
[info] {.}/compile:test
[info] {.}/*:test
[info] */my:test
[info] */test:test
[info] */runtime:test
[info] */compile:test
[info] */*:test
[info] Related:
[info] test:test
"Provided by" says it delegated to root/test:test. This mechanism allows you to share some of the settings but override others, but you still have to know the inner wiring of the settings scoped to tasks etc, so it's tricky business. You probably already know, but I'll link to Additional test configurations, which specifically discusses configurations for testing.

Developing task to synchronize managed dependencies with another directory in SBT?

I'm trying to write a task which retrieves the dependency jars and puts them in a single directory at the project root. Here's what I have so far:
retrieveManaged := true
val libDir = settingKey[File]("the directory to retrieve dependency libraries to")
lazy val getLibs = taskKey[Unit]("retrieves all dependency libraries to the libDir")
libDir := baseDirectory.value / "libs"
getLibs := {
  val a = update.value
  sbt.IO.delete(libDir.value)
  for (src <- (managedDirectory.value ** "*.jar").get) {
    sbt.IO.copyFile(src, libDir.value / src.getName, true)
  }
}
I use retrieveManaged := true so that sbt will copy the dependencies to managedDirectory. I then copy those libraries to a directory which I have defined. My questions are:
I want my task to depend on the update task, to ensure that the dependencies have been copied to managedDirectory first. If I understand How can I call another task from my SBT task?, I should be able to do this by calling update.value. But this doesn't seem to work.
Instead of copying the files, I'd really like to "synchronize". This means that only newly added files should be copied, and any files which no longer exist should be removed. How can I do this?
Update
Thanks to Jacek's suggestion, I was able to come up with the following solution, which addresses #1. I still need to figure out how to do #2 (synchronize instead of copy).
getLibs := {
  sbt.IO.delete(libDir.value)
  val depFiles = update.value.matching((s: String) => Set("compile", "runtime") contains s)
  for (depFile <- depFiles) {
    sbt.IO.copyFile(depFile, libDir.value / depFile.getName, true)
  }
}
I think the answer is to really use the value of the update task and parse its result. That would sort out issues 1 and 2. If it does not, attach the symptoms to your question.
All val a = update.value does is execute the update task and assign its value to a. Since you don't use a at all, you lose all the information update gives you. To see what it contains, execute show update in the shell.
[triggeredby]> help update
Resolves and optionally retrieves dependencies, producing a report.
[triggeredby]> show update
[info] Updating {file:/Users/jacek/sandbox/so/triggeredby/}triggeredby...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Update report:
[info] Resolve time: 2152 ms, Download time: 114 ms, Download size: 0 bytes
[info] compile:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] runtime:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] test:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] provided:
[info] optional:
[info] compile-internal:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] runtime-internal:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] test-internal:
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] plugin:
[info] sources:
[info] docs:
[info] pom:
[info] scala-tool:
[info] org.scala-lang:scala-compiler:2.10.3: (Artifact(scala-compiler,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-compiler.jar)
[info] org.scala-lang:scala-library:2.10.3 (): (Artifact(scala-library,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-library.jar)
[info] org.scala-lang:scala-reflect:2.10.3 (): (Artifact(scala-reflect,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/scala-reflect.jar)
[info] org.scala-lang:jline:2.10.3: (Artifact(jline,jar,jar,None,List(),None,Map()),/Users/jacek/.sbt/boot/scala-2.10.3/lib/jline.jar)
[info] org.fusesource.jansi:jansi:1.4: (Artifact(jansi,jar,jar,None,ArraySeq(master),None,Map()),/Users/jacek/.ivy2/cache/org.fusesource.jansi/jansi/jars/jansi-1.4.jar)
[success] Total time: 3 s, completed Mar 14, 2014 12:16:26 AM
I think it's pretty much what you need - use the return value to fit your needs.
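For the second question (synchronize rather than copy everything), a minimal sketch that stays with sbt.IO helpers; the staleness check by size and timestamp is an assumption you may want to tighten:
getLibs := {
  val libs = libDir.value
  sbt.IO.createDirectory(libs)
  val depFiles = update.value.matching((s: String) => Set("compile", "runtime") contains s)
  val wanted = depFiles.map(f => f.getName -> f).toMap
  // remove jars that are no longer dependencies
  sbt.IO.listFiles(libs).filterNot(f => wanted.contains(f.getName)).foreach(f => sbt.IO.delete(f))
  // copy only new or changed jars
  for ((name, src) <- wanted) {
    val dest = libs / name
    if (!dest.exists || dest.length != src.length || dest.lastModified < src.lastModified)
      sbt.IO.copyFile(src, dest, true)
  }
}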

"publish" to local maven repo creates only scaladoc jars?

I'm using sbt 0.13.0 and Scala 2.10.3
I wanted to publish an artifact to my local maven repo so I added the following to build.sbt:
publishMavenStyle := true
publishTo := Some(Resolver.file("file", new File(Path.userHome.absolutePath+"/.m2/repository")))
artifactName := {
  (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
    artifact.name + "-" + module.revision + "." + artifact.extension
}
When I run the publish command, jars are created in my home .m2 directory, but they all have the same content, meaning they only contain scaladocs: html, css and js files.
This is the second time I'm publishing an artifact in this project. Last time it worked perfectly. Most of the classes have changed their packages since then; can this be the reason, and how do I fix it?
Remove artifactName and it should work fine again.
Why do you redefine it (as it now breaks publish)? What's the initial goal?
[sbt-0-13-1]> about
[info] This is sbt 0.13.1
[info] The current project is {file:/Users/jacek/sandbox/so/sbt-0.13.1/}sbt-0-13-1 0.1-SNAPSHOT
[info] The current project is built against Scala 2.10.3
[info] Available Plugins: com.typesafe.sbt.SbtGit, com.typesafe.sbt.SbtProguard, growl.GrowlingTests, np.Plugin, com.timushev.sbt.updates.UpdatesPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.3
[sbt-0-13-1]> help artifactName
Function that produces the artifact name from its definition.
[sbt-0-13-1]> inspect artifactName
[info] Setting: scala.Function3[sbt.ScalaVersion, sbt.ModuleID, sbt.Artifact, java.lang.String] = <function3>
[info] Description:
[info] Function that produces the artifact name from its definition.
[info] Provided by:
[info] */*:artifactName
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:533
[info] Reverse dependencies:
[info] *:makePom::artifactPath
[info] Delegates:
[info] *:artifactName
[info] {.}/*:artifactName
[info] */*:artifactName
[info] Related:
[info] */*:artifactName
See how the default implementation is defined in sbt.Artifact.
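One likely explanation for the symptom: the custom artifactName above drops the artifact's classifier, so the binary, sources and javadoc jars can all resolve to the same file name and overwrite one another, which would leave only the scaladoc content. If the goal was just to tweak the name, a minimal sketch that keeps the classifier (my assumption about the intent, not sbt's default implementation):
artifactName := {
  (sv: ScalaVersion, module: ModuleID, artifact: Artifact) =>
    val classifier = artifact.classifier.map("-" + _).getOrElse("")
    artifact.name + "-" + module.revision + classifier + "." + artifact.extension
}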

Accessing managedClasspath of sbt subprojects

I'm converting an sbt 0.7.x build script to sbt 0.11.2. I'm writing a task to collect various JARs together from subprojects. In the old build, part of the task does the following:
deployedProjects.foreach { p: BasicScalaProject =>
  p.managedClasspath(config("compile")) --- p.managedClasspath(config("provided"))
  // etc
}
How can I do the equivalent in sbt 0.11?
Updated to add:
In particular:
How can I write a task that depends on a list of settings/tasks? For example, how would I write a task that depends on all the managedClasspaths from a List of subprojects (without bundling it all into a tuple)?
Is there a particular scope for getting the managed jars that are or are not marked as "provided"?
In sbt 0.11.x there is the task managedClasspath:
> inspect managed-classpath
[info] Task: scala.collection.Seq[sbt.Attributed[java.io.File]]
[info] Description:
[info] The classpath consisting of external, managed library dependencies.
[info] Provided by:
[info] {file:/Users/heiko/tmp/test/}default-f3fb6c/compile:managed-classpath
[info] Dependencies:
[info] compile:classpath-configuration
[info] compile:classpath-types
[info] compile:update
[info] Reverse dependencies:
[info] compile:external-dependency-classpath
[info] Delegates:
[info] compile:managed-classpath
[info] *:managed-classpath
[info] {.}/compile:managed-classpath
[info] {.}/*:managed-classpath
[info] */compile:managed-classpath
[info] */*:managed-classpath
[info] Related:
[info] test:managed-classpath
[info] runtime:managed-classpath
Looking at the delegates you see that you can scope this task to various configurations, e.g. compile:
> show compile:managed-classpath
[info] Updating {file:/Users/heiko/tmp/test/}default-f3fb6c...
[info] Resolving org.scala-lang#scala-library;2.9.1 ...
[info] Done updating.
[info] ArraySeq(Attributed(/Users/heiko/.sbt/boot/scala-2.9.1/lib/scala-library.jar))
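To come back to the "compile minus provided" part of the question, a minimal sketch that derives the same set from the update report; collectJars is a hypothetical key, and in sbt 0.11 this belongs in a .scala build definition rather than a .sbt file (vals in .sbt files only arrived in 0.13):
val collectJars = TaskKey[Seq[File]]("collect-jars")

// Add to a project's settings.
collectJars <<= update map { report =>
  def jars(conf: String): Set[File] =
    report.configuration(conf).toSeq.flatMap(_.modules.flatMap(_.artifacts.map(_._2))).toSet
  // roughly the old managedClasspath(config("compile")) --- managedClasspath(config("provided"))
  (jars("compile") -- jars("provided")).toSeq
}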
