Docker push fails with fatal error msg="" - sbt

Using the sbt-docker plugin, I execute sbt dockerBuildAndPush and see the following output after assembly:
...
[info] Pushing docker image with name: 'myorg/myrepo'
[info] The push refers to a repository [myorg/myrepo] (len: 1)
[info] Sending image list
[info] Pushing repository myorg/myrepo (1 tags)
[info] 511136ea3c5a: Pushing
[info] 511136ea3c5a: Image already pushed, skipping
[info] 19df420c532f: Pushing
[info] 19df420c532f: Image already pushed, skipping
... 14 more of these message pairs ...
[info] 53531ebeee8d: Pushing
[info] 53531ebeee8d: Buffering to disk
[info] time="2015-04-08T20:50:50+10:00" level="fatal" msg=""
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.process.BasicIO$Streamed$.scala$sys$process$BasicIO$Streamed$$next$1(BasicIO.scala:48)
...
build.sbt includes:
docker <<= (docker dependsOn assembly)

dockerfile in docker := {
  val artifact = (assemblyOutputPath in assembly).value
  val artifactTargetPath = "/app/server.jar"

  new Dockerfile {
    from("java:8")
    maintainer("MyOrg", "dev#myorg.com")
    workDir("/app")
    run("mkdir", "-p", "/app/data")
    run("chown", "daemon", "/app/data")
    user("daemon")
    add(artifact, artifactTargetPath)
    entryPoint("java", "-Xmx8g", "-jar", artifactTargetPath)
    expose(8080)
  }
}

imageNames in docker := Seq(
  ImageName("myorg/myrepo")
)
Why is it failing?

How to pass arguments to InputTask without entering sbt interactive mode?

With the following sample SBT build file, I can pass arguments to my InputTask from within the SBT interactive mode, but not from the command line. Is there a way?
Sample build.sbt:
import complete.DefaultParsers._

lazy val sampleDoSomething = inputKey[Unit]("Will print arguments.")

lazy val commonSettings = Seq(
  organization := "com.example",
  version := "0.1.0-SNAPSHOT"
)

lazy val taskInputTaskProject = (project in file(".")).
  settings(commonSettings: _*).
  settings(
    sampleDoSomething := {
      println("Arguments: ")
      val args = spaceDelimited("<arg>").parsed
      args foreach println
    }
  )
Successfully invoking task from within SBT Interactive mode:
$ sbt
[info] Set current project to taskInputTaskProject (in build file:/study/sbt/input-tasks/)
> sampleDoSomething a b c
Arguments:
a
b
c
[success] Total time: 0 s, completed Mar 22, 2016 1:06:58 PM
Successfully invoking task from the command line without arguments:
$ sbt sampleDoSomething
[info] Set current project to taskInputTaskProject (in build file:/study/sbt/input-tasks/)
Arguments:
[success] Total time: 0 s, completed Mar 22, 2016 1:06:18 PM
Failure to invoke task from command line with arguments:
$ sbt sampleDoSomething a b c
[info] Set current project to taskInputTaskProject (in build file:/study/sbt/input-tasks/)
Arguments:
[success] Total time: 0 s, completed Mar 22, 2016 1:06:44 PM
[error] Not a valid command: a
[error] Expected 'all'
[error] Not a valid project ID: a
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: a
[error] a
[error] ^
Quote the task invocation so sbt treats it as a single command:
sbt "sampleDoSomething a b c"
See the docs on batch mode: http://www.scala-sbt.org/0.13/docs/Running.html#Batch+mode
Cheers

ivyPaths in globalSettings for a plugin

object BlaBlaPlugin extends AutoPlugin {
  object autoImport {
    lazy val blabla = settingKey[Unit]("")
  }
  import autoImport._

  override lazy val globalSettings = Seq(
    blabla := println(ivyPaths.value.ivyHome.get.getPath)
  )
}
I get:
[error] Reference to undefined setting:
[error]
[error] */*:ivyPaths from */*:blabla ((BlaBlaPlugin) BlaBlaPlugin.scala:11)
Isn't ivyPaths defined at a Global scope?
I can access it in .sbt/global.sbt but not in the globalSettings of a plugin.
Cheers
There are several issues with the plugin.
projectSettings
Isn't ivyPaths defined at a Global scope?
No it's not.
> inspect ivyPaths
[info] Setting: sbt.IvyPaths = sbt.IvyPaths#13e5338f
[info] Description:
[info] Configures paths used by Ivy for dependency management.
[info] Provided by:
[info] {file:/Users/xxx/foo/}root/*:ivyPaths
[info] Defined at:
[info] (sbt.Classpaths) Defaults.scala:1128
[info] Dependencies:
[info] *:appConfiguration
[info] *:baseDirectory
[info] Reverse dependencies:
[info] *:cleanCacheIvyDirectory
[info] *:ivyConfiguration
[info] *:blabla
[info] Delegates:
[info] *:ivyPaths
[info] {.}/*:ivyPaths
[info] */*:ivyPaths
This shows that ivyPaths is provided as {file:/Users/xxx/foo/}root/*:ivyPaths, i.e. scoped to the current project, not globally. If you're in the mood for Star Wars coding (use the source, Luke), Defaults.scala is usually a good place to start.
So the first thing to change would be to use projectSettings.
override requires
Also note that sbt's built-in settings and tasks are loaded using auto plugins, so if you depend on them you need to make sure your plugin is loaded after them. To do that, override the requires method as follows:
override def requires: Plugins = sbt.plugins.IvyPlugin
putting it all together
import sbt._
import Keys._

object BlaBlaPlugin extends AutoPlugin {
  override def requires: Plugins = sbt.plugins.IvyPlugin

  object autoImport {
    lazy val blabla = settingKey[Unit]("")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    blabla := println(ivyPaths.value.ivyHome.get.getPath)
  )
}

generate resources from an AutoPlugin in sbt

I have created the following plugin, like so many plugins before it...
/**
 * This plugin automatically generates a version number based on the configured
 * minor version and today's date and time.
 */
object DateVersionPlugin extends AutoPlugin {
  //override def trigger = allRequirements

  def dateFormat(fmt: String) =
    new java.text.SimpleDateFormat(fmt).format(
      new java.util.Date()
    )

  def versionNumber(majorVersion: String,
                    versionTrimFront: Int,
                    versionDateFormat: String) =
    "%s.%s".format(
      majorVersion, dateFormat(versionDateFormat).substring(versionTrimFront)
    )

  /**
   * Defines all settings/tasks that get automatically imported
   * when the plugin is enabled.
   */
  object autoImport {
    /**
     * The number of characters to trim off the front of the date string.
     *
     * This is used to achieve a date string which doesn't include the
     * present millennium. The century, a stretch, can be imagined as
     * conceivable - but few civilizations have lasted multiple millennia.
     */
    lazy val versionTrimFront = settingKey[Int]("Number of characters to remove from front of date")

    /**
     * The format to use for generating the date part of this version number.
     */
    lazy val versionDateFormat = settingKey[String]("The date format to use for versions")

    /**
     * The major version to place at the front of the version number.
     */
    lazy val versionMajor = settingKey[String]("The major version number, default 0")

    /**
     * The filename of the generated resource.
     */
    lazy val versionFilename = settingKey[String]("The filename of the file to generate")

    /**
     * The name of the property to place in the version number.
     */
    lazy val versionPropertyName = settingKey[String]("The name of the property to store as version")

    /**
     * Generate a version.conf configuration file.
     *
     * This task generates a configuration file of the name specified in the
     * settings key.
     */
    lazy val generateVersionConf = taskKey[Seq[File]]("Generates a version.conf file.")
  }
  import autoImport._

  /**
   * Provide default settings.
   */
  override def projectSettings: Seq[Setting[_]] = Seq(
    versionFilename := "version.conf",
    versionPropertyName := "version",
    versionDateFormat := "YY.D.HHmmss",
    versionTrimFront := 0,
    versionMajor := "0",
    (version in Global) := versionNumber(versionMajor.value,
      versionTrimFront.value, versionDateFormat.value),
    generateVersionConf <<=
      (resourceManaged in Compile, version, versionFilename, versionPropertyName, streams) map {
        (dir, v, filename, propertyName, s) =>
          val file = dir / filename
          val contents = propertyName + " = \"" + v.split("-").head + "\""
          s.log.info("Writing " + contents + " to " + file)
          IO.write(file, contents)
          Seq(file)
      },
    resourceGenerators in Compile += generateVersionConf.taskValue
  )
}
The generate-version-conf task behaves as desired, generating the file I'm looking for, and the version setting is updated as expected in projects that use this plugin. Yet the following do not happen, and I'm not clear why:
The config file is not generated by compile.
The config file is not packaged in the jar by package.
The config file is not in the classpath when the run task is used.
Note I have also tried a dozen or so variations on this, and I have further tried:
resourceGenerators in Compile <+= generateVersionConf
Which as I understand it should result in more or less the same behavior.
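As a sanity check, the date-based version computation itself can be exercised outside sbt. The sketch below mirrors the plugin's dateFormat/versionNumber helpers as plain Scala (the object name VersionSketch is illustrative, not part of the build):

```scala
import java.text.SimpleDateFormat
import java.util.Date

object VersionSketch {
  // Format today's date with the given SimpleDateFormat pattern.
  def dateFormat(fmt: String): String =
    new SimpleDateFormat(fmt).format(new Date())

  // Prepend the major version, trimming characters off the front
  // of the formatted date (e.g. to drop the millennium digit).
  def versionNumber(major: String, trimFront: Int, fmt: String): String =
    "%s.%s".format(major, dateFormat(fmt).substring(trimFront))
}
```

For example, versionNumber("0", 0, "yy") yields something like "0.16" - a major version followed by the formatted, trimmed date.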
Inspecting the runtime attributes of this, I see some of the settings are applied successfully:
> inspect version
[info] Setting: java.lang.String = 0.15.338.160117
[info] Description:
[info] The version/revision of the current module.
[info] Provided by:
[info] */*:version
[info] Defined at:
[info] (com.quantcast.sbt.version.DateVersionPlugin) DateVersionPlugin.scala:101
[info] Reverse dependencies:
[info] *:isSnapshot
[info] *:generateVersionConf
[info] *:projectId
[info] Delegates:
[info] *:version
[info] {.}/*:version
[info] */*:version
[info] Related:
[info] */*:version
However, this is not true for compile:resourceGenerators, which shows that it still maintains the defaults.
> inspect compile:resourceGenerators
[info] Setting: scala.collection.Seq[sbt.Task[scala.collection.Seq[java.io.File]]] = List(Task(_))
[info] Description:
[info] List of tasks that generate resources.
[info] Provided by:
[info] {file:/home/scott/code/quantcast/play/sbt-date-version/sbt-test/}root/compile:resourceGenerators
[info] Defined at:
[info] (sbt.Defaults) Defaults.scala:207
[info] (sbt.Defaults) Defaults.scala:208
[info] Dependencies:
[info] compile:discoveredSbtPlugins
[info] compile:resourceManaged
[info] Reverse dependencies:
[info] compile:managedResources
[info] Delegates:
[info] compile:resourceGenerators
[info] *:resourceGenerators
[info] {.}/compile:resourceGenerators
[info] {.}/*:resourceGenerators
[info] */compile:resourceGenerators
[info] */*:resourceGenerators
[info] Related:
[info] test:resourceGenerators
My question is (now that I've continued to research this more): what could be keeping my changes to resourceGenerators in Compile from being applied?
It turns out this plugin needs to require the JvmPlugin, because the JvmPlugin defines the setting dependencies. Without it, the compile:resourceGenerators setting is overwritten by the defaults, which redefine the set of resource generators to Nil and build from there.
So the solution is to include the following line in the AutoPlugin definition.
override def requires = plugins.JvmPlugin
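For illustration, a minimal sketch of the plugin header with that requirement in place (assuming sbt 0.13.x auto plugins; the autoImport and settings bodies are unchanged from the code above):

```scala
import sbt._
import Keys._

object DateVersionPlugin extends AutoPlugin {
  // Load after the JvmPlugin so its defaults (including the
  // resourceGenerators wiring) are in place before ours.
  override def requires = plugins.JvmPlugin

  // ... autoImport and projectSettings as before ...
}
```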

Why does publishing plugin project fail with RuntimeException: Repository for publishing is not specified?

I am trying to publish an SBT plugin to a repository. I'm not sure if this has any relevance, but our plugin loads the sbt-twirl plugin - Googling around, it seems like publishConfiguration might be overridden:
new PublishConfiguration(None, "dotM2", arts, Seq(), level)
When I run the publish task, artifacts are deployed to the repo, but the sbt task then fails:
sbt (my-sbt-plugin)> publish
[info] Loading global plugins from ...
...
[info] Done packaging.
[info] published sbt-my-sbt-plugin to http://my.repo.com/.../sbt-my-sbt-plugin-0.1-SNAPSHOT.jar
java.lang.RuntimeException: Repository for publishing is not specified.
.... stack trace here ....
[error] (my-sbt-plugin/*:publishConfiguration) Repository for publishing is not specified.
What is causing the error, and what could I do to stop the publishing from failing?
** Update ** Here is inspect publish
sbt (my-sbt-plugin)> inspect publish
[info] Task: Unit
[info] Description:
[info] Publishes artifacts to a repository.
[info] Provided by:
[info] {file:/path/to/my-sbt-plugin/}my-sbt-plugin/*:publish
[info] Defined at:
[info] (sbt.Classpaths) Defaults.scala:988
[info] Dependencies:
[info] my-sbt-plugin/*:ivyModule
[info] my-sbt-plugin/*:publishConfiguration
[info] my-sbt-plugin/*:publish::streams
[info] Delegates:
[info] my-sbt-plugin/*:publish
[info] {.}/*:publish
[info] */*:publish
[info] Related:
[info] plugin/*:publish
Here's how I've configured publishing (with some of the plugin settings, excluding libraryDependencies and one or two other settings):
lazy val plugin = project
  .settings(publishSbtPlugin: _*)
  .settings(
    name := "my-sbt-plugin",
    sbtPlugin := true,
    addSbtPlugin("com.typesafe.sbt" % "sbt-twirl" % "1.0.2")
  )

def publishSbtPlugin = Seq(
  publishMavenStyle := true,
  publishTo := {
    val myrepo = "http://myrepo.tld/"
    if (isSnapshot.value) Some("The Realm" at myrepo + "snapshots")
    else Some("The Realm" at myrepo + "releases")
  },
  credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
)
tl;dr Don't use lazy val plugin = project to define a project (for reasons yet unknown).
After a few comments it turned out that the issue was the name of the project, plugin, as defined using lazy val plugin = project. It seems that the name is somehow reserved. Change the project's name to anything other than plugin and start over.
Specifying a project name other than "plugin" resolved the issue. I simplified the build definition a bit by removing a redundant build.sbt in one of the projects and am just using a full build definition in the project directory. The root project that hosts the multi-project build is also configured for no publishing:
lazy val root =
  Project("sbt-my-plugin-root", file("."))
    .settings(noPublishing: _*)
    .aggregate(sbtMyPluginModule)

lazy val sbtMyPluginModule =
  Project("sbt-my-plugin-module", file("sbt-my-plugin-module"))
    .settings(publishSbtPlugin: _*)
    .settings(
      name := "sbt-my-plugin-module",
      organization := "com.my.org",
      sbtPlugin := true
    )

lazy val noPublishing = seq(
  publish := (),
  publishLocal := ()
)

lazy val publishSbtPlugin = Seq(
  publishMavenStyle := true,
  publishArtifact in Test := false,
  publishTo := {
    val myrepo = "http://myrepo.tld/"
    if (isSnapshot.value) Some("The Realm" at myrepo + "snapshots")
    else Some("The Realm" at myrepo + "releases")
  },
  credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")
)
If you are trying this locally, use publishLocal (not publish):
sbt clean compile publish-local

Why does inConfig(conf)(settings) not pick some settings?

I thought that inConfig(conf)(settings) would copy all settings into the given configuration. But this doesn't seem to do what I would expect.
Given a configuration:
lazy val Monkjack: Configuration = config("monkjack")
Then I do:
inConfig(Monkjack)(Defaults.compileSettings)
So I can do compile as I would expect:
sbt clean monkjack:compile
[info] Compiling 17 Scala sources to ...
[success] Total time: 9 s, completed 01-Sep-2014 09:40:41
So now I want to adjust the scalac options when using this new config (the actual options are irrelevant; this one is just useful because it has verbose output, so it's easy to see whether it's being used):
scalacOptions in Monkjack := Seq("-Yshow-syms")
When I monkjack:compile, I don't see this option being triggered. It's like the above line wasn't added. But if I also add the following lines, it works!
sources in Monkjack := (sources in Compile).value
sourceDirectory in Monkjack := (sourceDirectory in Compile).value
So why do I need the final two lines, and what is inConfig actually doing if it's not doing what I expect? As an additional oddity, when I do the above, although it works, I get two compile phases, one going to target/classes and one going to target/monkjack-classes.
Edit (inspect without the sources/sourceDirectory settings)
> inspect tree monkjack:compile
[info] monkjack:compile = Task[sbt.inc.Analysis]
[info] +-monkjack:compile::compileInputs = Task[sbt.Compiler$Inputs]
[info] | +-*:compilers = Task[sbt.Compiler$Compilers]
[info] | +-monkjack:sources = Task[scala.collection.Seq[java.io.File]]
[info] | +-*/*:maxErrors = 100
[info] | +-monkjack:incCompileSetup = Task[sbt.Compiler$IncSetup]
[info] | +-monkjack:compile::streams = Task[sbt.std.TaskStreams[sbt.Init$ScopedKey[_ <: Any]]]
[info] | | +-*/*:streamsManager = Task[sbt.std.Streams[sbt.Init$ScopedKey[_ <: Any]]]
[info] | |
[info] | +-*/*:sourcePositionMappers = Task[scala.collection.Seq[scala.Function1[xsbti.Position, scala.Option[xsbti.Position]]]]
[info] | +-monkjack:dependencyClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-monkjack:classDirectory = target/scala-2.11/monkjack-classes
[info] | +-monkjack:scalacOptions = Task[scala.collection.Seq[java.lang.String]]
[info] | +-*:javacOptions = Task[scala.collection.Seq[java.lang.String]]
[info] | +-*/*:compileOrder = Mixed
[info] |
[info] +-monkjack:compile::streams = Task[sbt.std.TaskStreams[sbt.Init$ScopedKey[_ <: Any]]]
[info] +-*/*:streamsManager = Task[sbt.std.Streams[sbt.Init$ScopedKey[_ <: Any]]]
[info]
tl;dr No sources for a new configuration means no compilation and hence no use of scalacOptions.
From When to define your own configuration:
If your plugin introduces either a new set of source code or its own library dependencies, only then you want your own configuration.
inConfig does the (re)mapping only so all the keys are initialised for a given scope - in this case the monkjack configuration.
In other words, inConfig computes values for the settings in a new scope.
The settings of most influence here are sourceDirectory and sourceManaged, which are set in Defaults.sourceConfigPaths as follows:
lazy val sourceConfigPaths = Seq(
  sourceDirectory <<= configSrcSub(sourceDirectory),
  sourceManaged <<= configSrcSub(sourceManaged),
  ...
)
configSrcSub gives the answer (reformatted slightly to ease reading):
def configSrcSub(key: SettingKey[File]): Initialize[File] =
  (key in ThisScope.copy(config = Global), configuration) { (src, conf) =>
    src / nameForSrc(conf.name)
  }
That leads to the answer: if you moved your sources to src/monkjack/scala, it would work fine. That's described in Scoping by configuration axis:
A configuration defines a flavor of build, potentially with its own classpath, sources, generated packages, etc. (...)
By default, all the keys associated with compiling, packaging, and running are scoped to a configuration and therefore may work differently in each configuration.
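Putting that together, a minimal sketch of wiring up a custom configuration with its own sources (assuming sbt 0.13.x and the Monkjack config from the question; the scalac option is just the verbose one used above):

```scala
lazy val Monkjack: Configuration = config("monkjack")

lazy val root = (project in file("."))
  // Register the configuration so sbt resolves keys in it.
  .configs(Monkjack)
  // Initialise all compile-related keys in the monkjack scope;
  // sourceDirectory is derived per-config, giving src/monkjack/scala.
  .settings(inConfig(Monkjack)(Defaults.compileSettings): _*)
  .settings(
    scalacOptions in Monkjack := Seq("-Yshow-syms")
  )
```

With sources placed under src/monkjack/scala, monkjack:compile then picks up both the sources and the config-scoped scalacOptions, with no need to copy sources in Compile.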
