There are two questions:
1. I defined publishTo with a resolver "abc" that is not in the external ivysettings.xml. When I publish, sbt complains that resolver "abc" is undefined.
2. I defined an artifact to be published, a zipped package, and the corresponding settings are as follows:
val ZIP = Configurations.config("app")
val artifact = SettingKey[Artifact]("artifact")
val pack = TaskKey[File]("pack")
val settings =
  Seq(artifact := Artifact(name.value, "zip", "zip", Some("app"), List(ZIP), None)) ++
    addArtifact(artifact, pack).settings
It works well when dependencies are managed by sbt itself, but fails completely (the local publish simply ignores this artifact) when they are managed by Ivy. How can I get past these problems?
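For the first question, the setup presumably looks something like this rough sketch (the repository URL is a placeholder):

// build.sbt - hypothetical reconstruction of the failing setup
externalIvySettings() // resolution and publication are driven by ivysettings.xml

// publishing goes through the resolver *name*, which Ivy looks up in
// ivysettings.xml; since "abc" is not declared there, publish fails with
// "resolver abc is undefined" - declaring it in the XML resolves this
publishTo := Some("abc" at "https://nexus.example.com/releases")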
It seems the custom artifact settings fail only when they are imported from an auto plugin. Is this a bug, or am I missing something?
Thanks to this post:
How can a default task be overridden from an AutoPlugin?
The artifact settings must be introduced after sbt's own plugins' settings are applied; otherwise they get overwritten.
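A minimal sketch of the resulting auto plugin (the key definitions are from the question; the plugin name and wiring are assumptions): requiring JvmPlugin guarantees that projectSettings are appended after sbt's defaults instead of being overwritten by them.

import sbt._
import Keys._
import sbt.plugins.JvmPlugin

object ZipArtifactPlugin extends AutoPlugin {
  // loading after JvmPlugin puts these settings *after* sbt's defaults
  // in the settings sequence, so the defaults cannot clobber them
  override def requires = JvmPlugin
  override def trigger = allRequirements

  val ZIP = Configurations.config("app")
  val artifact = SettingKey[Artifact]("artifact")
  val pack = TaskKey[File]("pack") // implementation assumed to exist elsewhere

  override def projectSettings =
    Seq(artifact := Artifact(name.value, "zip", "zip", Some("app"), List(ZIP), None)) ++
      addArtifact(artifact, pack).settings
}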
I have an sbt project where docker:publishLocal will create a docker image on my machine for testing, and docker:publish will publish the image to a repository and also publish jar files from the build to a repository.
If my project is a snapshot, I would like to disable publishing to the repositories, while still being able to build the local image.
ThisBuild / publishArtifact := !isSnapshot.value
does the right thing for the publish command, but it also disables publishLocal.
I want to write something like
if (isSnapshot.value) {
  publish := { }
}
but that gives me an error that I do not understand at all:
[info] Loading project definition from /Users/dev/project
/Users/dev/build.sbt:1: error: type mismatch;
found : Unit
required: sbt.internal.DslEntry
if (isSnapshot.value) {
^
Past experience dictates that redefining publish to conditionally call the original version won't work, as
publish := {
  if (!isSnapshot.value) publish.value
}
gives warnings that the task is always evaluated.
Is there a way to do this?
The problem with this code is that it evaluates publish.value regardless of the if structure. I recommend reading the documentation on task dependencies. If you want to "delay" the evaluation of a task in one of the if branches, you need to use a dynamic task definition:
publish := Def.taskDyn {
  if (isSnapshot.value)
    Def.task {} // doing nothing
  else
    Def.task { publish.value } // could be written as just publish
}.value
But apart from fixing your code, you should be aware that there is a special setting for the functionality you want; it's called skip:
publish/skip := isSnapshot.value
Another thing to notice is the scoping. If you want to override docker:publish, which is the same as Docker/publish in the new syntax, you should add this Docker/ scope prefix to every mention of publish in the code above.
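For example (a sketch; note that whether a plugin-defined task honors skip depends on its implementation, so the taskDyn form is the safer bet for Docker/publish):

// the built-in jar publishing honors skip:
publish / skip := isSnapshot.value

// for the image, redefine Docker/publish in terms of its previous value:
Docker / publish := Def.taskDyn {
  if (isSnapshot.value)
    Def.task {} // snapshots: do nothing
  else
    Def.task { (Docker / publish).value } // releases: run the original task
}.value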
I have a multi-project build in SBT where some projects should aggregate dependencies and contain no code, so that clients can depend on these projects as a single dependency instead of depending directly on all of their aggregated dependencies. With Maven, this is a common pattern, e.g. when using Spring Boot.
In SBT, I figured I can suppress the generation of the empty artifacts by adding this setting to these projects:
packagedArtifacts := Classpaths.packaged(Seq(makePom)).value
However, the makePom task writes <packaging>jar</packaging> in the generated POM. But now that there is no JAR anymore, this should read <packaging>pom</packaging> instead.
How can I do this?
This question is a bit old, but I just came across the same issue and found a solution. The original answer does point to the right page where this info can be found, but here is an example. It uses the pomPostProcess setting to transform the generated POM right before it is written to disk. Essentially, we loop over all the XML nodes, looking for the element we care about and then rewrite it.
import scala.xml.{Node => XmlNode, NodeSeq => XmlNodeSeq, _}
import scala.xml.transform._

pomPostProcess := { node: XmlNode =>
  val rule = new RewriteRule {
    override def transform(n: XmlNode): XmlNodeSeq = n match {
      // replace <packaging>jar</packaging> with <packaging>pom</packaging>
      case e: Elem if e.label == "packaging" =>
        <packaging>pom</packaging>
      case _ => n
    }
  }
  new RuleTransformer(rule).transform(node).head
}
Maybe you could modify the resulting POM as described here: Modifying the generated POM
You can disable publishing of the default artifacts (JAR, sources, and docs) and then opt in explicitly to publishing the POM. sbt then produces and publishes only the POM, with <packaging>pom</packaging>.
// This project has no sources; I want <packaging>pom</packaging> with dependencies
lazy val bundle = project
  .dependsOn(moduleA, moduleB)
  .settings(
    publishArtifact := false,           // disable jar, sources, docs
    publishArtifact in makePom := true  // but do publish the POM
  )
lazy val moduleA = project
lazy val moduleB = project
lazy val moduleC = project
Run sbt bundle/publishM2 to verify the POM in ~/.m2/repository.
I dare say this is almost intuitive, a rare moment of pleasant surprise with sbt 😅
I confirmed this with current sbt 1.3.9 and with 1.0.1, the oldest launcher I happen to have installed on my machine.
The Artifacts page in the reference docs may be helpful, perhaps this trick should be added there.
We have an internal Nexus repository that we use to publish artifacts to and also to cache external dependencies (from Maven Central, Typesafe, etc.)
I want to add the repository as a resolver in my SBT build, under the following restrictions:
The settings need to be part of the build declaration (either .sbt or .scala, but not in the "global" sbt settings)
If a dependency exists in the local repository, it should be taken from there. I don't want to have to access the network to get all the dependencies for every build.
If a dependency doesn't exist locally, sbt should first try to get it from the Nexus repository before trying the external repositories.
I saw several similar questions here, but didn't find any solution that does exactly this. Specifically, the code I currently have is:
externalResolvers ~= { rs => nexusResolver +: rs }
But when I run show externalResolvers, the Nexus repo appears before the local one.
So far, I've come up with the following solution:
externalResolvers ~= { rs =>
  val grouped = rs.groupBy(_.isInstanceOf[FileRepository])
  val fileRepos = grouped(true)
  val remoteRepos = grouped(false)
  fileRepos ++ (nexusResolver +: remoteRepos)
}
It works, but is kinda dirty... If anyone has a "cleaner" solution, I'd love to hear it.
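For what it's worth, a slightly cleaner sketch of the same idea: partition preserves the original order within each group and, unlike the groupBy lookups above, does not throw when one of the groups is empty.

externalResolvers ~= { rs =>
  // local file repositories first, then Nexus ahead of the other remotes
  val (fileRepos, remoteRepos) = rs.partition(_.isInstanceOf[FileRepository])
  fileRepos ++ (nexusResolver +: remoteRepos)
}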
EDIT:
Since I put up the bounty, I thought I should restate the question
How can an SBT project P, with two sub-projects A and B, set up B to have a plugin dependency on A, which is an SBT plugin?
Giving P a plugin dependency on A does not work, since A depends on other things in P, which results in a circular dependency graph
It has to be a plugin dependency, for A is a plugin needed to run B's test suite.
dependsOn doesn't work, because, well, it has to be a plugin dependency
I'd like to know either of
How to do this, or
Why this is impossible, and what the next best alternatives are.
EDIT: clarified that it's a plugin-dependency, since build-dependency is ambiguous
When you have a multi-project build configuration with "project P and two sub-projects A and B" it boils down to the following configuration:
build.sbt
lazy val A, B = project
As per design, "If a project is not defined for the root directory in the build, sbt creates a default one that aggregates all other projects in the build." It means that you will have an implicit root project, say P (but the name is arbitrary):
[plugin-project-and-another]> projects
[info] In file:/Users/jacek/sandbox/so/plugin-project-and-another/
[info] A
[info] B
[info] * plugin-project-and-another
That gives us the expected project structure. On to defining plugin dependency between B and A.
The only way to define a plugin in an SBT project is to use the project directory, which is the plugins project's build definition - "A plugin definition is a project in <main-project>/project/." It means that the only way to define a plugin dependency on the project A is the following:
project/plugins.sbt
addSbtPlugin("org.example" % "example-plugin" % "1.0")
lazy val plugins = project in file(".") dependsOn(file("../A"))
In this build configuration, the plugins project depends on another SBT project - our A - which is in turn a plugin project.
A/build.sbt
// http://www.scala-sbt.org/release/docs/Extending/Plugins.html#example-plugin
sbtPlugin := true
name := "example-plugin"
organization := "org.example"
version := "1.0"
A/MyPlugin.scala
import sbt._

object MyPlugin extends Plugin {
  // configuration points, like the built-in `version`, `libraryDependencies`, or `compile`
  // by implementing Plugin, these are automatically imported in a user's `build.sbt`
  val newTask = taskKey[Unit]("A new task.")
  val newSetting = settingKey[String]("A new setting.")

  // a group of settings ready to be added to a Project
  val newSettings = Seq(
    newSetting := "Hello from plugin",
    newTask := println(newSetting.value)
  )

  // alternatively, by overriding `settings`, they could be automatically added to a Project
  // override val settings = Seq(...)
}
The two files - build.sbt and MyPlugin.scala in the directory A - make up the plugin project.
The only missing piece is to define the plugin A's settings for the project B.
B/build.sbt
MyPlugin.newSettings
That's pretty much all you can do in SBT. If you want a multi-project build configuration with a plugin dependency between (sub)projects, you don't have much choice other than what's described above.
With that said, let's see if the plugin from the project A is accessible.
[plugin-project-and-another]> newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:31 AM
[plugin-project-and-another]> B/newTask
Hello from plugin
[success] Total time: 0 s, completed Feb 13, 2014 2:29:36 AM
[plugin-project-and-another]> A/newTask
[error] No such setting/task
[error] A/newTask
[error] ^
As you may have noticed, newTask (which comes from the plugin in project A) is available in the (default) root project and in project B, but not in A.
As Jacek said, it cannot be done as I would like, since a subproject cannot use an SBT plugin that the root project does not have. On the other hand, this discussion on the mailing list contains several alternatives, and will no doubt be useful to anyone who comes across this question in the future.
EDIT: Well, in the end the alternatives mentioned (sbt scripted, etc.) were hard and clunky to use. My final solution was to have a separate project (not a subproject) inside the repo that depends on the original project via its Ivy coordinates, and to use bash to publishLocal the first project, go into the second project, and run its tests:
sbt publishLocal; cd test; sbt test; cd ..
I always thought the point of something like SBT was to avoid doing this kind of bash gymnastics, but desperate times call for desperate measures...
This answer may include the solution: https://stackoverflow.com/a/12754868/3189923
From that link, in short: set exportJars := true and, to obtain the jar file paths for a (sub)project, use exportedProducts in Compile.
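A small sketch of how that might be used (the showExportedJars key is made up for illustration):

// build.sbt
exportJars := true // export the packaged jar instead of a classes directory

val showExportedJars = taskKey[Unit]("prints the exported jar paths")
showExportedJars := {
  // with exportJars := true, exportedProducts contains the jar file(s)
  (exportedProducts in Compile).value.foreach(jar => println(jar.data))
}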
Leaving the facts about plugins aside: you have a parent project P with sub-projects A and B, and you state that A depends on P. But P is an aggregate of A and B and hence depends on A, so you already have a circular dependency between A and P. This can never work.
You have to split P into two parts: the part that A depends on (let's call it A') and the rest (let's call it P_rest). Then you throw away P and make a new project P_rest consisting of A', A, and B, where A depends on A'.
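In build-definition terms, the restructuring could look roughly like this (all names are placeholders):

// A' holds only the code that the plugin A needs, so the cycle
// A -> P -> A disappears; P_rest is the new aggregating root
lazy val aPrime = project in file("a-prime") // A'
lazy val A = (project in file("A")).dependsOn(aPrime)
lazy val B = project in file("B")
lazy val pRest = (project in file(".")).aggregate(aPrime, A, B) // P_rest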
In my SBT build, I'm fetching a zip dependency (previously built with the sbt-native-packager plugin), published in my local Ivy repo with a bundle classifier.
But I need the dependency path in the Ivy repo, in order to unzip it (with IO.unzip), put some files in it and repackage it with sbt-native-packager.
I'm using the artifacts(...) method to find the artifact and add it as a dependency:
"foo" % "bar" % "1.0-SNAPSHOT" artifacts(Artifact("bar-bundle", "zip", "zip", "bundle"))
But after that, I'm a bit lost...
I tried to filter the dependencyClasspath to find it:
val bundleFile = taskKey[File]("bundle's path")
val settings = Seq(bundleFile <<= dependencyClasspath map { _ filter (_.endsWith(".zip"))})
Trouble is: I can't find the zip dependency on any classpath...
What am I doing wrong?
I'm using sbt 0.13.
Zip files aren't on the classpath by default. The types of artifacts that are included are configured by classpathTypes. You can add "zip" to it with:
classpathTypes += "zip"
It will then appear on dependencyClasspath.
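With the zip on the classpath, the filtering idea from the question can then work along these lines (a sketch in sbt 0.13 syntax, reusing the bundleFile key defined above):

classpathTypes += "zip"

bundleFile := {
  // dependencyClasspath yields Attributed[File]s; .data unwraps the file
  val zips = (dependencyClasspath in Compile).value.map(_.data)
  zips.find(_.getName.endsWith(".zip")) getOrElse sys.error("bundle zip not found")
}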
However, if it isn't really supposed to go on the classpath, you might pull it out of the update report directly.
bundleFile := {
  val report: UpdateReport = update.value
  val filter = artifactFilter(name = "bar-bundle", extension = "zip")
  val all: Seq[File] = report.matching(filter)
  all.headOption getOrElse sys.error("Could not find bar-bundle")
}
See the documentation on UpdateReport for details.