What's the project keyword in build definitions? - sbt

After having read the chapter on multi-project builds, I've seen this construct:
lazy val util = project
lazy val core = project
I wonder what project is. In the API documentation I can see the Project trait and its companion object, but there is no entry for project. I thought it could be a member of the Build trait, but it isn't there either.
The only thing I know is that it returns a Project. Where can I find it in the API documentation?

project is a macro. You can find its documentation in ProjectExtra, but it's not very verbose.
Basically, it takes the project's id and base directory from the name of the val it is assigned to.
It may also be used in the form of
lazy val util, core = project
that would give you a quick (and hopefully easy) way to lay out a multi-project build definition.
Moreover, the project macro applies as much to single-project builds as to multi-project ones. It just shows up more often in multi-project builds, since there is much more to configure there and a build.sbt alone is usually not enough.
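To make the name derivation concrete, here is a rough sketch of what the macro amounts to; the explicit Project(...) form is shown commented out for comparison only:
// The macro derives the project id and base directory from the val name, so
lazy val util = project
// is roughly equivalent to writing the explicit form by hand:
// lazy val util = Project("util", file("util"))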

Related

Is it possible to manage a hierarchical product structure in SBT which has more than just one level?

We have a multi-module project consisting of two modules, modA and modB.
modA depends on modB.
modB in turn depends on a list of libraries (libA and libB) for which we also have the source code. These sources have already been adapted by us.
Finally, libA and libB are independent of each other, but both depend on a third library, libC.
What I want is a setup where the three libraries (which in principle form a multi-module SBT project of their own) can just be "included" in the top-level project.
The point here is also that these libraries can be re-used for other projects, too, so the changed sources should not belong to this super project only.
Currently I have tried to solve this by including the libraries as Git submodules.
Unfortunately, SBT does not (seem to) support hierarchical submodules, so I cannot really just have a second, also multi-module, SBT build for all the libraries that simply gets included in the "super-super" project.
This current setup is clearly not the SBT way.
What is the intended method of solving this?
Adapting the libraries separately and re-using them just as JAR files in the super project is possible, but clumsy, because the consuming project(s) are the main reason for hacking the libraries in the first place, so it would be nice if this worked smoothly.
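For illustration only, the dependency structure described above sketched as a single sbt build; the val names simply reuse the module names and the directories are assumed to match:
lazy val libC = project
lazy val libA = project.dependsOn(libC)
lazy val libB = project.dependsOn(libC)
lazy val modB = project.dependsOn(libA, libB)
lazy val modA = project.dependsOn(modB)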

Can sbt dynamically use a generated subproject's build.sbt

My main project needs to generate a sub-project with its own build.sbt and then compile and use that sub-project. If the sub-project were pre-generated, I could reference it from the main build.sbt with RootProject. But if the sub-project is not yet generated, any attempt to use a value like lazy val sub = RootProject(subBaseDir) fails.
Is it possible to load a sub-project that does not exist at the moment sbt starts, so that some tasks of the main project can depend on the sub-project?
Yes, it should be possible in sbt 0.13.13 using its new feature called "synthetic subprojects". You won't have a build.sbt file for such synthetic projects, since they're, well, completely synthetic. But otherwise they're fully functional projects, and you should be able to set up dependencies between tasks of the current project and its derived projects.
Disclaimer: I haven't tried this new feature myself yet.
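For what it's worth, a minimal, untested sketch of how that feature is described: an AutoPlugin in the main build's project/ directory overrides extraProjects to contribute a subproject programmatically. The plugin name, project id, and settings below are made up:
import sbt._
import Keys._

// Hypothetical plugin, e.g. in project/GeneratedProjects.scala of the main build.
object GeneratedProjects extends AutoPlugin {
  override def trigger = allRequirements

  // Contribute a synthetic subproject; it has no build.sbt of its own.
  override def extraProjects: Seq[Project] = Seq(
    Project("generated", file("generated"))
      .settings(scalaVersion := "2.11.8")
  )
}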

How do you compose sbt projects into a single build?

A question about composing sbt projects:
I forked a library project (https://github.com/allenai/pipeline) and wish to hack on it alongside my project that uses it. If I put them in the same multi-project build (really my preferred option over publishing locally and consuming), sbt will not read the project dir of the nested fork (where .scala files and plugin definitions originally reside).
E.g. I try, in build.sbt:
lazy val pipeline = (project in file("pipeline"))
lazy val githubCruncher = (project in file("."))
.dependsOn(pipeline)
And I get errors indicating that the directory pipeline/project is being ignored; the build.sbt of the nested project is read, but sbt does not find what is defined alongside it in its sibling project directory (and hence fails to load the build definition).
So, must I squash the two projects into a single build.sbt, or can I somehow elegantly compose the two without that? I am asking because this has also been a problem on other projects I work on, where I ended up squashing every build definition minutia into one big monolith, even though I'd prefer distributing some of the sub-projects as independent projects, separate from my own dev configuration.
In my experience, everything inside the project directory of a project being pulled into a multi-project build needs to move to project under the root of the multi-project build, whereas the build.sbt of the pulled-in project does not have to be squashed: a reference to it from the top-level build.sbt will work (unless that reference is followed by definitions for it right there in the top-level build.sbt).
sbt plugins coming from the pulled-in project then apply to the unified multi-project build, and need to be disabled with disablePlugins on other sub-projects where appropriate or desirable.
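For illustration, a hedged sketch of what the top-level build.sbt might look like after that move; PipelineReleasePlugin is a made-up stand-in for whatever auto-plugin the pulled-in (now relocated) project directory contributes:
lazy val pipeline = (project in file("pipeline"))   // pipeline/build.sbt is still read

lazy val githubCruncher = (project in file("."))
  .dependsOn(pipeline)
  .disablePlugins(PipelineReleasePlugin)            // hypothetical plugin; disable it where it doesn't apply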

What is the reason to add Local Maven Repository to sbt?

I keep seeing sbt projects with the following resolvers setting in their build definition:
resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
It's also described in the official documentation of sbt in Resolvers section.
My limited knowledge of sbt and build-management tools in general leads me, however, to consider it a kind of anti-pattern.
The reason is that if a project declares a dependency on a library in the local Maven repository, that library somehow got there in the first place, so it is available somewhere outside the local Maven repository. If it is, why not use the original repository as the source?
If the dependency is not in a public repository but is itself a project, the depending project can use dependsOn to declare it without the additional repository in resolvers.
Please advise, as I may be missing something obvious that makes this resolvers setting indispensable.
One obvious reason would be if one of your dependencies is a local project built with Maven.
One scenario:
You have a project x, which you build with sbt.
x depends on y; y is built with Maven.
There is a bug in y that you need to fix and test, and you want to regression-test x before you check it in.
You build a snapshot of y with Maven, and then you can test x against it before you commit the change to y.
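A minimal sketch of the sbt side of that scenario, assuming the snapshot of y was installed into ~/.m2 with mvn install; the coordinates of y are made up:
resolvers += "Local Maven Repository" at "file://" + Path.userHome.absolutePath + "/.m2/repository"

// Hypothetical coordinates of the locally built Maven snapshot of y.
libraryDependencies += "com.example" % "y" % "1.0.1-SNAPSHOT"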

RootProject and ProjectRef

I have been trying to find more information on RootProject and ProjectRef, but it looks like they are not mentioned at all in the sbt documentation.
I understand that you should use RootProject when referencing a root project and ProjectRef when referencing a sub-project. However, it is not clear how their behavior differs. Can somebody please explain?
Also, does the fact that they are not documented mean that RootProject and ProjectRef are not the recommended way to reference other sbt projects?
Thanks.
A single sbt build has a single project/ directory for .scala build definitions and plugin definitions. There can be multiple subprojects within that build with their own .sbt files, but not their own project/*.scala files.
When you want to include other, separate builds directly instead of using their published binaries, you use "source dependencies". This is what RootProject and ProjectRef declare. ProjectRef is the most general: you specify the location of the build (a URI) and the ID of the project in the build (a String) that you want to depend on. RootProject is a convenience that selects the root project for the build at the URI you specify.
Source dependencies do have overhead: startup time, memory usage, and command-line usability. If the group of projects doesn't need to be split into separate builds, it is best to use a single build with standard subprojects.
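For illustration, a hedged sketch of both forms in a build.sbt; the path ../other-build and the id otherSub are made up:
// RootProject selects the root project of the build at the given location.
lazy val otherRoot = RootProject(file("../other-build"))

// ProjectRef selects a specific project within that build by its id.
lazy val otherSub = ProjectRef(file("../other-build"), "otherSub")

lazy val root = (project in file("."))
  .dependsOn(otherRoot, otherSub)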
