Assume I have a multi-project build with the following dependencies:
Project A
Project B depends on A
Project C
Project D depends on C
Now I have a change in project C. When I compile/test in sub-project C, I want the commands to be automatically applied to D subsequently (because D relies on C and I may have broken an API), but not to A and B.
Is there a way in sbt to say "compile this sub-project and all projects that depend on it"?
This would be equivalent to buildDependents in Gradle, or --alsoMakeDependents in Maven, I believe.
If it doesn't exist yet, helpful hints on how to implement such a task are also most welcome.
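As an implementation hint: the heart of such a task is computing, from the declared dependsOn edges, the set of projects that transitively depend on the changed one. Here is a self-contained plain-Scala sketch of that graph inversion (project IDs are plain strings here; wiring the result into an sbt taskKey that then runs compile/test in each dependent is left out):

```scala
// Sketch: given a map project -> direct dependencies, find every
// project that transitively depends on a given root project.
object Dependents {
  def transitiveDependents(deps: Map[String, Set[String]], root: String): Set[String] = {
    // Invert the edges: dependency -> projects that depend on it directly.
    val dependents: Map[String, Set[String]] =
      deps.toSeq
        .flatMap { case (p, ds) => ds.map(d => d -> p) }
        .groupBy(_._1)
        .map { case (dep, edges) => dep -> edges.map(_._2).toSet }
    // Breadth-first walk outward from the changed project.
    def walk(frontier: Set[String], seen: Set[String]): Set[String] =
      if (frontier.isEmpty) seen
      else {
        val next = frontier.flatMap(p => dependents.getOrElse(p, Set.empty[String])) -- seen
        walk(next, seen ++ next)
      }
    walk(Set(root), Set.empty)
  }
}
```

With the build from the question, `transitiveDependents(Map("B" -> Set("A"), "D" -> Set("C")), "C")` yields `Set("D")`: exactly the projects that should be recompiled along with C, while A and B are left alone.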
Related
I have a multi-project build that makes extensive use of sub-projects in order to make dependencies between them explicit (by means of dependsOn - the sub-projects act as layers).
If it comes to publishing I want to roll up all the sub-project artifacts into the artifact of the main project (jar), and collect all their libraryDependencies. In other words, I want to publish as if there would be no sub-projects, just a single root project containing all my code (i.e. no ueber-jar). The sub-projects are only there to break up the compilation into smaller scopes with enforced dependency structure, they are not supposed to be distributed separately.
What is the best way to achieve that?
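One possible approach (a sketch only, assuming sbt 1.x slash syntax; `layerA`/`layerB` are illustrative sub-project names, and conflicts between overlapping mappings are not handled here) is to fold each sub-project's package mappings and `libraryDependencies` into the root project, and skip publishing for the sub-projects themselves:

```scala
// build.sbt sketch ― roll sub-project classes and deps into the root artifact
lazy val layerA = project
  .settings(publish / skip := true)
lazy val layerB = project
  .dependsOn(layerA)
  .settings(publish / skip := true)

lazy val root = (project in file("."))
  .dependsOn(layerB)
  .settings(
    // fold the sub-projects' compiled classes into the root jar
    Compile / packageBin / mappings ++=
      (layerA / Compile / packageBin / mappings).value ++
      (layerB / Compile / packageBin / mappings).value,
    // surface their library dependencies in the root's published POM
    libraryDependencies ++=
      (layerA / libraryDependencies).value ++
      (layerB / libraryDependencies).value
  )
```

This keeps the sub-projects as compile-time layers only: nothing but the root jar and its merged dependency list is published.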
A question about composing sbt projects:
I forked a library project (https://github.com/allenai/pipeline) and wish to hack on it alongside my project that uses it. If I put them in the same multi-project build ― really my preferred option over publishing locally and consuming ― sbt will not read the project dir of the nested fork (where its .scala build files and plugin definitions originally reside).
E.g. I try, in build.sbt:
lazy val pipeline = (project in file("pipeline"))
lazy val githubCruncher = (project in file("."))
.dependsOn(pipeline)
And I get errors indicating that the directory pipeline/project is being ignored; the build.sbt of the nested project is read, but sbt does not find what is defined alongside it in its sibling project directory (and hence fails to load the build definition).
So, must I squash the two projects into a single build.sbt, or can I somehow compose the two elegantly without that? I ask because this has also been a problem on other projects I work on, where I ended up squashing every build-definition detail into one big monolith ― even though I'd prefer to distribute some of the sub-projects as independent projects alongside my own dev configuration.
In my experience, everything inside the project directory of a project being pulled into a multi-project build needs to move to project under the root of the multi-project build, whereas the build.sbt of the pulled-in project does not have to be squashed ― a reference to it from the top-level build.sbt will work (unless that reference is followed by definitions for it right there in the top-level build.sbt).
sbt plugins coming from the pulled-in project then apply to the unified multi-project build, and need to be disabled with disablePlugins for other sub-projects where appropriate or desirable.
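For illustration, the disabling mentioned above looks roughly like this in the top-level build.sbt (a sketch; `sbtassembly.AssemblyPlugin` merely stands in for whatever auto-plugin the pulled-in project happens to contribute):

```scala
// top-level build.sbt sketch
lazy val pipeline = project in file("pipeline")  // its build.sbt is still read

lazy val githubCruncher = (project in file("."))
  .dependsOn(pipeline)
  // opt this sub-project out of a plugin that the pulled-in
  // project's build brought into the unified build
  .disablePlugins(sbtassembly.AssemblyPlugin)
```

The plugin definitions themselves (addSbtPlugin lines, .scala helpers) live in the root project/ directory after the move.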
After having read the chapter on multi-project builds, I've seen this construct:
lazy val util = project
lazy val core = project
I wonder what project is. In the API documentation I can see the Project trait and its companion object, but no entry about project. I've thought it could be a member of trait Build, but it isn't either.
The only thing I know, it returns a Project trait. Where can I find it in the API documentation?
project is a macro. You can find documentation for it in ProjectExtra, but it's not very verbose.
Basically, it derives the project's ID and base directory from the name of the val to which it is assigned.
It may also be used in the form of
lazy val util, core = project
that would give you a quick (and hopefully easy) way to lay out a multi-project build definition.
Moreover, the project macro applies as much to single-project builds as to multi-project ones. It just appears more often in multi-project builds, since there is much more to configure there and build.sbt files alone are usually not enough.
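To make the val-name convention concrete (the second val name is only there to avoid a duplicate definition; the macro would normally produce the first form directly):

```scala
// the macro form:
lazy val util = project
// is roughly equivalent to the explicit form,
// with the ID and base directory taken from the val name:
lazy val utilExplicit = Project("util", file("util"))
```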
All:
I need advice on how to use NuGet to make my project dependencies (libraries) available to other developers who will in turn have my project as a dependency. See the scenario below for details:
I have created a Visual Studio 2013 project (ProjA) in a solution (SolA) which has a dependency on a library (LibA, which I do not commit into source control). I have used NuGet to manage/fetch the dependencies of project ProjA (i.e. library LibA) via a Nuget.Config in the .nuget folder at the SolA solution level, and everything works: developers can check out solution SolA and build/deploy, with NuGet fetching LibA from a local server.
My issue is that I now need to have developers build their project (ProjB) in another solution (SolB), which will import/use ProjA as a dependent project. The problem is that I cannot find a way to make NuGet fetch the dependencies of ProjA (i.e. LibA) when it is built as part of solution SolB. I tried putting the Nuget.Config file at the ProjA level, but the VS build seems to ignore it.
Any ideas?
You seem to be mixing two different but not-very-compatible approaches to code sharing here:
Code-level dependencies
Package-level dependencies
Code-level dependencies between different solutions are generally A Bad Thing, and you should avoid them. A solution should encapsulate and build all the source code it needs to, relying on 'library' DLLs (whether provided as raw DLLs or via NuGet).
I recommend that you re-work your solutions using the 'Package-level dependency' pattern, so that you have a separate 'library' solution which provides a NuGet package (or set of NuGet packages) which the other two solutions can consume:
Here is the current (awkward) dependency graph, with Proj B referencing Proj A across solution boundaries:

    Solution A            Solution B
      Proj A  <-------------  Proj B
Here is what I propose with the separate library solution:

        +----> Solution L <----+
        |                      |
    Solution A            Solution B
Solution A and Solution B thus consume the NuGet packages produced by Solution L (the library solution). This is the dependency relationship that probably underlies your code anyhow, based on what you describe.
I keep seeing sbt projects with the following resolvers setting in their build definition:
resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
It's also described in the official documentation of sbt in Resolvers section.
My limited knowledge of sbt and of build-management tools in general nevertheless leads me to consider it a kind of anti-pattern.
The reason: if a project declares a dependency on a library in the local Maven repository, that library somehow got downloaded there in the first place, so it is available somewhere outside the local Maven repository. If it is, why not use the original repository as the source?
If the dependency is not in a public repository but is another project in the same build, it can be declared with dependsOn, without the additional repository in resolvers.
Please advise, as I may be missing something obvious that makes the resolvers setting indispensable.
One obvious reason would be that one of your dependencies is a local project built with Maven.
One scenario:
You have a project x which you build with sbt.
x depends on y; y is built with Maven.
There is a bug in y which you need to fix/test, and you want to regression-test x before you check the fix in.
You build a snapshot of y, and then you can test x before you commit the change to y.
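Concretely, the workflow looks like this in x's build (a sketch; the coordinates for y are hypothetical). After `mvn install` has put y's snapshot into ~/.m2/repository, sbt picks it up via the local-Maven resolver:

```scala
// build.sbt of x ― resolve y's locally installed Maven snapshot
resolvers += "Local Maven Repository" at
  "file://" + Path.userHome.absolutePath + "/.m2/repository"

// hypothetical coordinates for the Maven-built project y
libraryDependencies += "com.example" % "y" % "1.0-SNAPSHOT"
```

Once the fix to y is committed and published to the shared repository, the resolver line is no longer strictly needed, which is consistent with treating it as a development-time convenience rather than a permanent fixture.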