How to avoid rebuilding the resolvers in sbt 0.13.x?

We are using maker to build a very large Scala project. It takes about 3 minutes to compile an 18-layer project (roughly 30-40 modules).
I was interested in comparing the performance with a more recent sbt, so I created a quick-and-dirty build file for sbt 0.12.4. Despite sbt using parallel compilation, it took 10 minutes to compile the same project from clean (not counting Ivy download time).
The console output seemed to be preoccupied with resolving the dependencies, all of which were already in my Ivy cache. I stumbled upon Why sbt runs dependency resolution every time after clean?, which provides a hack that at least speeds up the second build to 3.5 minutes. However, that hack does not work on 0.13.x.
What is the equivalent hack in 0.13?
Is there anything else that can be done to speed up an sbt compile?
Although this is not the actual project, assume for all intents and purposes that my build script looks something like build.scala (this file was the template I used; I added the hack around line 54).

Here you go; add this. It works for me on 0.13.1:
cleanKeepFiles ++= Seq("resolution-cache", "streams").map(target.value / _)
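For background: clean deletes everything under target/, including the resolution-cache and streams directories, which is what forces sbt to re-resolve every dependency on the next build; cleanKeepFiles tells clean to leave those two alone. A minimal sketch of where the setting might live in a full Build.scala (the project name here is hypothetical):

import sbt._
import Keys._

object MyBuild extends Build {
  lazy val root = Project("root", file(".")).settings(
    // Keep Ivy's resolution cache and sbt's task metadata across `clean`,
    // so `clean; compile` does not repeat dependency resolution.
    cleanKeepFiles ++= Seq("resolution-cache", "streams").map(target.value / _)
  )
}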

Related

Flow takes very long to start up because it checks node_modules

I added Flow to my React/Redux/Webpack project and initially it was great; I loved the type annotations. Over time, I noticed that the start-up time of the project became really slow, around 120 seconds (it used to be around 20 seconds). Upon investigation, I realized this was because Flow was scanning through all the JS files in node_modules.
I attempted to ignore node_modules in .flowconfig by adding:
[ignore]
.*node_modules/.*
Start-up became fast again, but Flow would complain Required module not found in places where I import external libraries from my code.
A workaround suggested in this GitHub issue was to flowignore node_modules and manually add interfaces for external libraries. This seems to work, but it is a hassle to maintain as new libraries are added to the project.
It's frustrating to have to wait almost 2 minutes each time I start the project. Are there any better ideas?
One way to avoid the Required module not found errors is to run flow-typed install, which fetches existing libdefs for popular libraries from the flow-typed repository and also generates stubs for libraries it cannot find there. This works well for many projects, but in some rare cases the stubs for certain libraries, such as Immutable.js, were not generated.
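For example, from the root of the repository (assuming flow-typed is installed globally through npm, which is the usual way to get it):
$ npm install -g flow-typed
$ flow-typed install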
I eventually came up with a CLI tool, flow-scripts, to automatically generate the libdef stub interfaces so that I could flowignore node_modules without getting the Required module not found errors.
Simply run
$ flow-scripts stub
in the root of the repository and the libdef stubs will be automatically generated in the flow-typed directory.

Enforce Javadoc version in SBT

I just started working on a large project that uses SBT for building. I got a new computer with Java 8 installed, but the rest of the team still uses Java 7. That's not a problem as far as the code goes, because we're all set up to generate Java 7 bytecode.
The problems arise when attempting to publish the project using the publishLocal action. Please keep in mind, though, that I am very new to SBT and some things I say or assume may not be accurate.
We use sbt 0.13. When I run the command sbt publishLocal, it runs the doc action, which in turn runs javadoc to generate the documentation. Since I have Java 1.8 installed, it uses the corresponding version of javadoc, which, let's be honest, is a real pain in the neck: it complains about every single missing @return or @param, about self-closing elements (e.g. "<p/>"), and so on, and returns a non-zero exit value because of this, thus making the publication fail. However, as I mentioned, the project is fairly big, and although it would be better to complete the Javadoc documentation, that isn't feasible at the moment.
Luckily, javadoc 8 provides an option to disable the pedantry: -Xdoclint:none makes it quiet about pretty much everything, allowing me to run the publish action by adding it to javacOptions.
However, as I said, the other team members are still using Java 7, and unfortunately javadoc 7 does not support that option, so if I push the build.sbt file with this option, the build will fail on other machines.
So now I'm wondering what I can do. The way I see it, there are a number of options, none of which seems simple enough to fix that stupid problem:
downgrade locally to java 7 (not a big fan of having two concurrent versions lying around)
have all other team members upgrade to java 8 (pain for them)
fix all javadoc problems in the whole project (pain for everyone)
Hopefully there's another option I'm missing that would allow me, for instance, to set the javadoc options based on the Java version? Or anything else that doesn't require touching anything other than the build.sbt file...
Thanks!
David
Yes, you can set the Javadoc options based on the Java version:
javacOptions in Compile ++=
  sys.props("java.version").split('.') match {
    case Array("1", n, _*) if n.toInt <= 7 =>
      Seq() // javadoc 7 and older: -Xdoclint does not exist, so pass nothing
    case _ =>
      Seq("-Xdoclint:none") // javadoc 8+: silence the doclint pedantry
  }
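If you would rather silence only Javadoc generation and leave compilation untouched, the same match can be scoped to the doc task. This variant is an addition of mine, not part of the answer above; sbt delegates from the narrower doc scope back to Compile when the narrower key is unset:

javacOptions in (Compile, doc) ++=
  sys.props("java.version").split('.') match {
    // Same version check as above, but only the doc task sees the flag.
    case Array("1", n, _*) if n.toInt <= 7 => Seq()
    case _ => Seq("-Xdoclint:none")
  }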

SBT code generation and fork

I'm trying to generate code for Slick in an SBT build, as in the example.
However, if I have the setting
fork := true
in the project, the build fails with:
java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
I want to keep that option to prevent memory leaks in my unit tests. If I understand correctly, there is no scala-reflect.jar loaded in the forked JVM, but I have no idea how to load it.
I had wrongly assumed that SBT always loads all Scala libraries by default, which turns out to be incorrect for a forked run. If I add the scala-reflect dependency explicitly, as Seth suggested, it works fine.
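For reference, the explicit dependency is the standard one-liner in build.sbt, pinning scala-reflect to the project's Scala version:

// Make scala-reflect an ordinary dependency so it lands on the forked JVM's classpath.
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value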

Creating ant build script to build only when a dependency was updated

I just started working with Ant a few days ago. Right now I have a general buildall.xml which calls each project's build.xml. Because some projects depend on each other, I need to rebuild the other projects that depend on a changed one. This isn't a problem; I'm just setting the depends attribute of the target. However, Ant always rebuilds the dependencies, even when the files haven't changed.
Let's say project1 has no dependencies; project2 depends on project1; project3 depends on projects 1 and 2; project4 depends on projects 1, 2, and 3; and so on.
I could hack together a solution that looks at project K and uses uptodate to check whether projects 1 .. K have updated files; if so, run the target. But this is messy and seems unnecessary.
What is the cleanest way to implement this?
EDIT: I decided to just hack in a bunch of targets, check_projectK, where each does uptodate checks on all of its source files, its own build file, and the build files of projects 1 .. K-1. Thanks to the dependency ordering, this is always handled correctly. However, it is still a large amount of copy-and-paste for a large workspace, so I will leave this open.
Short answer: Ant can't do it, not unless you have some way to connect to your version control system and check whether anything has changed (you are using source control, right?). Ant doesn't know when a file was last changed, so it can't compare that against what was last built; it has no concept of a dependency repository. The whole purpose of Ant is that it just builds.
The solution to your problem isn't Ant, it's Maven. Maven HAS a dependency repository. There's also a very nifty Maven plugin used specifically with Flex, appropriately called FlexMojos. By using it, Maven can know when something was last built, because the artifact is uploaded to the repository; your other projects can then declare it as a dependency and download the SWC they need.
On top of that, it works great with a continuous integration engine like Hudson, Bamboo, or TeamCity, which builds a project every time a file is committed to your source control system and then updates all dependent projects automatically!

Compiling Flex Modules - Speed up link-checking?

I'm working on a Flex project that has 28 modules and 1 main SWF. Compiling everything takes 18 minutes total. I'm using load-externs to load a link-report from my main SWF, and that works great: the file size of the modules is minimal. But link-checking still occurs for each individual module, increasing the compile time dramatically.
So, say I edit a file in my main SWF that a module uses: mxmlc essentially performs link-checking for the same file in that module, even with -incremental=true. When I edit a file that ALL of the modules reference (which happens frequently), ALL of the modules perform link-checking. This has the effect of basically compiling the main SWF 28 times.
This is frustrating, because link-checking already occurs when I compile my first SWF; it should not have to recur for every module. I tried using fcsh, hoping it would keep these links in memory, but that had no effect.
Maybe it would help to compile a SWC of my main SWF, and use that for link-checking instead?
Here are the commands I use to build:
mxmlc -link-report=report.xml -strict=true -debug=false -optimize=true -incremental=true Project.mxml
mxmlc -load-externs=report.xml -strict=true -debug=false -optimize=true -incremental=true ModuleXX.mxml # 28 times
I haven't found a solution for this problem and it's hindering the development of my project. Any help would be greatly appreciated.
Thank you!
Jimmie
I agree that 18 minutes is a very long build time, but this kind of module-size optimization should only be necessary for a release build. So my simple recommendations are:
Don't use compiler optimization during development time
Only recompile the modules you are working on
Upgrade your hardware
Upgrade your software (the Flex 4 compiler should be faster than Flex 3)
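For example, a development-time variant of the commands from the question might look like this (same mxmlc flags as above, just flipped for speed; how much -optimize=false saves will depend on the project):
mxmlc -link-report=report.xml -strict=true -debug=true -optimize=false -incremental=true Project.mxml
mxmlc -load-externs=report.xml -strict=true -debug=true -optimize=false -incremental=true ModuleXX.mxml # only for the modules being worked on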
