I just started to work on a large project that uses SBT for building. I got a new computer with Java 8 installed, but the rest of the team still uses Java 7. That's not a problem as far as the code goes because we're all set to generate v7 byte-code.
The problems arise when attempting to publish the project using the publishLocal action. Please keep in mind, though, that I am very new to SBT and some things I say/assume may not be accurate.
We use sbt 0.13. When I run the command sbt publishLocal, it runs the doc action, which in turn runs javadoc to generate the documentation. Since I have Java 1.8 installed, it uses the corresponding version of javadoc, which, let's be honest, is a real pain in the neck: it complains about every single missing @return or @param, about self-closing elements (e.g. "<p/>") and such, and returns a non-zero value because of this, thus making the publication fail. However, as I mentioned, the project is fairly big and, although it would be better to complete the javadoc documentation, it's not feasible at the moment.
Luckily, javadoc 8 provides an option to disable the pedantry: -Xdoclint:none will make it quiet about pretty much anything, allowing me to run the publish action by adding it to the javacOptions.
However, as I said, the other team members are still using java 7, and, unfortunately, javadoc 7 does not support that option, so if I push the build.sbt file with this option it will fail on other machines.
So now I'm wondering what I can do. The way I see it, there are a number of options, none of which seems "simple enough" to fix that stupid problem:
downgrade locally to java 7 (not a big fan of having two concurrent versions lying around)
have all other team members upgrade to java 8 (pain for them)
fix all javadoc problems in the whole project (pain for everyone)
Hopefully there's another option I'm missing that would allow me, for instance, to set the javadoc options based on the Java version? Or anything else that doesn't require touching anything other than the build.sbt file...
Thanks!
David
Yes, you can set the Javadoc options based on the Java version:
javacOptions in Compile ++= {
  sys.props("java.version").split('.') match {
    case Array("1", n, _*) if n.toInt <= 7 =>
      Seq()
    case _ =>
      Seq("-Xdoclint:none")
  }
}
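(If you want the flag to apply only when the docs are generated, and not during regular compilation, I believe you can scope the same setting to the doc task instead, i.e. javacOptions in (Compile, doc) ++= ..., and leave the plain javacOptions untouched.)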
If I add the com.google.truth.extensions:truth-proto-extension:1.1 jar to my bazel workspace, it seems to totally nuke the classes from com.google.guava:guava:28.2-jre, resulting in errors like
import static com.google.common.collect.ImmutableMap.toImmutableMap;
^
symbol: static toImmutableMap
location: class ImmutableMap
java/com/google/fhir/protogen/ProtoGenerator.java:316: error: cannot find symbol
.collect(toImmutableMap(def -> def.getId().getValue(), def -> def));
^
symbol: method toImmutableMap((def)->def[...]lue(),(def)->def)
location: class ProtoGenerator
Your documentation says
One warning: Truth depends on the “Android” version of Guava, a subset of the “JRE” version.
If your project uses the JRE version, be aware that your build system might select the Android version instead.
If so, you may see “missing symbol” errors.
The easiest fix is usually to add a direct dependency on the newest JRE version of Guava.
Does this mean anything other than the maven dep on com.google.guava:guava:28.2-jre? If not, what's the next easiest fix?
The key word here is "newest": You'll need to depend on (as of this writing) 30.1-jre. I have edited the docs to emphasize this.
(You can see the newest version in various locations, including: Maven Central, Maven Central Search, the Guava GitHub page.)
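If you're using rules_jvm_external, that usually just means listing the -jre artifact explicitly in maven_install so that the "pick the newest" resolution selects it. A minimal sketch, with an illustrative artifact list rather than your actual WORKSPACE:
# WORKSPACE (sketch)
load("@rules_jvm_external//:defs.bzl", "maven_install")

maven_install(
    artifacts = [
        "com.google.truth.extensions:truth-proto-extension:1.1",
        # Direct dependency on the newest -jre Guava, so it outranks 30.0-android:
        "com.google.guava:guava:30.1-jre",
    ],
    repositories = ["https://repo1.maven.org/maven2"],
)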
The problem is:
Some tools (including Gradle as well as the maven_install rule from Bazel's rules_jvm_external) pick the "newest" version of any given artifact among all versions found in your transitive dependencies.
Truth 1.1 depends on version 30.0-android.
30.0-android is considered to be "newer" than 28.2-jre (because 30 is greater than 28).
The -android releases lack the Java 8 APIs.
(So you can actually fix this by depending on any -jre version from 30.0-jre up: 30.0-jre is considered "newer" than 30.0-android because of alphabetical order. Fun!)
Unfortunately, the Maven ecosystem doesn't support a good way to offer 2 "flavors" of every release (JRE+Android). (People often suggest the Maven "classifier," but that does not actually solve the problem.)
For the future:
Gradle: Gradle is working with us to provide its own solution, but it's not quite ready yet.
Maven: Maven is unlikely to provide help. (It doesn't even try to pick the "newest" version, let alone support "flavors.")
Bazel: I don't know if rules_jvm_external (which uses Coursier) has any plans to support "flavors." (Editorializing a bit: In an ideal world, I would rather specify all my repo's transitive dependencies and their versions myself, rather than having the build system try to work it out for me. That can help avoid surprises like this one. But that brings its own challenges, and we've made only incremental effort toward addressing them in our own Bazel-based projects.)
Hello, we are using the SBJson library (pod 'SBJson', '~> 2.2.3') and want to migrate to the latest version (pod 'SBJson5', '~> 5.0.0').
What will the migration effort be? I mean, will it still support the old methods, or do I need to find the deprecated or changed methods, or is there some other specific change?
If anyone has used this library or has done a migration with SBJson, please post to this thread.
Please suggest a solution.
I don't know how complicated your usage is, but you will definitely have to do some manual edits. I released version 3, with a major shift in focus towards streaming JSON, way back in 2011, and I'm frankly amazed version 2.x.x even compiles today!
I tried to document the changes in the repo's NEWS file. When naming the releases I tried to follow SemVer, which means that the 2.x.x -> 3.x.x, 3.x.x -> 4.x.x and 4.x.x -> 5.x.x transitions were all major version bumps because they each broke backwards compatibility in some way.
For version 3 the API went through a big change to focus on streaming. Much of the API changed to support this. Version 4 further removed old methods that had been obsoleted in the 3.1.x, 3.2.x and 3.3.x series. For 4 -> 5 the breaking change was very small: the API stayed the same, and only the behaviour is slightly different. ("Naked scalars", e.g. strings and numbers not wrapped in an array or object, are now accepted, as per the updated JSON RFC.)
Have you considered whether you actually need to use SBJson? You may be better off just using NSJSONSerialization. After all, it's been in the iOS SDK since iOS 5, and in the Mac SDK since a bit after that. The main benefit of using SBJson is if you want to be able to start parsing JSON before it's fully downloaded (thus improving perceived latency).
Other issues you may run into: I have fixed a lot of bugs since the 2.x.x releases, which means the parser is much stricter now than it was back then. (Due to bugs it used to allow all sorts of broken JSON, and broken UTF-8.) So if you rely on any of that behaviour you will be out of luck. Hopefully that's not the case :-)
So allegedly, the configuration tool for Qt went through some changes that were necessary to enable more streamlined Qt builds, a.k.a. "Qt Lite". However, there doesn't seem to be any documentation about how to use that feature, or at least I can't find any, and judging by the comments on the release announcement, others can't either.
What's more, the changes are definitely in there, judging by the fact that the configuration I've been using for the last couple of years now fails in a bunch of ways. I am not sure how up to date the built-in help is; the last time I tried using it for guidance, it turned out to be largely outdated and contained options that were no longer supported.
So it would be nice if someone could shed some light on what has changed and how, and on how to configure for "Lite" builds. Especially on module and feature dependencies, because I think we'd all like to avoid builds that start despite an improper configuration that omits necessary dependencies, only to fail inevitably and result in nothing but wasted time.
Per the changelog:
The configuration system has been rewritten almost from scratch. This improved the consistency between builds on Unix and Windows, but some subtle unintended behavior changes are also possible. Also, some obsolete options have been entirely removed and will now cause errors.
It is not permissible any more to manually #define QT_NO_<feature> anywhere. Instead, configure's -no-feature-* options must be used. Note that this does not apply to defines which modify behavior rather than entirely removing features.
The -no-feature-* option family was integrated with the rest of the configuration system. Numerous existing features were made optional, and build problems in various reduced configurations were fixed. This is an ongoing effort known as "Qt Lite".
The list of features available for the -no-feature-* options is in qtbase\src\corelib\global\qfeatures.txt.
All features are enabled by default.
More information can be found in the Qt Lite Overview Presentation and its slides.
You can also use the new UI tool, known as the Qt Configuration Tool, which is part of the Qt for Embedded Devices package - see its documentation. At the moment (Qt 5.8) the configuration tool is available to commercial Qt customers only.
The changes that are behind my failed configuration:
there is no longer an option to specify whether SQL support is built in or a plugin, so the format is now just -sql-<driver>; the documentation has not been updated yet and still lists the old format, -<option>-sql-<driver>.
the -l option to add a specific library has been removed, which is turning out to be problematic in multiple areas.
Edit: Also, this blog entry on doing Lite builds, posted just recently, might be useful.
Everything the new configuration system understands is described in the configure.json files scattered around the Qt modules. The configure tool uses these files to build the list of command line arguments it understands.
Without other tools, learning about Qt features means inspecting these JSON files and choosing the features/options you want turned on or off.
Sub Configurations
These act as includes, and refer to the configure.json file in a given folder. E.g. qtbase/configure.json includes qtbase/src/corelib/configure.json, qtbase/src/network/configure.json etc.:
"subconfigs": [
"src/corelib",
"src/network",
[...]
],
Explicit Command Line Options
The commandline/options value lists the configure options a given Qt module understands. These options are separate from the feature system, although they may be used for convenience to provide shorthand aliases that control features. For example, in qtbase/configure.json, we have:
{ "commandline": { "options": { "accessibility": "boolean", [...] }
This command line option controls the identically named accessibility feature. It is more convenient to use than dealing with the feature system's option [-no]-feature-accessibility. The following pairs have identical effects:
-accessibility or -feature-accessibility
-no-accessibility or -no-feature-accessibility
Values:
boolean options are given to configure as -option and -no-option, meaning true and false, respectively.
all other options are given as -option value.
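For example, -accessibility / -no-accessibility is a boolean option, while something like -prefix /opt/qt5 takes a value (the prefix path here is just an illustration).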
Feature Options
The features value lists the features available in a given module. Features are effectively booleans, and they are all enabled by default, provided the configuration tests for them pass.
To control a feature foo:
-no-feature-foo disables the feature. E.g. to disable the iconv feature, you'd do configure -no-feature-iconv [...].
-feature-foo enables the feature and ensures that it is available. This will cause an error if a configuration test for the feature fails. It's useful in build systems that build a particularly configured Qt along with your application: it ensures that the features your code depends on will be available.
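Putting these together, a reduced build could hypothetically be configured along these lines (the prefix and the chosen features are only an illustration, using the options mentioned above):
configure -prefix /opt/qt-lite -opensource -confirm-license -release \
    -no-feature-iconv \
    -feature-accessibility
If the accessibility test cannot pass on your system, configure will report an error immediately instead of leaving you with a build that fails later.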
Failing Builds
Generally speaking, no matter what combination of feature selections you provide, if configure doesn't fail, the build is supposed to succeed.
we'd all like to avoid wasting time on builds that will start building despite an improper configuration
The configure tool will detect any invalid configurations. If configure succeeds yet the build fails, it's a Qt bug and you should report it.
I've just started using an sbt plugin for packaging JavaFX/ScalaFX applications, sbt-javafx, under Java 7.
While the plugin seems to work pretty well, it is not able to properly package a multi-module project. A workaround they have found is to set exportJars := true in all the modules that the JavaFX module depends on.
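For reference, a minimal sketch of that workaround in a multi-project build.sbt (the module names here are made up):
// build.sbt (sketch)
lazy val core = (project in file("core"))
  .settings(exportJars := true)  // dependents see core's packaged jar instead of its classes directory

lazy val app = (project in file("app"))
  .dependsOn(core)
  .settings(exportJars := true)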
I also have IntelliJ IDEA, which can produce a JavaFX application for me, though that would break the automated build, and I'd very much like to keep the executable build automated.
I need to understand the broader implications of that setting on my sbt build. Why does it need to be true?
Here is the help definition:
Determines whether the exported classpath for this project contains classes (false) or a packaged jar (true).
It sounds like by default it's false. Why?
p.s. If someone has a cleaner solution to package JavaFX/ScalaFX applications using sbt, please feel free to share.
What are the strategies for versioning of a web application/ website?
I notice that here in the Beta there is an svn revision number in the footer and that's ideal for an application that uses svn over one repository. But what if you use externals or a different source control application that versions separate files?
It seems easy for a Desktop app, but I can't seem to find a suitable way of versioning for an asp.net web application.
NB I'm not sure that I have been totally clear with my question.
What I want to know is how to build and auto-increment a version number for an ASP.NET application.
I'm not interested in how to link it with svn.
I think what you are looking for is something like this: How to auto-increment assembly version using a custom MSBuild task. It's a little old but I think it will work.
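If you don't need a full custom task, a simpler built-in option is the wildcard form of the assembly version attribute - [assembly: AssemblyVersion("1.0.*")] in AssemblyInfo.cs - which lets the compiler fill in the build and revision parts automatically on each build.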
For my big apps I just use an incrementing version number (1.0, 1.1, ...) that I store in a comment in the main file (usually index.php).
For just websites I usually just have a revision number (1,2,3,...).
I have a tendency to stick with basic integers at first (1, 2, 3), moving on to rational numbers (2.1, 3.13) when things get bigger...
Tried using fruit at one point, that works well for a small office. Oh, the 'banana' release? looks over in the corner "yeah... that's getting pretty old now..."
Unfortunately, confusion started to set in when the development team grew, is it an Orange, or Mandarin, or Tangelo? It looks ok. What do you mean "rotten on the inside?"
... but in all honesty: set up a separate repository as a master, and let development go on in various repositories. For every scheduled release everything is checked into the master repository, so that you can quickly roll back when something goes wrong.
(I'm assuming dev/test/production are all separate servers, and dev is never allowed to touch production or the master repository....)
I maintain a system of web applications with various components that live in separate SVN repos. To be able to version track the system as a whole, I have another SVN repo which contains all other repos as external references. It also contains install / setup script(s) to deploy the whole thing. With that setup, the SVN revision number of the "metarepository" could possibly be used for versioning the complete system.
In another case, I include the SVN revision via SVN keywords in a class file that serves no other purpose (to avoid the risk of keyword substitution breaking my code). The class in that file contains a string variable that is manipulated by SVN and parsed by a class method.
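(Concretely: with the file's svn:keywords property set to include Revision, a placeholder such as $Revision$ is expanded on checkout/update to something like $Revision: 1234 $, and the class method just strips the surrounding text to recover the number; the revision shown here is only illustrative.)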
An inconvenience with both approaches is that the revision number is not automatically updated by changes in the externals (approach 1) or the rest of the code (approach 2).
During internal development, I'm using milestone numbers (M1, M2, M3...). After release, I'll probably just update dates ("the January 2009 update").