How to set the Bazel --warn_duplicate_resources flag in Java rules?

I'm trying to figure out how to set a flag in the Bazel BUILD file or .bazelrc file so that I can turn on the --warn_duplicate_resources or --no_duplicates flags when building a Java jar (actually Scala in my case, but the answer should be the same). I want to make it so that if Bazel sees duplicate files when packaging a jar (such as two different logback.xml files) the build will fail instead of picking one file and discarding the other.
I had some issues with shadowing of different resources when building a new jar using an old jar. After rummaging through the internet I found the --warn_duplicate_resources flag in a few different places in the Bazel code base, as well as a --no_duplicates flag. These seem like they would help avoid the problem of multiple jar files defining the same resource file.
https://github.com/bazelbuild/bazel/blob/master/src/java_tools/singlejar/java/com/google/devtools/build/singlejar/SingleJar.java
https://github.com/bazelbuild/bazel/blob/master/src/tools/singlejar/output_jar.cc
However, I have no idea what flags to put in the BUILD or .bazelrc files to get the --warn_duplicate_resources or --no_duplicates flags turned on. The java_library rule doesn't seem to have any direct connection and simply adding "build --warn_duplicate_resources" to the .bazelrc caused the build to fail.
https://docs.bazel.build/versions/master/be/java.html#java_library
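For what it's worth, this is the attempt that failed. My guess (just an assumption from reading the sources linked above, not something I've confirmed) is that --warn_duplicate_resources is parsed by the singlejar tool's own command line rather than by bazel build, which would explain why Bazel rejects it:

    # .bazelrc -- this does NOT work; bazel reports an unrecognized option,
    # presumably because the flag belongs to singlejar, not to `bazel build`
    build --warn_duplicate_resources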
Any help will be much appreciated, thanks!

Related

Static link with Premake

I'm using premake4 on Linux to build a project which links to a third party .a file.
Neither links {"foo"} nor links {"libfoo.a"} work, since premake generates a build script which incorrectly uses the flag -lfoo as if I'm linking a shared library. Using files {"libfoo.a"} will make premake ignore the file since it isn't C.
Premake4 is getting awfully old at this point. Is switching to Premake5 an option?
If not, one hacky workaround would be to use linkoptions to emit the link flags however you would like them to appear.
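For instance, a sketch of how that could look in the premake4 script (untested; the project settings and the path to libfoo.a are placeholder assumptions):

    -- premake4.lua: pass the static archive straight to the linker via
    -- linkoptions, since links{} wrongly emits -lfoo for a local .a file
    project "MyApp"
      kind "ConsoleApp"
      language "C++"
      files { "src/**.cpp" }
      -- placeholder path: point this at wherever libfoo.a actually lives
      linkoptions { "../third_party/libfoo.a" }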

How Can I Stop the Enterprise Library Configuration Tool from Inserting an Absolute Path in the Environment Configuration File field?

I'm trying to learn/use the Enterprise Library 5.0 Configuration tool, and it seems like it would work perfectly save for a few minor issues. The problem I am currently having comes when working with different environments. We have three environments for one of our web sites, so I can create the three different environments within the configuration tool, and I can set up the delta files, which properties to overwrite, and when.
All is well until I use Export Merged Environment Configuration File. When I do this, it creates the file as intended; however, it changes the Environment Configuration File field to include the absolute path.
Also, the delta file now contains a reference to the absolute path.
We use source control (VSTS) - so absolute paths are no good. Our build process consists of creating branches and then merging the code back into a root. We can't have absolute paths when the branches are created by different team members having their code in a different local folder structure.
Is there any way to stop the absolute path from automatically being added? Or any other suggestions?
My research indicates that there is no way to stop the GUI tool from overriding the Environment Configuration File value. The solution I am going with is the command line tool installed with the Enterprise Library, MergeConfiguration.exe.

What use does ./configure serve (other than checking dependencies)

Why does every source package that uses a makefile come with a ./configure script, and what does it do? As far as I can tell, it actually generates the makefile?
Is there anything that can't be done in the makefile?
configure is usually a result of the 'autoconf' system. It can generate headers, makefiles, really anything. 'Usually,' since some are hand-crafted.
The reason for these things is to allow source code to be compiled in disparate environments. There are many variations on Unix / Linux, with slightly different system headers and libraries. configure scripts allow code to auto-adapt.
The configure step is a sort of meta build. It generates the makefile that will work on the specific hardware / distribution you're running. For instance it determines the name of the C or C++ compiler and embeds that in the makefile.
The configure step will also frequently take a set of parameters, the values of which may determine what libraries need to be linked against. For instance, if you compile Apache HTTP with SSL enabled, it needs to link against more shared libraries than if you don't. Since linking is handled by the makefile, you need an extra step to create a custom makefile (rather than requiring the make command to take dozens or hundreds of options).
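A typical invocation, using the Apache-style SSL switch as a stand-in for whatever parameters a given package actually accepts:

    # inspect the system, write a Makefile tailored to it, then build
    ./configure --prefix=/usr/local --enable-ssl
    make
    make install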
Everything can be done from within the makefile but some build systems were implemented otherwise.
I don't personally use configure files for my projects but I admit I mostly export Erlang & Python based projects.
I don't think of the makefile as a script; I think of it as an input to the make utility.
The configure script (as its name suggests) configures the makefile, including, as you say, resolving dependencies.
If only from the idea of avoiding self-modifying code, the things in the configure script don't really belong in the makefile.
The point is that autoconf, autoheader and automake form an integrated system that makes cross-platform building on Unix relatively straightforward. The docs are really bad and there are lots of horrible gotchas, but on the other hand there are a lot of working samples.
When I first came across this stuff I thought, "Ha, I can do that with a nice clean makefile," and proceeded to rework the source that way. Big mistake. Learn to write and edit configure.ac and makefile.am files; you will be happy in the end.
To answer your question: configure is good for
working out whether function foo is available on this platform, and if so which include and library are needed
letting the builder choose whether they want feature wizzbang included, in a nice, simple, consistent way
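Both of those map directly onto stock Autoconf macros. A minimal configure.ac sketch covering both cases (the function name foo and feature name wizzbang are placeholders from the list above):

    AC_INIT([myproject], [1.0])
    AM_INIT_AUTOMAKE([foreign])
    AC_PROG_CC

    # is function foo available, and which library provides it?
    AC_CHECK_FUNCS([foo])
    AC_CHECK_LIB([foo], [foo])

    # let the builder choose whether feature wizzbang is included
    AC_ARG_ENABLE([wizzbang],
        [AS_HELP_STRING([--enable-wizzbang], [build with wizzbang support])])
    AM_CONDITIONAL([WIZZBANG], [test "x$enable_wizzbang" = "xyes"])

    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT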

How do you enforce dependencies among java folders in Netbeans?

I am new to Netbeans and am wondering if someone can help me with project setup. I am moving half a million lines of Java code from a different IDE to Netbeans. I was able to get the code to build and run in Netbeans easily. I have a project with many folders, with dependencies among those folders, and they have to be built in a specific order. This enforces layering, so that a module in a lower layer cannot call into higher layers. I couldn't get that configured in Netbeans. Below is what my project looks like:
project/
libA/
libB/
libC/
libD/
libE/
appA/
...
I have one project that builds all the libs and appA. The project build xml is stored under the project/ folder. But the libs have dependencies among them: libB should be built after libA, libC after libA, libE depends on libD and libB, etc.
I tried to change the order of source folders for the libs in the project properties. That didn't seem to make any difference. Even if I move libA after libB, it builds everything fine. I expected it to fail because libA hadn't been built yet.
I am lost. Just wondering what the trick is to enforce this kind of dependency. I created my project using the "Java project using existing sources" wizard.
I appreciate your help
Thanks
Video Guy
Even though it would be a pain, you could write your own Ant build script and then have Netbeans use that.
Basically:
write the custom Ant build file (a sketch follows this list)
install the Ant plugin
right-click the build file
run the selected target
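A rough sketch of what that build file could look like, using the module names from the question (all paths and the exact layering rules are assumptions; the point is that each <javac> sees only the layers below it on its classpath, so an upward call fails to compile):

    <?xml version="1.0" encoding="UTF-8"?>
    <project name="project" default="all" basedir=".">
      <target name="libA">
        <mkdir dir="build/libA"/>
        <javac srcdir="libA" destdir="build/libA" includeantruntime="false"/>
      </target>
      <target name="libB" depends="libA">
        <mkdir dir="build/libB"/>
        <javac srcdir="libB" destdir="build/libB"
               classpath="build/libA" includeantruntime="false"/>
      </target>
      <target name="libD">
        <mkdir dir="build/libD"/>
        <javac srcdir="libD" destdir="build/libD" includeantruntime="false"/>
      </target>
      <!-- libE may see libD and libB (and transitively libA), nothing higher -->
      <target name="libE" depends="libD,libB">
        <mkdir dir="build/libE"/>
        <javac srcdir="libE" destdir="build/libE"
               classpath="build/libD:build/libB:build/libA"
               includeantruntime="false"/>
      </target>
      <target name="appA" depends="libE">
        <mkdir dir="build/appA"/>
        <javac srcdir="appA" destdir="build/appA"
               classpath="build/libE:build/libD:build/libB:build/libA"
               includeantruntime="false"/>
      </target>
      <target name="all" depends="appA"/>
    </project>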
This would enable you to enforce whatever you need, but if Netbeans is figuring out the correct order anyway, why not just use it?
Does something break when you just compile and run in Netbeans?
Well! Let's say a team member added a piece of code in a lower-level package that calls into higher-layer code. It should fail because it breaks the layering, but because Netbeans seems to compile all the files in one javac invocation, the build compiles just fine. I want Netbeans to break the build in this case.
Writing my own Ant script is another way of enforcing it, but the whole point of using an IDE is to save yourself from writing your own make files (or Ant scripts). This is something any IDE could accomplish out of the box 10 years ago. I am wondering if I am missing something here.
Thanks
Video Guy

How to get doxygen to run faster?

Doxygen is a bit slow - it takes about a couple of minutes to process my whole project, so for small incremental changes this is longer than actually building the rest of my code. There are thousands of files without any documentation so I guess it is spending most of its time processing them. Is there any way to get it to skip files without any documentation?
What about getting it to only process changed files?
From the Doxygen documentation:

    How can I exclude all test directories from my directory tree?
    Simply put an exclude pattern like this in the configuration file:
    EXCLUDE_PATTERNS = */test/*
So, you should be using patterns to exclude files. It's been a long time since I've used Doxygen, but I don't remember any option to process only changed files.
I found that turning off the SEARCH_INCLUDES option made a big difference. It was looking through the whole platform SDK and the compiler's include paths, which were not documented anyway and would not appear in the generated documentation.
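In Doxyfile terms that is a single setting (SEARCH_INCLUDES is a standard Doxygen option):

    # don't scan the (undocumented) compiler/SDK include paths
    SEARCH_INCLUDES = NO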
There is a DOT_NUM_THREADS option which may increase performance on multicore machines. Unfortunately, doxygen itself is just single-threaded.
Another approach would be to organize your code into modules, run a separate doxygen instance for each module, and link the resulting tag files together: http://www.doxygen.nl/manual/external.html
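A sketch of the relevant Doxyfile settings for one module (the tag-file names and paths are placeholders; GENERATE_TAGFILE, TAGFILES and DOT_NUM_THREADS are standard options):

    # produce a tag file describing this module's symbols
    GENERATE_TAGFILE = moduleA.tag
    # link against the already-generated docs of another module
    TAGFILES = ../moduleB/moduleB.tag=../moduleB/html
    # let dot use several threads when generating diagrams
    DOT_NUM_THREADS = 4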
Doxygen is good at finding connections between files, whether changed or not. But Doxygen does not remember information about unchanged files, so it must process the whole codebase each time.
Maybe a solution would be to organize the project such that never-changed files belong to one module which is excluded from Doxygen's scope and whose documentation is already available. Then it would be possible to tell Doxygen to link newly built documentation to this existing module documentation.
Going further, it would also be possible to run Doxygen module by module, processing only changed modules plus a top-level documentation that links to all the module documentations.
I don't think having Doxygen run on a normal dev cycle is a good idea. Our Doxygen build runs as part of our Continuous Integration server's responsibilities.
That said, there are some benefits of running doxygen every build to catch missing docs.
So I would trim the doxygen config for dev builds, removing diagrams and even stopping the Apple docset import into Xcode.
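For example, a couple of Doxyfile overrides for the dev cycle (both are standard options; the CI build would leave them enabled):

    # skip dot-generated diagrams
    HAVE_DOT = NO
    # skip generating the Apple docset that Xcode imports
    GENERATE_DOCSET = NO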
