Makefile phony "clean" target executes the include directive - gnu-make

I highlight the relevant parts of my Makefile here:
...
$(DEP_FILES): dep/%.d: src/%.cc
	some command
include $(DEP_FILES)
.PHONY: clean
clean:
...
When I run make clean, the include directive is processed, which executes the rules that generate the dependency .d files.
This produces errors if there are errors in the dependency files.
Why is the include directive processed when calling make clean?
clean should only be for cleaning. How can I avoid this situation?
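For background: make always tries to remake every makefile named by an include directive before it runs the requested goal, which is why the .d rules fire even for clean. A common guard, sketched here with the documented MAKECMDGOALS variable, skips the include whenever clean is among the goals:
ifeq ($(filter clean,$(MAKECMDGOALS)),)
include $(DEP_FILES)
endif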

Related

Disable parallel execution in make

I have a build process that creates a header file. In the second stage, several source files are generated from that header file. Then these source files are built into a binary. If anyone is interested, these sources are generated with the gSOAP utilities (wsdl2h, soapcpp2).
I have written a Makefile.am etc. to build these sources, but there are problems when I want to use parallel execution.
Makefile.am would look something like this in a very simplified form
## generate header file
service.h : service.wsdl
	wsdl2h -o $@ service.wsdl
## list of generated source files
generated_files = source1.cpp source2.cpp source3.cpp
## generate source files
$(generated_files) : service.h
	soapcpp2 $^
## build binary
binary: $(generated_files)
	gcc -o $@ $^
The rules say that service.h will be generated if service.wsdl changes. If service.h changes, soapcpp2 will generate the source?.cpp files with a single command execution.
Everything works fine until I try to build in parallel (for instance make -j4). The problematic rule is the last one, which generates many source files. When running in parallel, all these files are generated multiple times, while other make processes are already trying to compile them.
I followed the instructions for disabling parallel execution (https://www.gnu.org/software/make/manual/html_node/Parallel-Disable.html), but with no success. If I try
.NOTPARALLEL: $(generated_files)
or
.NOTPARALLEL: service.h
then parallel execution just does not work at all any more. I also tried .WAIT, and got no rule to make target '.WAIT'.
First, the .WAIT special target was introduced in GNU make 4.4. Since you are getting a no rule to make target error for it, it's clear you're using an older version which doesn't support it. It's usually a good idea to include the version of whatever tool you're using when asking for help.
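For reference, in 4.4 and later .WAIT is written in a prerequisite list; a sketch with illustrative target names:
# GNU make 4.4+ only: under -j, 'setup' finishes before 'build' starts.
all: setup .WAIT build
Note that .WAIT merely orders prerequisites; it would not stop the rule below from running its recipe once per generated file, so it isn't the right fix here anyway.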
The best thing to do is not to disable parallelism but instead to tell make that a single invocation of the recipe generates all the files. If you have GNU make 4.3 or later, then you can use a "grouped target" rule, like this:
## generate source files
$(generated_files) &: service.h
	soapcpp2 $^
The &: here tells make that instead of building each target with a separate invocation of the recipe (which is the default), a single invocation of the recipe builds all the targets.
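If the same makefile has to support both older and newer makes, one option is to test the feature list at parse time; GNU make 4.3 and later advertise grouped-target in .FEATURES. A sketch, falling back to the sentinel trick described next:
ifneq ($(filter grouped-target,$(.FEATURES)),)
$(generated_files) &: service.h
	soapcpp2 $^
else
# use the .sentinel fallback shown below
endif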
If you don't have GNU make 4.3 then you'll need to play a trick to get the same behavior, something like this:
## generate source files
.sentinel : service.h
	soapcpp2 $^
	@touch $@
$(generated_files) : .sentinel ;
## build binary
binary: $(generated_files)
	gcc -o $@ $^
This has all the generated files depend on a single file, .sentinel (you can name it whatever you want), which is the one file make knows is generated by the recipe that also creates all the other source files. This isn't perfect, but it will work for simple situations.

GNU Make: automatic prerequisites don't work if header files are renamed

A common Makefile for automatic prerequisite generation looks like:
SRCS := $(wildcard *.c)
OBJS := $(SRCS:%.c=%.o)
DEPS := $(OBJS:%.o=%.d)
$(OBJS): %.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<
include $(DEPS)
$(DEPS): %.d: %.c
	xxx
The first time, the build is OK; the generated .d file looks like this:
config.o config.d: config.c config.h
Then I rename config.h to config2.h and modify config.c:
-#include "config.h"
+#include "config2.h"
Running make again produces an error:
make[1]: *** No rule to make target 'config.h', needed by 'config.d'
because config.d depends on config.h. How can I modify my Makefile to fix this rename problem?
Pretty simple really. Your .d file needs this additional line:
config.h:
Now when make discovers config.h doesn't exist, it will run the non-existent recipe and happily believe it has created config.h. Then it carries on.
The manual says:
If a rule has no prerequisites or recipe, and the target of the rule is a nonexistent file, then make imagines this target to have been updated whenever its rule is run.
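With the extra line added, the stale config.d reads, in full:
config.o config.d: config.c config.h
config.h:
make treats the missing config.h as trivially updated, regenerates config.d from the modified config.c, and the fresh config.d names config2.h instead.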
How do we get this extra line?
Back in the day you would run a perl one-liner over the newly created .d file. Nowadays, for modern gcc variants, just add -MP to the compiler command-line.
-MP This option instructs CPP to add a phony target for each dependency other than the main file, causing each to depend on nothing. These dummy rules work around errors make gives if you remove header files without updating the Makefile to match.
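Concretely, the .d rule from the question could be written like this (a sketch; -MM, -MP, -MT and -MF are standard GCC preprocessor options):
$(DEPS): %.d: %.c
	$(CC) $(CFLAGS) -MM -MP -MT '$(@:.d=.o) $@' -MF $@ $<
Each generated .d file then ends with an empty rule for every header it mentions, so a renamed or deleted header no longer breaks the build.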
Job's a good 'un.

Making multiple files from multiple files with one command in gnu make

Assume 1000 files with extension .xhtml are in directory input, and that a certain subset of those files (with output paths in $(FILES), say) need to be transformed via xslt to files with the same name in directory output. A simple make rule would be:
$(FILES): output/%.xhtml : input/%.xhtml
	saxon -s:$< -o:$@ foo.xslt
This works, of course, doing the transform one file at a time. The problem is that I want to use saxon's batch processing to do all the files at one time, since, given the number of files, that would be much faster, considering the overhead of loading java and saxon for each file. Saxon allows the -s (source) option to be a directory and processes all files in that directory, placing the results with the same name in the directory specified in the -o: option.
I'm aware of the well-known technique to get GNU make to do a single command to update multiple files by using pattern rules:
output/%.xhtml: input/%.xhtml
	saxon -s:input -o:output foo.xslt
But in my case this suffers from two problems. First, it will run the transform on all files in the input directory, not just the ones that have changed; and second, it will not limit the transform to the subset of files specified in $(FILES). The GNU make feature of running a recipe given in a pattern rule only once for all matched targets does not work for so-called "static pattern rules" (see [here]), which is what the rule given at the top of the post is.
In order to use the saxon batching feature, I need to create a temporary directory, copy to it only those files to be processed, then run the transform with that temporary directory as the input directory. I tried creating a temporary directory, and remember its name using a target-specific variable for future use, using
$(FILES): TMPDIR:=$(shell mktemp -d)
but this creates a new temporary directory for every single target that is out-of-date. In any case, I'm not sure how to structure the rule that would then copy the necessary files into that directory. I don't want to create the temporary directory at the time the makefile is parsed, since I have a non-recursive make system that will parse all make files, even those not related to the current top-level target, and don't want to create the temporary directory for situations in which it is not necessary/will not be used.
I'm well aware that many questions have been asked on SO in the past about creating multiple files from a single input; one solution is (non-static) pattern rules; other solutions involve phony targets. However, in this case I'm stuck as to how to put all this together.
I can identify the files that changed and copy them using the static pattern rule
$(FILES): output/%.xhtml : input/%.xhtml
	TMPDIR=`mktemp -d`; cp $< $$TMPDIR
but actually I would prefer to copy the files with a single cp command, whereas this copies them one by one. Perhaps there is some application here of cp -u?
I also considered using an ad-hoc extension for those files needing updating but could not see how to get this to work either. I'm about to give up and just run the saxon transform on all files when any of them have changed, but is there any better way?
Personally, I wouldn't try to do this from the command line. That's partly because I'm not a shell scripting wizard. I'm not an Ant wizard either, but because the requirement is to process files that haven't changed, this seems to fall very much into Ant territory. On the other hand, Ant will recompile the stylesheet for each transformation, which is an overhead you might want to avoid; if that's the case then your best bet is probably to write a little Java application. It's probably only 100 lines or less.
Final possibility is to do the processing within Saxon: that is, a single transformation that reads multiple input files using the collection() function and generates multiple result files using xsl:result-document. Saxon (commercial editions) offers an extension function last-modified that allows you to filter the files to be processed. With 1000 files you might also want the extension function saxon:discard-document() to prevent the heap filling.
Personally, I like your original one-compiler-per-file formulation. Doesn't this work well with make's -j n flag?
You can of course batch up files by copying, and then running saxon at the end. Recursive make (ugh!) can sort out the ordering. Something like:
.PHONY: all
all:
	rm -rf tmpdir
	mkdir -p tmpdir
	${MAKE} tmpdir/sentinel
	saxon -s:tmpdir -o:output foo.xslt
tmpdir/sentinel: $(FILES) ; touch $@
$(FILES): output/%.xhtml: input/%.xhtml
	ln $< $(patsubst input/%,tmpdir/%,$<)
This does work, though I am very queasy about lying to make (the static pattern rule purports to create the target in output/, but in fact does its dirty deed in tmpdir/).
Note in the recipe for tmpdir/sentinel, that $? is correctly set to the list of output files that are out of date. This might be useful if you can pass a bunch of files to saxon rather than a folder.
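For instance, if you have a driver that accepts an explicit list of source files (an assumption: stock Saxon's -s: takes a single file or directory, so saxon-list below is hypothetical), you could skip tmpdir entirely by mapping the out-of-date outputs back to their inputs:
tmpdir/sentinel: $(FILES)
	# saxon-list is a stand-in for any batch driver taking many sources
	saxon-list -o:output foo.xslt $(patsubst output/%,input/%,$?)
	touch $@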
I think one issue here is that 'saxon' supports either one file or all files in a directory, so isn't suitable for batch processing without copying to temporary directories.
Otherwise, this is quite simple to do by using a timestamp marker file as a proxy target. For example:
output/.timestamp : $(FILES)
	mkdir -p $(@D)
	$(COMMAND) -outputdir=output $?
	touch $@
The three commands are:
Ensure that the output directory exists.
Run the batch command on files newer than the timestamp file.
Update the timestamp file (creating it if necessary).
Remember that each line of a recipe is executed in its own subshell, and that if any command line fails, subsequent lines are not invoked.
This approach is useful with Java builds.
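For instance, a sketch of the same proxy-target pattern for Java sources (paths are illustrative); $? limits the compile to files newer than the timestamp:
classes/.timestamp: $(wildcard src/*.java)
	mkdir -p $(@D)
	javac -d classes $?
	touch $@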

Overriding build rules in make

I'm using a Makefile to build an embedded project. I've inherited the project from numerous previous developers who haven't been using Make to its full potential, and I'd like to be able to specify the project version in the makefile using defines on the build command. However, there's already a build rule that builds all the object (.o) files. Is there any way to override that build rule for a specific object file so that I can add -D flags to the compiler?
Another reason I'd like to be able to specify the project version in the makefile is so that I can have it generate artifacts with the build version in the names of the resulting files produced by the build process.
Yes, you can override a pattern rule (which is what I bet your .o rule is), just by having a specific rule (and the order of the rules doesn't matter):
%.o:
	do_generic_things
x.o:
	do_specific_things -Dproject_version
Yes, you can put a build version in a file name. There's more than one way to do it; the best is probably to put it in the target name:
%$(B_VERSION).o: %.c
	$(CC) -c -DBUILD_VERSION=$(B_VERSION) -Whatever $< -o $@
If you are using GNU make and you only want to change compiler options, you can use target-specific variables, like so:
x.o: CFLAGS += -DEXTRA_SYMBOL_FOR_X
This also works recursively, i.e. the target-specific value for x.o is also in effect for all targets that x.o depends on, meaning that if you build multiple executables in your makefile, you can set a target-specific variable on the executable itself, and it will be in effect for all of its object files:
foo: CFLAGS += -DEXTRA_SYMBOL_FOR_FOO_APP
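A minimal sketch of that inheritance, with illustrative file names:
# The extra symbol is in effect for foo and, recursively, for foo.o and bar.o.
foo: CFLAGS += -DEXTRA_SYMBOL_FOR_FOO_APP
foo: foo.o bar.o
	$(CC) $(CFLAGS) -o $@ $^
%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<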

Qt qmake - how to stop it adding rules to delete the target

I am trying to add a unit test to a group of other tests. All the tests are in their own subdirectories, each with its own .pro file and the .cpp file which contains the tests themselves. Running qmake in one of the subdirectories creates a Makefile, and then running make runs the compiler to make the TARGET. The tests are actually run by the 'check' target, i.e. with 'make check'.
The test I'm trying to add is different, but it is trying to pretend to behave the same way.
It is different because it is a perl script and so doesn't need to be compiled. It does, however, need to be run - so 'make check' needs to work.
I had a .pro file working for the most part: 'qmake', 'make', 'make check', and 'make clean' would work, but 'make distclean' removed my script (since it assumes the script can be regenerated by compiling something).
So, the question is, how do I stop it from removing my script?
Perhaps there's some other approach I should be taking. I had tried the 'subdirs' TEMPLATE, but that does more than just remove the line in Makefile that deletes the TARGET.
Ideas?
Using Ubuntu Linux with Qt 4.6.0.
I would look into the custom target capabilities for your script. Maybe something like this:
check.commands = <scriptname>
check.depends = <any dependencies>
QMAKE_EXTRA_TARGETS += check
Doing things this way will run the check command when the dependencies change, but as long as you don't specify check.target then it shouldn't remove anything. (If your script does produce output, then perhaps that should be in check.target.) Also, since it is specified as an "extra" command, qmake shouldn't create commands to delete your script in a distclean.
This is assuming that your script is in its own subdirectory (which you state), and is the only "check" command that needs to run in that subdirectory (kind of implied by the question, but not directly stated).
