Nmake getting a list of all .o files from .cpp files - wildcard

I'm using nmake to compile multiple source files into an elf. However I do not want to specify the .o files in a long list like this:
OBJS = file1.o file2.o file3.o
What I would prefer is to use a wildcard that specifies all .o files in the current directory as dependencies for the .elf. However, the .o files don't exist until I've compiled them from the .cpp files. Is there any way to get a list of .cpp files using wildcard expansion and then do a string replacement to turn each .cpp into a .o?

There's not a particularly elegant way to do this in NMAKE. If you can, you should use GNU Make instead, which is available on Windows and makes many tasks much easier.
If you must use NMAKE, then you must use recursive make in order to do this automatically, because NMAKE only expands wildcards in prerequisite lists. I demonstrated how to do this in response to another similar question here.
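A rough sketch of what that recursive setup can look like (untested; sub.mak, app.elf, and the $(LD)/$(CXX) macros are placeholders for your own toolchain):

# makefile (top level): NMAKE expands wildcards in dependency lines,
# so $** becomes the actual list of .cpp files, passed down as a macro
all: *.cpp
	$(MAKE) /nologo /f sub.mak SRCS="$**"

# sub.mak: receives SRCS on the command line and derives the object list
.SUFFIXES: .o
OBJS = $(SRCS:.cpp=.o)

app.elf: $(OBJS)
	$(LD) -o $@ $(OBJS)

.cpp.o:
	$(CXX) -c $<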
Hope that helps.

I'm more familiar with Unix make and gmake, but you could possibly use:
OBJS = $(SOURCES:.cpp=.o)
(assuming your source files are listed in a SOURCES macro)
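NMAKE supports the same colon substitution syntax, so a minimal sketch might look like this (file names and the $(LINKER) macro are placeholders, and the sources still have to be listed once):

SOURCES = file1.cpp file2.cpp file3.cpp
OBJS = $(SOURCES:.cpp=.o)

program.elf: $(OBJS)
	$(LINKER) -o program.elf $(OBJS)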

Here is another answer that might help you.
Another solution may be to use a wrapper batch file, where you create a list of all .cpp files with a "for" loop, like
del listoffiles.txt
echo SOURCES= \ >> listoffiles.txt
for %%i in (*.cpp) do @echo %%i \ >>listoffiles.txt
echo. >> listoffiles.txt
Afterwards, you can try to use this with the !INCLUDE preprocessor directive in nmake:
!INCLUDE listoffiles.txt
(I am sure this won't work from scratch, but the general idea should be clear).
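If the batch file works as intended, listoffiles.txt would end up looking roughly like this, and the makefile could then consume it (untested sketch; file names are placeholders):

# listoffiles.txt, as generated
SOURCES= \
file1.cpp \
file2.cpp \
file3.cpp \

# makefile
!INCLUDE listoffiles.txt
OBJS = $(SOURCES:.cpp=.o)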

Related

Adding custom commands to existing targets in qmake

Is there a way to specify, in a .pro file, extra commands to be added to a standard target in the Makefile that qmake generates? For example, consider distclean, extra commands might be desired to:
Remove *~ files.
Clean out runtime-generated output files from the source tree.
Etc.
I want to use the normal target and not a custom target because I want this to be completely transparent in my workflow. That is (again using distclean as an example), I don't want to...
... require knowledge in a multi-project setup that certain Makefiles use a custom rule instead of distclean.
... document custom rules, even for stand-alone projects, as distclean is already well-known and intuitive†.
I found How to add custom targets in a qmake generated Makefile?, but this describes adding custom targets (which is already documented, even back in 4.6) rather than adding rules to existing targets. While it does contain some hints, all of them require adding new custom targets, as specifying the same target more than once in a Makefile replaces (not adds) commands from the previous target.
The only thing I could really think of to try was to add target.commands += new commands to the .pro file as a wild guess (e.g. distclean.commands += rm \"*~\"). This has no effect.
How can I transparently add custom commands to existing targets with qmake?
† For the distclean example: While maintainer-clean is also on that "standard target" list, in practice I have found it to be rarely used, and in any case qmake doesn't generate it by default; I consider it to be unsuitable.
There are two straightforward ways to accomplish this, depending on how self-contained / portable you want your solution to be and how lenient you want to be with the order of command execution.
Option 1
The first option is to create a custom target in the .pro file for the new commands, then add that target as a prerequisite to the standard target that you are modifying. Going back to the distclean example, let's say you want to add a command to remove all *~ files:
Create a custom target in your .pro file. Note that you have to escape quotes and slashes in .pro files. For example, add:
extraclean.commands = find . -name \"*~\" -exec rm -v {} \\;
Add this target as a dependency of the target you are modifying:
distclean.depends = extraclean
This won't actually modify the distclean rule just yet, as this method can't be used to modify existing rules. However...
Add both your new target and the target you are modifying as extra targets:
QMAKE_EXTRA_TARGETS += distclean extraclean
This will add a second specification of distclean to the Makefile, but this works because you can add dependencies to existing targets in make in separate rules, even though you can't add commands that way. If you were to also specify distclean.commands in your .pro file, you would break the existing distclean by replacing its default recipe.
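In plain make terms, the mechanism relies on the fact that a recipe-less rule can still add a prerequisite to an existing target, roughly like this (an illustrative sketch, not the exact Makefile qmake generates):

extraclean:
	find . -name "*~" -exec rm -v {} \;

# a second, recipe-less rule: this only adds a prerequisite to the
# distclean target that qmake already generated elsewhere in the Makefile
distclean: extraclean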
So, putting that all together, in the .pro file:
extraclean.commands = find . -name \"*~\" -exec rm -v {} \\;
distclean.depends = extraclean
QMAKE_EXTRA_TARGETS += distclean extraclean
Where extraclean is some custom target with the commands you want to add, and distclean is the existing target that you wish to modify.
Pros:
Completely self-contained in a .pro file.
As portable as you can get, leaves the actual Makefile syntax and generation up to qmake.
Cons:
Your new commands aren't appended to the existing recipe. Rather, they happen after all prerequisite targets are satisfied but before the existing recipe. In the distclean example, with the version of qmake that I'm using, this places the commands after the source tree clean but before Makefile itself is deleted (which is the only action the default recipe takes). This is not an issue for this example, but may be an issue for you.
Option 2
The second option is to change the name of the Makefile that qmake generates, and create your own custom Makefile that defers to the generated one, rather than includes + overrides it. This is also a straightforward option; while not as self-contained as option 1, it gives you the ability to execute commands both before and after the default generated recipe.
You don't want to include + override the existing Makefile, because you don't want to replace the default recipes. If you do, you have to re-implement the default, but this can be an issue as that default may change (and you have to keep up with the changes). It's best to let qmake do as much work as possible, and not repeat its work.
To do this:
First, change the name of the file that qmake generates. This can be accomplished by adding a line such as this to the .pro file:
MAKEFILE = RealMakefile
That will cause qmake to output RealMakefile instead of Makefile.
The next step is to create your own Makefile with your custom commands. However, there are some caveats here. First, a full example, again using distclean. In a file named Makefile:
.DEFAULT_GOAL := all

%:
	@$(MAKE) -f RealMakefile $@

distclean:
	@$(MAKE) -f RealMakefile $@
	@find . -name "*~" -exec rm -v {} \;
Some notes about this:
We set .DEFAULT_GOAL because otherwise distclean would be the default. An alternative to this, if you're not comfortable with .DEFAULT_GOAL, is to specify an all rule using @$(MAKE) -f RealMakefile $@ as the recipe.
The % target matches any target that isn't otherwise defined in this Makefile. It simply delegates processing to RealMakefile.
The distclean target is where we add our commands. We still need to delegate to RealMakefile, but additional commands can be added both before and after that happens.
Pros:
More control over command order. Commands can be added both before and after the default recipe.
Cons:
Not self-contained in a .pro.
Not as portable: It doesn't leave all Makefile generation up to qmake, and also I'm not actually sure what parts are specific to GNU make here (comments welcome).
So, while this answer may be a little long, both of these methods are very straightforward. I would prefer option 1 unless the command execution order is an issue.
Another solution is to add files you want to delete to the QMAKE_CLEAN and QMAKE_DISTCLEAN qmake variables.
build_tests {
    TINYORM_SQLITE_DATABASE = $$quote($$TINYORM_BUILD_TREE/tests/q_tinyorm_test_1.sqlite3)

    QMAKE_CLEAN = $$TINYORM_SQLITE_DATABASE
    QMAKE_DISTCLEAN = $$TINYORM_SQLITE_DATABASE
}
This is only relevant when you already know which files you want to delete; you cannot use an rm command or any sort of globbing with this approach.

How to generate translation file (.po, .xliff, .yml,...) from a Symfony2/Silex project?

I'm going to build a Silex/Symfony2 project and I have been looking around for a method to generate XLIFF/PO/YAML translation files based on the texts-to-be-translated inside the project, but I have not found any instructions or documentation on it.
My question is: Is there an automated way to generate translation file(s) in specific format for a Symfony2/Silex project?
If yes, please tell me how to generate the file then update the translation after that.
If no, please tell me how to create the translation file(s) and then add more text for my project. I am looking for a desktop-based or web-based editor rather than a normal text editor, something like Transifex or GetLocalization (but they don't have an option to create a new file or add more text).
After a long time searching the internet, I found a good one:
https://github.com/schmittjoh/JMSTranslationBundle
I see you've found a converter, but to answer your first question about generating your initial translation file -
If you have Gettext installed on your system you could generate a PO file from your "texts-to-be-translated inside the project". The command line program xgettext will scan the source files looking for whatever function you're using.
Example:
To scan PHP files for instances of the trans method call as shown here you could use the following command -
find . -name "*.php" | xargs xgettext --language=PHP --keyword=trans --output=messages.pot
To your question about editors:
You could use any PO editor, such as POEdit, to manage your translations, but as you say you eventually need to convert the PO file to either an XLIFF or YAML language pack for Symfony.
I see you've already found a converter tool. You may also like to try the one I wrote for Loco. It supports PO to YAML, and PO to XLIFF
Workaround for busy people (UNIX)
You can run the following command in the Terminal:
$ grep -rEo --no-filename "'.+'\|\btrans\b" templates/ > output.txt
This will output the list of messages to translate:
'Please provide your email'|trans
'Phone'|trans
'Please provide your phone number'|trans
...
I mean, almost... but you can usually work from there.
Obviously you must tweak the command to your liking (transchoice, double-quotes instead of single...).
Not ideal but can help!
grep options
grep -R, -r, --recursive: Read all files under each directory, recursively; this is equivalent to the -d recurse option.
grep -E, --extended-regexp: Interpret PATTERN as an extended regular expression.
grep -o, --only-matching: Show only the part of a matching line that matches PATTERN.
grep -h, --no-filename: Suppress the prefixing of filenames on output when multiple files are searched.
(source)

How to manually call another target from a make target?

I would like to have a makefile like this:
cudaLib :
	# Create shared library with nvcc

ocelotLib :
	# Create shared library for gpuocelot

build-cuda : cudaLib
	make build

build-ocelot : ocelotLib
	make build

build :
	# build and link with the shared library
I.e. the *Lib tasks create a library that runs cuda directly on the device, or on gpuocelot respectively.
For both build tasks I need to run the same build steps, only creating the library differs.
Is there an alternative to running make directly?
make build
Kind of a post-requisite?
Note: This answer focuses on the aspect of a robust recursive invocation of a different target in a given makefile.
To complement Jack Kelly's helpful answer, here's a GNU makefile snippet that demonstrates the use of $(MAKE) to robustly invoke a different target in the same makefile (ensuring that the same make binary is called, and that the same makefile is targeted):
# Determine this makefile's path.
# Be sure to place this BEFORE `include` directives, if any.
THIS_FILE := $(lastword $(MAKEFILE_LIST))

target:
	@echo $@                                 # print target name
	@$(MAKE) -f $(THIS_FILE) other-target    # invoke other target

other-target:
	@echo $@                                 # print target name
Output:
$ make target
target
other-target
Using $(lastword $(MAKEFILE_LIST)) and -f ... ensures that the $(MAKE) command uses the same makefile, even if that makefile was passed with an explicit path (-f ...) when make was originally invoked.
Note: While GNU make does have features for recursive invocations - for instance, variable $(MAKE) specifically exists to enable them - their focus is on invoking subordinate makefiles, not on calling a different target in the same makefile.
That said, even though the workaround above is somewhat cumbersome and obscure, it does use regular features and should be robust.
Here is the link to the manual section covering recursive invocations ("sub-makes"):
Recursive Use of make
Most versions of make set a variable $(MAKE) that you can use for recursive invocations.
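Applied to the makefile in the question, that looks something like this (a sketch):

build-cuda : cudaLib
	$(MAKE) build

build-ocelot : ocelotLib
	$(MAKE) build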
As you have written it, the build target will need to do something different depending on whether you have just done an ocelot or cuda build. That's another way of saying you have to parameterise build in some way. I suggest separate build targets (much like you already have), with associated variables. Something like:
build-cuda: cudaLib
build-ocelot: ocelotLib
build-cuda build-ocelot:
	shell commands
	which invoke ${opts-$@}
On the command-line you type make build-cuda (say). Make first builds cudaLib, then it carries out the recipe for build-cuda. It expands the macros before calling the shell. $@ in this case is build-cuda, thus ${opts-$@} is first expanded to ${opts-build-cuda}. Make now goes on to expand ${opts-build-cuda}. You will have defined opts-build-cuda (and of course its sister opts-build-ocelot) elsewhere in the makefile.
P.S. Since build-cuda et al. are not real files, you had better tell make this (.PHONY: build-cuda).
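Putting the pieces together, a sketch of the whole fragment (the opts-* values, app, and main.o are placeholders for whatever each variant actually links; cudaLib and ocelotLib are assumed to be defined elsewhere as in the question):

opts-build-cuda   = -lcudaLib
opts-build-ocelot = -locelotLib

.PHONY: build-cuda build-ocelot

build-cuda: cudaLib
build-ocelot: ocelotLib

build-cuda build-ocelot:
	$(CXX) -o app main.o ${opts-$@}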

How can I get a list of all preprocessor symbols used (or defined) within a file?

I have a number of C/C++ project files. I'd like to know the full list of preprocessor symbols used by the files. Is there a flag to gcc, or is there some tool I can use to get this list.
Optionally, if the tool also told me the list of symbols defined by the file, that would be great.
Use gcc -E -dM <file_list> - preprocess, then output #defines.
My gcc is a tad rusty, so I'm not sure whether or not you explicitly need the -E, but try both?
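For example, with a single file (main.c is a placeholder), both of these are common invocations:

# all macros defined after preprocessing main.c, including those
# pulled in from headers and the compiler's predefined macros
gcc -E -dM main.c

# just the compiler's built-in/predefined macros, with an empty input
gcc -E -dM -x c /dev/null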
For further reference, see this

Make source with two targets

I use this tool called Lazy C++ which breaks a single C++ .lzz file into a .h and .cpp file. I want Makepp to expect both of these files to exist after my rule for building .lzz files, but I'm not sure how to put two targets into a single build line.
I've never used Makepp personally, but since it's a drop-in replacement for GNU Make, you should be able to do something like:
build: foo.h foo.cpp
	g++ $(CFLAGS) foo.cpp $(LFLAGS) -o foo

foo.h foo.cpp: foo.lzz
	lzz foo.lzz
Also not sure about the lzz invocation there, but that should help. You can read more about this at http://theory.uwinnipeg.ca/gnu/make/make_37.html.
Lzz is amazing! This is just what I was looking for http://groups.google.com/group/comp.lang.c++/browse_thread/thread/c50de73b70a6a957/f3f47fcdcfb6bc09
Actually all you need is to depend (typically) on foo.o in your link rule, and a pattern rule to call lzz:
%.cpp %.h: %.lzz
	lzz $(input)
The rest will fall into place automatically. When compiling any source that includes foo.h, or linking foo.o to a library or program, lzz will first get called automatically.
Makepp will also recognize if only the timestamp but not the content of the produced file changed, and ignore that. But it can't hurt to give it less to do, by using the lzz options to suppress recreating an identical file.
Regards -- Daniel
