How to disable auto completion of GNU Make version 3.82

After upgrading to GNU Make 3.82, I found that it supports target auto completion.
For some distributions this might be really helpful.
However, with the OpenWrt Linux distribution it is troublesome, because partial builds are frequently invoked like this:
# make package/path/{clean,install} V=99
Now I cannot do path auto completion, because Make is so "smart" that it prints out the possible make targets for selection instead of completing the path.

This is not something Make itself does or can disable: completion in your shell is done by... your shell. So remove the completion support for make from your shell configuration.
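If the completion is coming from bash's programmable completion (an assumption; the question does not say which shell is in use), the completion spec for make can be unregistered, e.g. in ~/.bashrc:
# drop the programmable-completion entry for make so plain path completion is used again
complete -r make 2>/dev/null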

Related

Advice on upgrading Emacs (22 -> 24), and also about GUI vs console/terminal

I'm using a Macbook Pro (Snow Leopard, 10.6.8) and have been a regular emacs user for the past few months. I'm trying to install a modified version of Emacs 24.2 provided here to utilize Emacs Speaks Statistics (ESS) from the downloads page. I currently have 22.1.1:
M-x emacs-version
GNU Emacs 22.1.1 (mac-apple-darwin) of 2011-06-07 on b1030.apple.com
I installed the emacs linked earlier, put it in Applications, and set this in .bashrc:
alias emacs="/Applications/Emacs.app/Contents/MacOS/Emacs -nw"
So it seems like it's working correctly, since I wrote and successfully ran a short R program.
M-x emacs-version
GNU Emacs 24.2.1 (x86_64-apple-darwin, NS apple-appkit-1038.36) of 2012-08-27 on bob.porkrind.org
Is this the usual way to upgrade to a "newer version" of Emacs? Sorry if this question seems trivial, but I've never done this before (I typically used emacs on a different computer) and the Installation step on the previous website consists of just one sentence. The all-in-one installation method also isn't explained in the official documentation.
A brief side note from when I was searching the web: I believe calling 'emacs file_name' should open a GUI version, while 'emacs -nw file_name' is the console version, so I stay in the terminal. But on my Mac, using emacs has the same effect as using emacs -nw; in other words, I can't get a GUI or separate window to show up. Can anyone confirm that this Super User question has the 'correct' answer? (I don't really have a problem with this, as I hate having another pop-up window, but it would be nice to know for completeness.)
The Emacs that comes with Mac OS X /usr/bin/emacs does not have a graphical interface, just the terminal one, so calling emacs is the same as emacs -nw.
Your upgraded Emacs by default starts with the graphical interface, so you need to specify -nw to force it to use the terminal.
There is no "usual" way to upgrade OS X's default Emacs (i.e. Apple does not provide an upgraded Emacs); what you've done is fine. Or you could install a binary from http://emacsformacosx.com/ or use a package manager like homebrew.
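A quick way to see the difference from a terminal (paths taken from the question; the file name is just a placeholder):
/usr/bin/emacs --version                                     # Apple's bundled, terminal-only Emacs 22
/Applications/Emacs.app/Contents/MacOS/Emacs notes.txt       # downloaded Emacs 24: opens its own window
/Applications/Emacs.app/Contents/MacOS/Emacs -nw notes.txt   # same Emacs, forced into the terminal
type emacs                                                   # shows whether the .bashrc alias is in effect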

Setting up "configure" for openMP in R

I have an R package which is easily sped up by using OpenMP. If your compiler supports it then you get the win; if it doesn't then the pragmas are ignored and you get one core.
My problem is how to get the package build system to use the right compiler options and libraries. Currently I have:
PKG_CPPFLAGS=-fopenmp
PKG_LIBS=-fopenmp
hardcoded into src/Makevars on my machine, and this builds it with OpenMP support. But it produces a warning about non-standard compiler flags on check, and will probably fail hard on a machine with no openMP capabilities.
The solution seems to be to use configure and autoconf. There's some information around here:
http://cran.r-project.org/doc/manuals/R-exts.html#Using-Makevars
including a complex example to compile in odbc functionality. But I can't see how to begin tweaking that to check for openmp and libgomp.
None of the R packages I've looked at that talk about using openMP seem to have this set up either.
So does anyone have a walkthrough for setting up an R package with OpenMP?
[EDIT]
I may have cracked this now. I have a configure.ac script and a Makevars.in with @FOO@ substitutions for the compiler options. But now I'm not sure of the workflow. Is it:
1. Run "autoconf configure.in > configure; chmod 755 configure" if I change the configure.in file.
2. Do a package build.
3. On package install, the system runs ./configure for me and creates Makevars from Makevars.in.
But just to be clear, "autoconf configure.in > configure" doesn't run on package install - it's purely a developer process to create the configure script that is distributed - amirite?
Methinks you have the library option wrong; please try:
## -- compiling for OpenMP
PKG_CXXFLAGS=-fopenmp
##
## -- linking for OpenMP
PKG_LIBS= -fopenmp -lgomp
In other words, -lgomp gets you the OpenMP library linked. And I presume you know that this library is not part of the popular Rtools kit for Windows. On a modern Linux you should be fine.
In an unreleased test package I have here I also add the following to PKG_LIBS, but that is mostly due to my use of Rcpp:
$(shell $(R_HOME)/bin/Rscript -e "Rcpp:::LdFlags()") \
$(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)
Lastly, I think the autoconf business is not really needed unless you feel you need to test for OpenMP via configure.
Edit: SpacedMan is correct. Per the beginning of the libgomp-4.4 manual:
1 Enabling OpenMP
To activate the OpenMP extensions for C/C++ and Fortran, the compile-time flag `-fopenmp' must be specified. This enables the OpenMP directive [...] The flag also arranges for automatic linking of the OpenMP runtime library.
So I stand corrected. Seems that it doesn't hurt to manually add what would get added anyway, just for clarity...
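If you do end up testing for OpenMP via configure, a minimal sketch could look like the following (the package name is a placeholder; this assumes autoconf 2.62 or later, which ships the AC_OPENMP macro). configure.ac:
AC_INIT([mypackage], [0.1])
AC_PROG_CC
AC_OPENMP
AC_CONFIG_FILES([src/Makevars])
AC_OUTPUT
and src/Makevars.in:
PKG_CFLAGS = @OPENMP_CFLAGS@
PKG_LIBS = @OPENMP_CFLAGS@
AC_OPENMP probes the current compiler and puts the right flag into OPENMP_CFLAGS (-fopenmp for gcc, empty if OpenMP is unsupported), so the generated Makevars degrades gracefully on machines without OpenMP; for C++ sources you would use AC_PROG_CXX with AC_LANG([C++]) and OPENMP_CXXFLAGS instead.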
Just addressing your question regarding the usage of autoconf: no, you do not want to run autoconf with any arguments, nor should you redirect its output. You are correct that running autoconf to build the configure script is something that the package maintainer does, and the resulting configure script is distributed. Normally, to generate the configure script from configure.ac (older packages use the name configure.in, but that name has been discouraged for several years), the developer simply runs autoconf with no arguments. Before running autoconf, it is sometimes necessary to run aclocal, autoheader, libtoolize, etc. There is also a tool (autoreconf) which simplifies the process and invokes all the required programs in the correct order. It is now more typical to run autoreconf instead of autoconf.
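To make that workflow concrete (package name and version are placeholders):
autoconf                               # or: autoreconf; developer machine only, regenerates ./configure
R CMD build mypackage                  # the generated configure script is shipped inside the tarball
R CMD INSTALL mypackage_0.1.tar.gz     # R runs ./configure, which creates src/Makevars from src/Makevars.in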

Why does configure.sh think win32 is Unix?

I'm trying to build an application from source on Windows that requires some Unix tools. I think it uses the apparently standard ./configure; make; make install (there's no INSTALL file). First I tried MinGW, but was confused to find no bash, autoconf, m4, or automake executables in \bin. I'm sure I missed something obvious, but I installed Cygwin anyway just to move forward. For some reason when I run
sh configure.sh
I get:
platform unix
compiler cc
configuration directory ./builds/unix
configuration rules ./builds/unix/unix.mk
My OS has identity problems. Obviously the makefile is all wrong since I'm not on unix but win32. Why would the configure script think this? I assume it has something to do with Cygwin but if I remove that I can't build it at all. Please help; I'm very confused.
Also is it possible to build using MinGW? What's the command for bash and is mingw32-make the same as make? I noticed they're different sizes.
Everything is fine. When you are inside Cygwin, you are basically emulating a Unix. sh runs inside Cygwin, and thus identifies the OS correctly as Unix.
Have a look at GCW - The Gnu C compiler for Windows
Also, you might be interested in this help page, which goes into some detail about the minimal system (MSYS), such as how to install and configure it, etc.
That should help you get bash, configure and the rest to work for MinGW as well.
From the Cygwin home page
Cygwin is a Linux-like environment for Windows. It consists of two parts:
A DLL (cygwin1.dll) which acts as a Linux API emulation layer providing substantial Linux API functionality.
A collection of tools which provide Linux look and feel.
Since configure is using the Cygwin environment, it is interacting with the emulation layer, so it behaves just as if it were working in a Unix environment.
Have you tried building the application and seeing if it works?
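To see the emulation layer at work (an illustrative check; the exact version string will differ on your machine), configure scripts generally decide the platform from the output of uname, and under Cygwin that reports a Unix-like system:
uname -s       # prints something like CYGWIN_NT-6.1
uname -o       # prints Cygwin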

GNU make --jobs option in QMAKE

I am using qmake to generate MinGW32 Makefiles for a small Qt C++ app we are developing. My problem: all those dual/quad core CPUs are sitting there idly while only one thread is doing the building. In order to parallelize things I tried passing --jobs 4 to make, but the problem is that qmake generates a generic makefile inside of which make gets called again with -f .
Is it possible to force qmake to add options to make when generating the makefile? Or maybe there's another way of setting the option outside of qmake altogether? I can't edit that specific Makefile since it's autogenerated each build.
Abusing $MAKE to pass options does not work in all cases. Oftentimes (e.g. in the configure script of Qt on Unix) it's enclosed in double quotes ("$MAKE") to allow the command to contain spaces. I know because I used the same trick before it stopped working. Qt Support then suggested (rightly) to use $MAKEFLAGS, as in:
set MAKEFLAGS=-j4
make
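The set line above is Windows command prompt syntax; from a Unix shell the equivalent would be an exported environment variable:
export MAKEFLAGS=-j4   # read by make and passed down to any sub-make it spawns
make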
This works for me:
set MAKE_COMMAND=mingw32-make -j%NUMBER_OF_PROCESSORS%
The generated Makefile uses $(MAKE) when invoking make, so you can override it. Something like this should do it:
qmake
make MAKE="mingw32-make -j4"
Replace the values of MAKE as required of course :)

tool for building software

I need something like make, i.e. dependencies + executing shell commands, where a failing command stops execution.
But it should be more deeply integrated with the shell: in make, each line is executed in a separate context, so it is not easy to set a variable in one line and use it in the following line (and I do not want an escape character at the end of every line, because that is not readable).
I want simple syntax (no XML) with control flow and functions (which is what make is missing).
It does not have to support compilation. I just have to bind together several components built using autotools, package them, trigger tests and publish the results.
I looked at: make, ant, maven, scons, waf, nant, rake, cons, cmake, jam and they do not fit my needs.
Take a look at doit.
You can use shell commands or Python functions to define tasks (builds).
It is very easy to use. You write scripts in Python, and there is "no API" (you don't need to import anything in your script).
It has good support for tracking dependencies and targets.
Have a look at fabricate.
If that does not fulfill your needs or if you would rather not write your build script in Python, you could also use a combination of shell scripting and fabricate. Write the script as you would to build your project manually, but prepend build calls with "fabricate.py" so build dependencies are managed automatically.
Simple example:
#!/bin/bash
EXE="myapp"
CC="fabricate.py gcc"   # let fabricate handle dependencies
FILES="file1.c file2.c file3.c"
OBJS=""
# compile
for F in $FILES; do
    $CC -c "$F"
    if [ $? -ne 0 ]; then
        echo "Build failed while compiling $F" >&2
        exit 1
    fi
    OBJS="$OBJS ${F/.c/.o}"
done
# link
$CC -o "$EXE" $OBJS
Given that you want control flow, functions, everything operating in the same environment and no XML, it sounds like you want to use the available shell script languages (sh/bash/ksh/zsh), or Perl (insert your own favourite scripting language here!).
I note you've not looked at a-a-p. I'm not familiar with this, other than it's a make system from the people who brought us vim. So you may want to look over that.
A mix of makefiles and a scripting language that chooses which makefile to run could do it.
I have had the same needs. My current solution is to use makefiles to accurately represent the dependency graph (you should read "Recursive Make Considered Harmful"). Those makefiles trigger bash scripts that take makefile variables as parameters. This way you do not have to deal with the problem of shell context, and you get a clear separation between the dependencies and the actions (a minimal sketch of this split follows below).
I'm currently considering waf as it seems well designed and fast enough.
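Here is the sketch of that split, with hypothetical file names: the makefile only expresses the dependencies, and the action is delegated to a bash script that receives makefile variables as arguments.
# Makefile (recipe lines must start with a tab)
CFLAGS = -O2

app: main.c util.c
	bash compile.sh "$@" "$(CFLAGS)" $^

# compile.sh
#!/bin/bash
set -e                        # any failing command aborts the build
out=$1; cflags=$2; shift 2    # positional parameters passed in from make
gcc $cflags -o "$out" "$@"    # all remaining arguments are the source files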
You might want to look at SCons; it's a Make-replacement written in Python.
