Name for GNU Make $(var:=suffix) syntax

Evidently, GNU Make supports the syntax $(var:=suffix), which as far as I can tell does the same thing as $(addsuffix suffix,$(var)), except that in the := version the suffix can contain a comma without the use of a variable.
What is this form of expansion called?
It appears to operate on whitespace-delimited words, producing a new string without modifying the original variable.
This file
# Makefile
words=cat dog mouse triangle
$(info $(words:=.ext))
$(info $(addsuffix .ext,$(words)))
all:
#true
produces the following when run:
$ make
cat.ext dog.ext mouse.ext triangle.ext
cat.ext dog.ext mouse.ext triangle.ext
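For what it's worth, the GNU Make manual documents this form under "Substitution References". To illustrate the comma point, here is a minimal sketch (not from the original file): the substitution reference takes the comma literally, while addsuffix needs it smuggled in through a helper variable.
words = cat dog mouse triangle
comma := ,
$(info $(words:=,ext))
$(info $(addsuffix $(comma)ext,$(words)))
Both lines print cat,ext dog,ext mouse,ext triangle,ext.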

Related

Set the default target(s) of a Makefile (GNU Make) via Environment Variable

I'd like to set the default target(s) of a Makefile to the space-delimited value of an environment variable.
For this solution to work correctly, it must be possible for me to set an environment variable and then run make, and have the target (or, optionally, space-delimited targets) contained in the environment variable be used as though they were passed as targets.
DEFAULT_TARGETS="target target2" make
# ... should produce the same result as ...
make target1 target2
Thanks for the help!
You can do it by adding something like this to your makefile:
ifneq ($(DEFAULT_TARGETS),)
__default_targets: $(DEFAULT_TARGETS)
.DEFAULT_GOAL = __default_targets
endif
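With that in place, the desired invocation from the question works as-is:
$ DEFAULT_TARGETS="target1 target2" make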
Following @Andreas's lead, I've landed on this ...
.PHONY: DEFAULT_TARGET
DEFAULT_TARGETS ?= intro
DEFAULT_TARGET: $(DEFAULT_TARGETS)
This uses the "first target as default" paradigm, then expands the contents of the DEFAULT_TARGETS env-var to add prerequisites. It also has the advantage that if the DEFAULT_TARGETS environment variable is not set, the makefile uses the fallback value, in this case intro.
In the shell this would look like: DEFAULT_TARGETS="target1 target2" make
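For example (target names hypothetical), a session might look like:
$ DEFAULT_TARGETS="clean build" make   # runs the clean and build targets
$ make                                 # DEFAULT_TARGETS unset: falls back to intro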

Access file name (extension) with READNULLCMD

A nice shortcut in Zsh for catting files (here a Python file, for example) is:
<somefile.py
But it's much nicer if that file is syntax-highlighted. So the trick is to use a tool like bat instead of the default cat:
READNULLCMD=bat
This actually works when a shebang is present, since bat will look for it. But file type detection by extension is not possible, since the input is simply seen as stdin. And since most files don't have a shebang line, the file name extension is a necessary fallback for detecting the file type.
There is a method for debugging READNULLCMD using a function. I've tried wrapping it in set -x, grepping env, etc., but haven't found a way to see the name. If I could see the name, then something like this could be used:
mynullcmd() { bat -l ${stdin_filename:e} } # get extension and use as file type
READNULLCMD=mynullcmd
Question: Is there some way for Zsh to know what's being passed in as STDIN? Can it know that the command contained somefile.py?
You may have to settle for an alias, like c as a short cat equivalent:
alias c=bat
c somefile.py
Other viable highlighters include coderay and pygmentize, but I've found bat to be the most capable in terms of speed and breadth of language support.
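Putting the pieces together, a plausible ~/.zshrc setup (a sketch of the settled-for arrangement, not a way around the stdin limitation):
READNULLCMD=bat   # <somefile.py highlights only when a shebang is present
alias c=bat       # c somefile.py gets extension-based detection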

How to pass bash variable into R script

I have a couple of R scripts that process data in a particular input folder. I have a few folders I need to run these scripts on, so I started writing a bash script to loop through the folders and run the R scripts on each.
I'm not familiar with R at all (the script was written by a previous worker and is basically a black box for me), and I'm inexperienced with passing variables between scripts, especially across languages. There's also an issue when I call source("$SWS_output/Step_1_Setup.R") here: R isn't reading $SWS_output as a variable, but rather as a literal string.
Here's my bash script:
#!/bin/bash
# Inputs
workspace="`pwd`"
preprocessed="$workspace/6_preprocessed"
# Output
SWS_output="$workspace/7_SKSattempt4_results/"
# create output directory
mkdir -p $SWS_output
# Copy data from preprocessed to SWS_output
cp -a $preprocessed/* $SWS_output
# Loop through folders in the output and run the R code on each folder
for qdir in $SWS_output/*/; do
    qdir_name=`basename $qdir`
    echo -e 'source("$SWS_output/Step_1_Setup.R") \n source("$SWS_output/Step_2_data.R") \n q()' | R --no-save
done
I need to pass the variable "qdir" into the second R script (Step_2_data.R) to tell it which folder to process.
Thanks!
My previous answer was incomplete. Here is a better effort to explain command line parsing.
It is pretty easy to use R's commandArgs function to process command line arguments. I wrote a small tutorial: https://gitlab.crmda.ku.edu/crmda/hpcexample/tree/master/Ex51-R-ManySerialJobs. In cluster computing this works very well for us. The whole hpcexample repo is open source/free.
The basic idea is that you can run R from the command line with arguments, as in:
R --vanilla -f r-clargs-3.R --args runI=13 parmsC="params.csv" xN=33.45
In this case, my R program is the file r-clargs-3.R and the arguments that the file will import are three space-separated elements: runI, parmsC, xN. You can add as many of these space-separated parameters as you like. It is completely at your discretion what they are called, but it is required that they are separated by spaces and that there is NO SPACE around the equal signs. Character string values should be quoted.
My habit is to name the arguments with the suffix "I" to hint that the value is an integer, "C" for character, and "N" for floating point numbers.
In the file r-clargs-3.R, include some code to read the arguments and sort through them. For example, from my tutorial:
cli <- commandArgs(trailingOnly = TRUE)
args <- strsplit(cli, "=", fixed = TRUE)
The rest of the work is sorting through the args. This is my most evolved stanza for doing so, because it looks for the suffixes "I", "N", "C", and "L" (for logical) and then coerces the inputs to the correct variable types (all inputs arrive as character strings unless we coerce them with as.integer(), etc.):
for (e in args) {
    argname <- e[1]
    if (!is.na(e[2])) {
        argval <- e[2]
        ## regular expression to delete initial \" and trailing \"
        argval <- gsub("(^\\\"|\\\"$)", "", argval)
    } else {
        ## if arg specified without value, assume it is bool type and TRUE
        argval <- TRUE
    }
    ## infer type from last character of argname, cast val
    type <- substring(argname, nchar(argname), nchar(argname))
    if (type == "I") {
        argval <- as.integer(argval)
    }
    if (type == "N") {
        argval <- as.numeric(argval)
    }
    if (type == "L") {
        argval <- as.logical(argval)
    }
    assign(argname, argval)
    cat("Assigned", argname, "=", argval, "\n")
}
That will create variables in the R session named paramsC, runI, and xN.
The convenience of this approach is that the same base R code can be run with 100s or 1000s of command parameter variations. Good for Monte Carlo simulation, etc.
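For instance, feeding the command line shown earlier through this stanza should print, amid R's usual non-interactive echo, lines like:
$ R --vanilla -f r-clargs-3.R --args runI=13 parmsC="params.csv" xN=33.45
Assigned runI = 13
Assigned parmsC = params.csv
Assigned xN = 33.45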
Thanks for all the answers; they were very helpful. I was able to get a solution that works. Here's my completed script.
#!/bin/bash
# Inputs
workspace="`pwd`"
preprocessed="$workspace/6_preprocessed"
# Output
SWS_output="$workspace/7_SKSattempt4_results"
# create output directory
mkdir -p $SWS_output
# Copy data from preprocessed to SWS_output
cp -a $preprocessed/* $SWS_output
cd $SWS_output
# Loop through folders in the output and run the R code on each folder
for qdir in $SWS_output/*/; do
    qdir_name=`basename $qdir`
    echo $qdir_name
    export VARIABLENAME=$qdir
    echo -e 'source("Step_1_Setup.R") \n source("Step_2_Data.R") \n q()' | R --no-save --slave
done
And then the R script looks like this:
qdir <- Sys.getenv("VARIABLENAME")
pathname <- qdir[1]
As a couple of comments have pointed out, this isn't best practice, but this worked exactly as I wanted it to. Thanks!
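As an aside, if you later want to drop the environment variable, the commandArgs approach from the answer above could be adapted to pass the folder explicitly, something like this (a sketch; qdirC is a hypothetical name following that answer's suffix convention, and Step_2_Data.R would parse it as shown there):
R --vanilla -f Step_2_Data.R --args qdirC="$qdir"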

Can one append to a make variable without overwriting what's set in the Makefile?

Let's consider the following Makefile:
.PHONY : all
OPTS += -DBLA
OPTS += -DBLUBB
STUFF = default
all :
	./do_something $(OPTS) $(STUFF)
One can pass variables on the command line. So with the following call
confus@confusion:/tmp/$ make STUFF=foo
make will run ./do_something -DBLA -DBLUBB foo.
Contrary to what I thought, one can't append to variables:
confus@confusion:/tmp/$ make STUFF+=foo OPTS+=-DMOREOPT
will simply run ./do_something -DMOREOPT foo (as if I had left out the plus signs), when I'd expect it to run ./do_something -DBLA -DBLUBB -DMOREOPT default foo.
Is there a way to append to a make variable with a command line option?
If this is GNU make, you have to use the override directive in your makefile to specify that you want the values set in the makefile to take precedence over the command line values:
override OPTS += -DBLA
override OPTS += -DBLUBB
override STUFF += default
If it matters, note that this will put the settings provided on the command line first, and the settings in the makefile last.
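For example, with those three override lines in the makefile, the second invocation above behaves as hoped:
$ make STUFF+=foo OPTS+=-DMOREOPT
./do_something -DMOREOPT -DBLA -DBLUBB foo default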

Printing hard copies of code

I have to hand in a software project that requires either a paper or .pdf copy of all the code included.
One solution I have considered is grouping classes by context and doing cat *.extension > out.txt for each group; catting the resulting text files would then give a single file with the classes grouped by context. This is not an ideal solution; there would be no page breaks.
Another idea I had was a shell script to inject LaTeX page breaks between the files to be joined, which would be more acceptable, although I'm not too adept at scripting or LaTeX.
Are there any tools that will do this for me?
Take a look at enscript (or nenscript), which will convert to PostScript, render in columns, add headers/footers, and perform syntax highlighting. If you want to print code in a presentable fashion, this works very nicely.
e.g. here are my settings (within a zsh function):
# -2 = 2 columns
# -G = fancy header
# -E = syntax filter
# -r = rotated (landscape)
# syntax is picked up from .enscriptrc / .enscript dir
enscript -2GrE $*
For a quick solution, see a2ps, followed by ps2pdf. For a nicer, more complex solution, I would go for a simple script that puts each file in a LaTeX listings environment and combines the results.
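If you take the a2ps route, a minimal sketch (keeping the question's placeholder extension):
a2ps --pretty-print -o out.ps *.extension   # each input file starts on a fresh page
ps2pdf out.ps out.pdf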
