Copy a file using makefile - gnu-make

I created the following makefile to generate a.pdf and then copy it to b.pdf.
all:
	arara a.tex
	rm *.dvi *.aux *.log
	cp a.pdf b.pdf
The first two commands run correctly: a file named a.pdf is generated, and the temporary *.dvi, *.aux, *.log files are removed.
But the cp command does not run, meaning b.pdf is not created. Any idea why? I tried cp -f as well.
(arara is a utility to compile a tex file and generate a pdf file, its details are not important for this question)

The error turned out to be in the rm command and not in the cp command. No file matched one of the patterns (*.aux), so rm exited with an error, and make aborted the recipe before reaching cp. I resolved it by adding -f to the rm command.
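For anyone hitting the same thing, here is a small shell sketch (file names made up) of why the unforced rm aborts the recipe: when a glob matches nothing, the shell passes the literal pattern to rm, rm exits non-zero, and make stops before ever reaching cp.

```shell
# In a scratch directory: only *.log matches; *.dvi and *.aux stay literal,
# so rm fails on the nonexistent operands and returns a non-zero status.
cd "$(mktemp -d)"
touch a.log

rm *.dvi *.aux *.log 2>/dev/null || echo "rm failed: make would stop here"

touch a.log
rm -f *.dvi *.aux *.log && echo "rm -f succeeded: make would reach the cp step"
```

With -f, rm silently ignores missing operands and exits 0, so the recipe continues.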

Related

Error compiling R locally via bash file

I am trying to compile R locally using a bash file, but it fails and shows the following error:
tar: command not found
cd: R-3.2.5: No such file or directory
./configure: No such file or directory
make: *** No targets specified and no makefile found. Stop.
make: *** No rule to make target `install'. Stop.
R-3.2.5/lib64/R/bin/R: No such file or directory
sed: command not found
mv: command not found
tar: command not found
Below is the bash file I am submitting:
#!/bin/bash
tar -xzvf R-3.2.5.tar.gz
cd R-3.2.5
./configure --prefix=$(pwd)
make
make install
cd ..
R-3.2.5/lib64/R/bin/R -e 'install.packages(c("BGLR"),
repos="http://cran.cnr.berkeley.edu/", dependencies = TRUE)'
sed -i '/^R_HOME_DIR=/c R_HOME_DIR=$(pwd)/R' R-3.2.5/lib64/R/bin/R
mv R-3.2.5/lib64/R ./
tar -czvf R.tar.gz R/
When I run the same command lines directly on terminal it works fine, but when I try to run them using a bash file it fails.
Does anyone have an idea how to make it work?
The bash instance used to run the script doesn't seem to have the $PATH variable correctly set, so it can't find tar and the other commands.
Try replacing the first line with #!/bin/bash -l. Add echo Path: $PATH as the second line and check whether one of the directories listed actually contains tar. You should get something like /bin:/sbin:/usr/bin.
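A sketch of what the top of the submitted script could look like with that diagnostic added (the error message wording is made up):

```shell
#!/bin/bash -l
# With -l the shell reads the login profile, which usually sets PATH.
# Print the PATH the batch environment actually provides before doing any work,
# and fail fast if tar is still not reachable.
echo "Path: $PATH"
command -v tar >/dev/null 2>&1 || { echo "tar still not in PATH" >&2; exit 1; }
```

If the echo shows a bare or empty PATH, the scheduler is not running the script through a login shell, and -l (or exporting PATH explicitly) is the fix.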

Makefile to render all targets of all .Rmd files in directory

My aim is to have a universal Makefile which I can copy into each directory where I have an RMD file, which will, upon calling make in this directory, render all targets defined in all .Rmd files in this directory.
The Makefile below only renders the last file instead of all of them. I am sure I am doing something simple wrong.
How do I have to modify the Makefile so that it does what it is supposed to do?
Also: when I run make a second time, all files are generated again, although no SOURCE files changed.
I have the following Makefile:
SOURCES=$(shell find . -name "*.Rmd")
TARGETS_pdf=$(SOURCES:%.Rmd=%.pdf)
TARGETS_html=$(SOURCES:%.Rmd=%.html)
TARGETS_nb_html=$(SOURCES:%.Rmd=%.nb.html)
TARGETS_docx=$(SOURCES:%.Rmd=%.docx)

default: $(SOURCES)
	$(info Generating defined targets from $(SOURCES))
	@echo "$< -> $@"
	@Rscript -e "rmarkdown::render('$<', output_format = 'all')"

clean:
	rm -rf $(TARGETS_pdf)
	rm -rf $(TARGETS_html)
	rm -rf $(TARGETS_nb_html)
	rm -rf $(TARGETS_docx)
Thanks.
When you run make, it executes the first rule it finds; in your case that is default. Make then checks whether a file with that name exists. If it does not, the recipe runs, and the recipe is supposed to create the target file (default). Your recipe never does that, which is why every subsequent make run starts all over again. If the file existed and were up to date, the recipe would not need to run.
What you could do is this:
SOURCES=$(shell find . -name "*.Rmd")
TARGET = $(SOURCES:%.Rmd=%.pdf) $(SOURCES:%.Rmd=%.html) $(SOURCES:%.Rmd=%.nb.html) $(SOURCES:%.Rmd=%.docx)

%.docx %.nb.html %.html %.pdf: %.Rmd
	Rscript -e "rmarkdown::render('$<', output_format = 'all')"

default: $(TARGET)

clean:
	rm -rf $(TARGET)

rule to run a command on newly generated files with gnumake

I am very new to make. I have a perl script that generates three latex files. I want to create a makefile that executes the perl script and then runs lualatex on the newly generated tex files. So far, I have the following:
make:
	perl diff.pl
pdf:
	make
	$(eval LIST := $(shell ls *.tex))
	lualatex $(LIST).tex
	make clean
clean:
	rm -rf *.log *.aux
Output:
lualatex FLAT_FLAT_AVDD.tex FLAT_FLAT_VDD.tex FLAT_FLAT_VSS.tex.tex
And I only get one pdf FLAT_FLAT_AVDD.pdf.
How can I run lualatex on all the files?
I could just declare three variables and then run make, but how can I automate this? Is there a loop concept in make? What is a better way to achieve this without "hard-coding" the file names?
Thanks.
EDIT:
I tried to incorporate foreach.
make:
	perl diff.pl
list:
	$(eval LIST := $(shell ls *.tex))
pdf:
	make list
	$(foreach tex,$(LIST),$(lualatex $(tex)))
	make clean
clean:
	rm -rf *.log *.aux
and then I ran, make pdf
I got the following output in terminal.
dedehog01:tislam1:243 > make pdf
make list
make[1]: Entering directory `/home/tislam1/Documents/THESIS/Script_v0.1/BOX_approach/Modified_Layout_mesh/IR_Report_mesh/flat_flat/make'
make[1]: `list' is up to date.
make[1]: Leaving directory `/home/tislam1/Documents/THESIS/Script_v0.1/BOX_approach/Modified_Layout_mesh/IR_Report_mesh/flat_flat/make'
make clean
make[1]: Entering directory `/home/tislam1/Documents/THESIS/Script_v0.1/BOX_approach/Modified_Layout_mesh/IR_Report_mesh/flat_flat/make'
rm -rf *.log *.aux
make[1]: Leaving directory `/home/tislam1/Documents/THESIS/Script_v0.1/BOX_approach/Modified_Layout_mesh/IR_Report_mesh/flat_flat/make'
Assuming you know the names of the three .tex files in advance, you can unconditionally run the perl, then update the .tex files only if the new ones actually differ from the old ones. Make will handle this fine.
tex := 1.tex 2.tex 3.tex
intermediates := ${tex:%.tex=%-new.tex}# 1-new.tex 2-new.tex 3-new.tex
pdfs := ${intermediates:%.tex=%.pdf}# 1-new.pdf etc.

.PHONY: perl
perl: ; perl diff.pl

${intermediates}: %-new.tex: %.tex | perl
	cmp -s $< $@ || mv $< $@

${pdfs}: %.pdf: %.tex
	lualatex $<

.PHONY: all
all: ${pdfs}
	: $@ Success
Let's say the perl produces 1.tex, 2.tex and 3.tex
We unconditionally run the perl producing new copies of these three files.
Then we update 1-new.tex from 1.tex, but only if the two files differ.
Make notices any files that have changed and runs lualatex as appropriate.
This is parallel safe (a good test of any makefile). Run with -j3 to get 3 copies of lualatex running at once. You do have 4 CPUs don't you?
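The cmp || mv step is the heart of this scheme: the intermediate file is only replaced when its content actually changed, so its timestamp is preserved and the lualatex rule does not re-fire. A standalone sketch with made-up file names:

```shell
# Only replace 1-new.tex when 1.tex differs; identical content leaves the
# old file (and its timestamp) alone, so make sees nothing to rebuild.
cd "$(mktemp -d)"
printf 'same content\n' > 1.tex
printf 'same content\n' > 1-new.tex
touch -t 202001010000 1-new.tex      # give the target a deliberately old date

cmp -s 1.tex 1-new.tex || mv 1.tex 1-new.tex

ls -l 1-new.tex                      # still carries the old timestamp
```

If the two files had differed, the mv would have installed the fresh copy with a new timestamp, and make would rebuild the corresponding pdf.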
If you want to run your perl script on each make invocation, make is not really useful; a shell script could do the same. But if you really want to put all this in a makefile:
.PHONY: all clean
all:
	perl diff.pl && \
	for t in *.tex; do \
	  lualatex $$t; \
	done
clean:
	rm -rf *.log *.aux
Otherwise, you must know the list of LaTeX sources in advance. And you should probably stick with make's target-prerequisite philosophy, that is, express all dependencies and the corresponding recipes:
LATEXSOURCES := foo.tex bar.tex ... cuz.tex
PDFS := $(patsubst %.tex,%.pdf,$(LATEXSOURCES))

all: $(PDFS)

$(LATEXSOURCES): diff.pl
	perl diff.pl

$(PDFS): %.pdf: %.tex
	lualatex $<

clean:
	rm -rf *.log *.aux
But, as bobbogo noticed, with this second option, the perl script will be run as many times as there are LaTeX source files. A pattern rule solves this:
LATEXSOURCES := foo.tex bar.tex ... cuz.tex
PDFS := $(patsubst %.tex,%.pdf,$(LATEXSOURCES))

all: $(PDFS)

$(LATEXSOURCES): %.tex: diff.pl
	@echo "Rebuilding $(LATEXSOURCES)"
	perl diff.pl

$(PDFS): %.pdf: %.tex
	lualatex $<

clean:
	rm -rf *.log *.aux
Now we have a true solution that:
rebuilds the LaTeX source files only if one is missing or if the perl script changed since they were last built,
executes the perl script only once to build all LaTeX source files,
expresses all dependencies between the various files.
There is still a problem, however: if only one LaTeX source file is missing and the others are up-to-date, the perl script will be run, all LaTeX source files will be rebuilt and their last modification time will thus be changed. Only the missing one will be compiled but, on the next make invocation, the others will also be re-compiled, which is a waste. bobbogo's proposal of using intermediate LaTeX sources solves this.

Can't drop a folder in Unix

I was trying to create some folders from a text file (arbo.txt which contains a list of directories) using:
xargs --verbose -d\n mkdir -p < /applis/arbo.txt
I guess the -d\n is not correct, since I got folders other than those listed in the arbo.txt file.
The problem is that now I'm not able to drop these folders, I tried:
rm f
rm -rf f
There are no errors, but the folder is not dropped (I can see it using ls), and when I try:
cd f
I get:
-ksh: cd: f: [No such file or directory]
Edit:
using ls I can see that the folder name is: f?.
How can I drop this folder?
Thanks
Try to use:
> xargs -n1 mkdir -p < dirs.txt
Otherwise, help out with some info on f:
> ls -l f
> file f
If what appears as f contains unprintable characters, using the file name expansion of the shell may save you the trouble to figure them out exactly. Be careful not to delete other important files that match the same name pattern!
rm -rf f*
Your command appended the ? to the folder name f. You can use the command below to delete it:
rm -rf f\?
UPDATE:
rm -rf '<file name >'
Here the file name is all the contents of the arbo.txt file, without any change: because of the wrong delimiter, your command created only one folder whose name is the entire content of arbo.txt, newlines included, and ls displays each newline as a ?.
To get the folder name easily, type the first few characters of the name and press Tab; the shell will complete the full name.
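For the original task, the fix is just quoting: the unquoted -d\n reaches xargs as -dn (the delimiter becomes the letter n), while a quoted '\n' gives GNU xargs a real newline delimiter, so each line of the file becomes exactly one argument. A sketch with made-up directory names:

```shell
cd "$(mktemp -d)"
printf 'alpha\nbeta two\n' > arbo.txt

# Quoted delimiter: each line of arbo.txt becomes exactly one mkdir argument,
# so names containing spaces survive intact.
xargs -d '\n' mkdir -p < arbo.txt

ls -d alpha 'beta two'
```

GNU xargs interprets the quoted \n as a C-style escape for the newline character; -d is a GNU extension, so on other systems xargs -n1 (as suggested above) is the portable fallback.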

Zip command without including the compressed dir itself

Suppose the structure:
/foo/bar/
--file1
--file2
--file3
--folder1
----file4
--folder2
----file5
I want to run the unix zip utility from the foo folder, compressing the bar folder and all of its files and subfolders, but without having the bar folder itself inside the zip, using only the command line.
If I use the -j argument, it doesn't create the bar folder inside the zip, as I want, but it also doesn't create folder1 and folder2. Using -rj doesn't work either.
(I know I can cd into bar and run zip -r bar.zip .; I want the result of running zip -r bar.zip . from inside /foo/bar, but starting from /foo.)
You have to cd to /foo/bar and then run zip -r bar.zip .; however, you can group the commands in parentheses so they run in a subshell:
# instead of: cd /foo/bar; zip -r bar.zip; cd -
( cd /foo/bar; zip -r bar.zip . )
The enclosed (paren-grouped) commands are run in a subshell and cd within it won't affect the outer shell session.
See sh manual.
Compound Commands
A compound command is one of the following:
(list) list is executed in a subshell environment (see COMMAND EXECUTION ENVIRONMENT below).
Variable assignments and builtin commands that affect the shell's environment do not remain in effect after the command completes.
The return status is the exit status of list.
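The subshell behaviour is easy to verify; this sketch uses nothing but cd and pwd:

```shell
before=$(pwd)
( cd /tmp )                      # the cd happens inside the subshell only
[ "$(pwd)" = "$before" ] && echo "outer shell still in $before"
```

This is why ( cd /foo/bar; zip -r bar.zip . ) leaves your working directory untouched, with no need for a trailing cd -.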
zip doesn't have a -C (change directory) option like tar does, but you can run:
cd folder1 && zip -r ../bar.zip *
from within a command line shell
or you can use bsdtar which is a version of tar from libarchive that can create zips
bsdtar cf bar.zip --format zip -C folder1 .
(this creates an entry called ./ in the zip; I'm not sure of a way around that)
I can't speak for the OP's reasoning. I was looking for this solution as well.
I am in the middle of coding a program that creates an .ods by building the internal xml files and zipping them together. They must be in the root dir of the archive, or you get an error when you try and run OOo.
I'm sure there are a dozen other ways to do this:
create a blank .ods file in OOo named blank.ods, extract to dir named blank, then try running:
cd blank && zip -r ../blank.ods *
The way I wrote mine, the shell closes after one command, so I don't need to navigate back to the original directory. If you do, simply add && cd .. to the command line:
cd blank && zip -r ../blank.ods * && cd ..
