Recursively execute latexmk -c on folders - recursion

I'd like to execute the command latexmk -c on all of the directories inside a directory, e.g.,
.
./test-dir
The effect is to remove all of the auxiliary files created during LaTeX compilation.
I've tried using the find command to tell me about the directories and then execute a command, like so:
find -type d -exec latexmk -c \;
But unfortunately, that command only has the effect of removing the auxiliary files in the directory in which I call it, not in the subdirectory (test-dir in this example).

I had a very similar problem: I wanted to convert .tex files recursively. The final solution for me was:
find . -name '*.tex' -execdir latexmk -pdf {} \;
The secret here is to use -execdir instead of -exec: -execdir runs the command from the directory in which each file was found. So the solution to your problem is most likely:
find -type d -execdir latexmk -c \;
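To see the difference, here is a small throwaway sketch (pwd stands in for latexmk -c, and the directory layout is made up for illustration):

```shell
# Demo of -exec vs -execdir; pwd stands in for latexmk -c.
tmp=$(mktemp -d)
mkdir -p "$tmp/test-dir"
touch "$tmp/main.tex" "$tmp/test-dir/chapter.tex"
cd "$tmp"

echo "-exec always runs from the starting directory:"
find . -name '*.tex' -exec pwd \;

echo "-execdir runs from the directory containing each match:"
find . -name '*.tex' -execdir pwd \;

cd / && rm -rf "$tmp"
```

Note that GNU find refuses to run -execdir when $PATH contains a relative entry such as ., as a safety measure.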

Related

Modify permissions for scripts recursively

Trying to set permissions recursively for multiple scripts:
chmod -R +x /export/home/*.sh
But some files do not have .sh extension, e.g. .ksh and some scripts have no extension at all.
How can I make all scripts/files executable? (any type, extension or not)
This command should do the trick:
find /export/home -type f -exec chmod +x {} \;
However, I don't think you really want to make all files executable - there is significant security risk in this. You'd be better off just determining what files should be executable, or putting them in a "bin/" subdirectory, which you could then search for:
find /export/home -type d -name bin -exec chmod -R +x {} \;
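If the scripts are scattered and unlabeled, another middle ground is to mark executable only the files that begin with a shebang. This is a sketch, keeping the /export/home path from the question:

```shell
# Make executable only files whose first two bytes are "#!".
find /export/home -type f -exec sh -c '
  head -c 2 "$1" | grep -q "^#!" && chmod +x "$1"
' sh {} \;
```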

bash find with two commands in an exec ~ How to find a specific Java class within a set of JARs

My use case is I want to search a collection of JARs for a specific class file. More specifically, I want to search recursively within a directory for all *.jar files, then list their contents, looking for a specific class file.
So this is what I have so far:
find . -name *.jar -type f -exec echo {} \; -exec jar tf {} \;
This will list the contents of all JAR files found recursively. I want to put a grep within the second exec because I want the second exec to only print the contents of the JAR that grep matches.
If I just put a pipe and pipe it all to grep afterward, like:
find . -name *.jar -type f -exec echo {} \; -exec jar tf {} \; | grep $CLASSNAME
Then I lose the output of the first exec, which tells me where the class file is (the name of JAR file is likely to not match the class file name).
So if there was a way for the exec to run two commands, like:
-exec "jar tf {} | grep $CLASSNAME" \;
Then this would work. Using a grep $(...) in the exec command wouldn't work because I need the {} from the find to take the place of the file that was found.
Is this possible?
(Also I am open to other ways of doing this, but the command line is preferred.)
I find it difficult to execute multiple commands within find -exec, so I usually just grab the results with find and loop over them.
Maybe something like this might help?
find . -type f -name '*.jar' | while read -r jarfile; do echo "$jarfile"; jar tf "$jarfile"; done
I figured it out - still using "one" command. What I was looking for was actually answered in the question How to use pipe within -exec in find. What I have to do is use a shell command with my exec. This ends up making the command look like:
find . -name '*.jar' -type f -exec echo {} \; -exec sh -c "jar tf {} | grep --color $CLASSNAME" \;
The --color will help the final result to stick out while the command is recursively listing all JAR files.
A couple points:
This assumes I have $CLASSNAME set. The class name has to appear as it does inside a JAR, not as a dotted Java package name, so com.ibm.json.java.JSONObject becomes com/ibm/json/java/JSONObject.class.
This requires a JDK - that is where the jar command comes from. The JDK must be accessible on the system path. If your JDK is not on the system path, you can set an environment variable such as JAR to point to the jar executable. I am running this from Cygwin, and my jar installation turns out to live under the "Program Files" directory. The space in that path breaks things, so I have to use these two commands:
export JAR=/cygdrive/c/Program\ Files/Java/jdk1.8.0_65/bin/jar
find . -name '*.jar' -type f -exec echo {} \; -exec sh -c "\"$JAR\" tf {} | grep --color $CLASSNAME" \;
The $JAR in the shell command must be escaped otherwise the terminal will not know what to do with the space in "Program Files".
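One more caveat: splicing {} directly into the sh -c string means a file name containing quotes or $( ) can break, or even execute as, the command. It is safer to pass the name as a positional argument. A sketch of that pattern, shown with tar in place of jar so it runs without a JDK:

```shell
# Same pipeline, but the found file arrives as "$1" instead of being
# pasted into the command text; tar tf stands in for jar tf here.
find . -name '*.tar' -type f -exec sh -c '
  echo "$1"
  tar tf "$1" | grep --color "$2"
' sh {} "$CLASSNAME" \;
```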

Formatting Find output before it's used in next command

I am batch uploading files to an FTP server with find and curl using this command:
find /path/to/target/folder -not -path '*/\.*' -type f -exec curl -u username:password --ftp-create-dirs -T {} ftp://ftp.myserver.com/public/{} \;
The problem is find is outputting paths like
/full/path/from/root/to/file.txt
so on the FTP server I get the file uploaded to
ftp.myserver.com/public/full/path/from/root/to/file.txt
instead of
ftp.myserver.com/public/relative/path/to/file.txt
The goal was to have all files and folders that are in the target folder get uploaded to the public folder, but this problem is destroying the file structure. Is there a way to edit the find output to trim the path before it gets fed into curl?
Not sure exactly what you want to end up with in your path, but this should give you an idea. The trick is to exec sh to allow you to modify the path and run a command.
find . -type f -exec sh -c 'joe=$(basename "$1"); echo "$joe"' sh {} \;
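Building on that, here is a sketch of the upload loop that strips the local prefix with shell parameter expansion; the paths and URL are the placeholders from the question, and echo stands in for the real curl call so it can be dry-run first:

```shell
# Strip the local prefix so the remote path is relative to the folder
# being uploaded; echo stands in for curl.
base=/path/to/target/folder
find "$base" -not -path '*/.*' -type f -exec sh -c '
  rel=${1#"$2"/}    # /path/to/target/folder/sub/f.txt -> sub/f.txt
  echo curl -T "$1" "ftp://ftp.myserver.com/public/$rel"
' sh {} "$base" \;
```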

unix command to copy multiple files from different directories to one new directory

I am a newbie to Unix commands.
Is there a way to copy multiple files from multiple directories into one new directory?
example:
in /tmp/dirA --> it contains file A.run.log and A.skip.log
in /tmp/dirB --> it contains file B.run.log and B.skip.log
in /tmp/dirC --> it contains file C.run.log and C.skip.log
and I would like to have all of
A.run.log
A.skip.log
B.run.log
B.skip.log
C.run.log
C.skip.log
into a new folder called /tmp/dirNew
Is there a Unix command that is able to do this? I'd really appreciate it. Thank you.
JS
Use this:
cp /tmp/dir?/* /tmp/dirNew
Or, if you want to search all of /tmp recursively while skipping the destination itself:
find /tmp -path "/tmp/dirNew" -prune -o -name '*.run.log' -exec cp {} /tmp/dirNew/ \;
find /tmp -path "/tmp/dirNew" -prune -o -name '*.skip.log' -exec cp {} /tmp/dirNew/ \;
(Either way, create the destination first with mkdir -p /tmp/dirNew.)
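For completeness, a self-contained sketch that recreates the sample layout from the question first, then collects the logs (cp needs the destination directory to already exist):

```shell
# Recreate the layout from the question, then gather the logs.
for d in A B C; do
  mkdir -p "/tmp/dir$d"
  touch "/tmp/dir$d/$d.run.log" "/tmp/dir$d/$d.skip.log"
done
mkdir -p /tmp/dirNew
cp /tmp/dir?/*.log /tmp/dirNew/
ls /tmp/dirNew
```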

Unix - how to source multiple shell scripts in a directory?

When I want to execute some shell script in Unix (and let's say that I am in the directory where the script is), I just type:
./someShellScript.sh
and when I want to "source" it (e.g. run it in the current shell, NOT in a new shell), I just type the same command just with the "." (or with the "source" command equivalent) before it:
. ./someShellScript.sh
And now the tricky part. When I want to execute "multiple" shell scripts (let's say all the files with .sh suffix) in the current directory, I type:
find . -type f -name *.sh -exec {} \;
but "what command should I use to "SOURCE" multiple shell scripts in a directory"?
I tried this so far but it DIDN'T work:
find . -type f -name *.sh -exec . {} \;
and it only threw this error:
find: `.': Permission denied
Thanks.
for file in *.sh
do . "$file"
done
Try the following version of Jonathan's code (bash, splitting find's output on newlines so file names with spaces survive):
IFSbak=$IFS
IFS=$'\n'
for file in $(find . -type f -name '*.sh')
do source "$file"
done
IFS=$IFSbak
The problem lies in the way shells work: . (like source) is a shell builtin, not an external command. When you run find, the shell forks and the child process becomes find, so find has no . builtin to hand to -exec, and even if it did, anything the scripts set would land in that child process (or in the further children find forks for each -exec), never in your current shell.
Also, note that your command (and Jonathan's) will not work if there are spaces in the file names.
You can use find and xargs to run them, though note that this executes each script in a fresh shell rather than sourcing it into the current one:
find . -type f -name '*.sh' | xargs -I{} sh {}
