Unix - how to source multiple shell scripts in a directory?

When I want to execute a shell script in Unix (say I am in the directory where the script lives), I just type:
./someShellScript.sh
and when I want to "source" it (i.e. run it in the current shell, NOT in a new shell), I type the same command prefixed with "." (or with the equivalent "source" command):
. ./someShellScript.sh
And now the tricky part. When I want to execute "multiple" shell scripts (say, all the files with the .sh suffix) in the current directory, I type:
find . -type f -name '*.sh' -exec {} \;
but "what command should I use to "SOURCE" multiple shell scripts in a directory"?
I tried this so far but it DIDN'T work:
find . -type f -name '*.sh' -exec . {} \;
and it only threw this error:
find: `.': Permission denied
Thanks.

for file in *.sh
do . "$file"
done

Try the following version of Jonathan's code:
IFSbak=$IFS; IFS=$'\n'
for file in $(find . -type f -name '*.sh')
do source "$file"
done
IFS=$IFSbak
The problem lies in the way shells work, and in the fact that '.' itself is not a command (and neither is source). When you run find, the shell forks and the child becomes the find process, so any environment variables or other shell state changes land in the find process (or, more likely, in the further children find forks for each exec), never in your current shell.
Also, note that your find command (and Jonathan's loop as originally posted, with an unquoted $file) will not work if there are spaces in the file names.
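If the filenames may contain spaces or even newlines, a NUL-delimited read loop is the robust route. A bash-specific sketch; the process substitution keeps the loop, and therefore the sourced variables, in the current shell:
while IFS= read -r -d '' file; do
    source "$file"
done < <(find . -type f -name '*.sh' -print0)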

You can use find and xargs to run them:
find . -type f -name '*.sh' | xargs -I{} sh {}
Note, though, that this executes each script in a fresh sh process; it cannot source them into your current shell, because xargs always runs its command as a child process.

Related

rsync with find in the files-from

I'm trying to transfer files which have been updated in the last 31 days. I'm trying to run:
/bin/rsync --remove-source-files --files-from="<(find /tmp/rsync/source -type f -mtime -31 -print0)" /tmp/rsync/source /tmp/rsync/destination
However, when trying this, I keep receiving the following error:
failed to open files-from file <(find /tmp/rsync/source -type f -mtime -31 -print0): No such file or directory
The directory exists and is accessible.
This is the output of the find:
$ find /tmp/rsync/source -type f -mtime -31
/tmp/rsync/source/testfile2.txt
/tmp/rsync/source/testfile.txt
/tmp/rsync/source/sourcefile.txt
/bin/rsync --remove-source-files --files-from="<(find /tmp/rsync/source -type f -mtime -31 -print0)" /tmp/rsync/source /tmp/rsync/destination
<( ... ) is Bash Shell Process Substitution - https://www.gnu.org/software/bash/manual/html_node/Process-Substitution.html.
So:
You have to be using Bash: as your command-line shell if running interactively, or as the script interpreter if running from a shell script.
If you are not using Bash in either of those roles, or if you are passing that command to something that executes the rsync program directly rather than through a Bash shell, it won't work.
You also can't put that syntax inside double quotes as shown. When quoted like that, Bash treats it as a literal filename (hence the error).
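Putting those fixes together, something like this should work under bash (a sketch, not from the original thread; -print0 is dropped because --files-from expects newline-delimited entries unless rsync is also given -0, and find runs from inside the source directory so the listed paths are relative to it):
rsync --remove-source-files \
    --files-from=<(cd /tmp/rsync/source && find . -type f -mtime -31) \
    /tmp/rsync/source /tmp/rsync/destination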
It's unfortunate that this is coming up first in search results when other questions about this have much better answers.
I've seen people use the --files-from="<(...)" notation in several places, but I don't see any reference to it in the rsync manual. Maybe it's special shell syntax for some people, or a distro-added feature? I get the same error message as above when I try to use that notation.
The official way to do it is to either write your list of files into a real file:
find . -mtime -7 > past_week.txt
rsync --files-from=past_week.txt SRC/ DST/
or to pipe the list of files on stdin:
find . -mtime -7 | rsync --files-from=- SRC/ DST/
The single dash - as a filename here means stdin, a common convention among Unix tools.
If you are worried about files with newlines in their names, you should use NUL delimiters for the list (but beware that this also affects --include-from and --exclude-from):
find . -mtime -7 -print0 | rsync -0 --files-from=- SRC/ DST/
You can also list the files on the command line, as in rsync -av `find . -mtime -7` DST/, but that doesn't preserve their hierarchy in the tree, and with more than a few files it creates a massive command line that may fail to execute if it exceeds the operating system's argument-length limit.
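(That limit varies by OS and can be checked directly; a quick illustration:)
getconf ARG_MAX    # upper bound on the byte length of the argument list to exec()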
I tried this and it works for me. Note that rsync only reads the piped list when it is given --files-from=-; without that flag the find output is simply ignored and rsync copies the whole tree:
cd /tmp/rsync/source && find . -type f -mtime -31 | rsync -rcvh --files-from=- /tmp/rsync/source/ /tmp/rsync/destination/ --dry-run
Remove --dry-run for the actual execution.
Since I was forced to use a pre-existing script which parsed on "{" brackets, and couldn't run commands before the rsync invocation, I was unable to use the solutions mentioned above.
However, I was able to get it working with the following:
/bin/rsync --recursive --remove-source-files `find /tmp/rsync/source -type f -mtime -31` /tmp/rsync/destination

Bash find with two commands in an exec: how to find a specific Java class within a set of JARs

My use case is I want to search a collection of JARs for a specific class file. More specifically, I want to search recursively within a directory for all *.jar files, then list their contents, looking for a specific class file.
So this is what I have so far:
find . -name '*.jar' -type f -exec echo {} \; -exec jar tf {} \;
This will list the contents of all JAR files found recursively. I want to put a grep within the second exec, because I want the second exec to print only the contents of the JAR that grep matches.
If I just put a pipe and pipe it all to grep afterward, like:
find . -name '*.jar' -type f -exec echo {} \; -exec jar tf {} \; | grep $CLASSNAME
Then I lose the output of the first exec, which tells me where the class file is (the name of JAR file is likely to not match the class file name).
So if there was a way for the exec to run two commands, like:
-exec "jar tf {} | grep $CLASSNAME" \;
Then this would work. Using a grep $(...) in the exec command wouldn't work because I need the {} from the find to take the place of the file that was found.
Is this possible?
(Also I am open to other ways of doing this, but the command line is preferred.)
I find it difficult to execute multiple commands within find -exec, so I usually just grab the results with find and loop over them.
Maybe something like this might help?
find . -type f -name '*.jar' | while read -r jarfile; do echo "$jarfile"; jar tf "$jarfile"; done
I figured it out - still using "one" command. What I was looking for was actually answered in the question How to use pipe within -exec in find. What I have to do is use a shell command with my exec. This ends up making the command look like:
find . -name '*.jar' -type f -exec echo {} \; -exec sh -c "jar tf {} | grep --color $CLASSNAME" \;
The --color will help the final result to stick out while the command is recursively listing all JAR files.
A couple points:
This assumes I have $CLASSNAME set. The class name has to appear as it does inside a JAR, not as a Java package name. So com.ibm.json.java.JSONObject becomes com/ibm/json/java/JSONObject.class (see the snippet after this list).
This requires a JDK - that is where the jar command comes from - and the JDK must be accessible on the system path. If your JDK is not on the path, you can set an environment variable, such as JAR, to point at the jar executable. I am running this from Cygwin, so my jar installation turns out to live under the "Program Files" directory. The space in that path breaks the command, so I have to use these two commands instead:
export JAR=/cygdrive/c/Program\ Files/Java/jdk1.8.0_65/bin/jar
find . -name '*.jar' -type f -exec echo {} \; -exec sh -c "\"$JAR\" tf {} | grep --color $CLASSNAME" \;
The $JAR in the shell command must be escaped otherwise the terminal will not know what to do with the space in "Program Files".
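As an illustrative aside (not from the original answer), the dotted-to-slashed conversion can be scripted rather than done by hand:
CLASSNAME=$(printf '%s' com.ibm.json.java.JSONObject | tr . /).class
echo "$CLASSNAME"    # prints com/ibm/json/java/JSONObject.class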

Formatting find output before it's used in the next command

I am batch uploading files to an FTP server with find and curl using this command:
find /path/to/target/folder -not -path '*/\.*' -type f -exec curl -u username:password --ftp-create-dirs -T {} ftp://ftp.myserver.com/public/{} \;
The problem is find is outputting paths like
/full/path/from/root/to/file.txt
so on the FTP server I get the file uploaded to
ftp.myserver.com/public/full/path/from/root/to/file.txt
instead of
ftp.myserver.com/public/relative/path/to/file.txt
The goal was to have all files and folders that are in the target folder get uploaded to the public folder, but this problem is destroying the file structure. Is there a way to edit the find output to trim the path before it gets fed into curl?
Not sure exactly what you want to end up with in your path, but this should give you an idea. The trick is to exec sh, which lets you modify the path before running a command (passing the found name as $0 keeps the quoting safe):
find . -type f -exec sh -c 'joe=$(basename "$0"); echo "$joe"' {} \;
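Applied to the original upload problem, one option (a sketch; the credentials and host are the placeholders from the question) is to run find from inside the target folder so the paths it prints are already relative:
cd /path/to/target/folder &&
find . -not -path '*/\.*' -type f -exec sh -c '
    rel=${0#./}    # strip the leading "./" that find prepends
    curl -u username:password --ftp-create-dirs \
        -T "$rel" "ftp://ftp.myserver.com/public/$rel"
' {} \;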

UNIX: rename files piped from find command

I basically want to add a string to the names of all the files in a directory that are locked. I'm having trouble passing the filenames to a mv command:
find . -flags uchg -exec chflags nouchg "{}" | mv "{}" "{}"_LOCK \;
The above code obviously doesn't work, but I think it explains what I'm trying to do.
I'm facing two problems:
Adding a string to the end of a filename but before the extension (001_LOCK.jpg).
Passing the output of the find command twice. I need to do this because it won't let me change the names of the files while they are locked. So I need to unlock the file and then rename it.
Does anyone have any ideas?
This should be a good start.
I assume you don't really want to pipe chflags into mv, which doesn't make sense, but just want to rename the file if chflags fails. Handling the extension is trickier but certainly doable.
find . -flags uchg -exec sh -c "chflags nouchg \$0 || mv \$0 \$0_LOCK" {} \;
Edit: rename if chflags succeeds:
find . -flags uchg -exec sh -c "chflags nouchg \$0 && mv \$0 \$0_LOCK" {} \;
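For the extension part of the question (turning 001.jpg into 001_LOCK.jpg rather than 001.jpg_LOCK), a hedged sketch using parameter expansion, assuming each file has exactly one extension:
find . -flags uchg -exec sh -c '
    base=${0%.*}     # path without the extension
    ext=${0##*.}     # the extension alone
    chflags nouchg "$0" && mv "$0" "${base}_LOCK.$ext"
' {} \;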

Unix shell file copy flattening folder structure

On the UNIX bash shell (specifically Mac OS X Leopard) what would be the simplest way to copy every file having a specific extension from a folder hierarchy (including subdirectories) to the same destination folder (without subfolders)?
Obviously there is the problem of having duplicates in the source hierarchy. I wouldn't mind if they are overwritten.
Example: I need to copy every .txt file in the following hierarchy
/foo/a.txt
/foo/x.jpg
/foo/bar/a.txt
/foo/bar/c.jpg
/foo/bar/b.txt
To a folder named 'dest' and get:
/dest/a.txt
/dest/b.txt
In bash:
find /foo -iname '*.txt' -exec cp \{\} /dest/ \;
find will find all the files under the path /foo matching the wildcard *.txt, case-insensitively (that's what -iname means). For each file, find executes cp {} /dest/, with the found file in place of {}.
The only problem with Magnus' solution is that it forks off a new cp process for every file, which is not terribly efficient, especially if there is a large number of files.
On Linux (or other systems with GNU coreutils) you can do:
find /foo -iname '*.txt' -print0 | xargs -0 cp -t /dest
(The -0 allows it to work when your filenames have weird characters -- like spaces -- in them.)
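As an aside (not from the original answers), GNU find can batch arguments itself with -exec ... +, avoiding xargs entirely; the {} must come last, which is exactly what cp's -t switch makes possible:
find /foo -iname '*.txt' -exec cp -t /dest {} +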
Unfortunately I think Macs come with BSD-style tools. Anyone know a "standard" equivalent to the "-t" switch?
The answers above don't allow for name collisions, since the asker didn't mind files being overwritten.
I do mind files being overwritten, so I came up with a different approach. Replacing each / in the path with - keeps the hierarchy in the names and puts all the files in one flat folder.
We use find to get the list of all files, then awk to build a mv command from the original filename and the modified filename, and pass those commands to bash to be executed:
find ./from -type f | awk '{ str=$0; sub(/\.\//, "", str); gsub(/\//, "-", str); print "mv " $0 " ./to/" str }' | bash
where ./from and ./to are directories to mv from and to.
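A hedged variant of the same idea that survives spaces in file names, building the flattened name in the shell instead of in generated text (bash syntax):
find ./from -type f -print0 | while IFS= read -r -d '' f; do
    flat=${f#./}          # drop the leading ./
    flat=${flat//\//-}    # replace every remaining / with -
    mv "$f" "./to/$flat"
done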
If you really want to run just one command, why not cons one up and run it? Like so:
$ find /foo -name '*.txt' | xargs echo | sed -e 's/^/cp /' -e 's|$| /dest|' | bash -sx
But that won't matter much performance-wise unless you do this a lot or have a ton of files. Be careful of name collisions, however. I noticed in testing that GNU cp at least warns of collisions:
cp: will not overwrite just-created `/dest/tubguide.tex' with `./texmf/tex/plain/tugboat/tubguide.tex'
I think the cleanest is:
$ find /foo -name '*.txt' | xargs -I{} cp {} /dest
Less syntax to remember than the -exec option.
As far as the man page for cp on a FreeBSD box goes, there's no need for a -t switch: cp assumes the last argument on the command line is the target directory whenever more than two names are passed.
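That covers the -exec route, but xargs appends the names at the end of the line, where cp expects the target. On BSD/macOS, the -J flag of BSD xargs can stand in for GNU cp's -t (an aside, not from the original thread):
find /foo -iname '*.txt' -print0 | xargs -0 -J % cp % /dest/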
