I'm trying to remove all the .svn directories from a working directory.
I thought I would just use find and rm like this:
find . -iname .svn -exec 'rm -rf {}' \;
But the result is:
find: rm -rf ./src/.svn: No such file or directory
Obviously the file exists, or find wouldn't find it... What am I missing?
You shouldn't put the rm -rf {} in single quotes.
Because you've quoted it, the whole string is passed to -exec as a single argument, so find treats "rm -rf {}" as the name of a command rather than a command plus arguments; after substituting {}, it looks for an executable literally named "rm -rf ./src/.svn" and doesn't find one.
Try:
find . -iname .svn -exec rm -rf {} \;
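Note that even the corrected command may print "No such file or directory" warnings, because find tries to descend into each .svn directory right after rm has deleted it. Adding -prune stops find from descending into the matched directories:
find . -iname .svn -prune -exec rm -rf {} \;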
Just by-the-bye, you should probably get out of the habit of using -exec for things that can be done to multiple files at once. For instance, I would write that out of habit as
find . -iname .svn -print | xargs rm -rf
or, since I'm now using a Macintosh and more likely to encounter file or directory names with spaces in them
find . -iname .svn -print0 | xargs -0 rm -rf
"xargs" makes sure that "rm" is invoked only once every "N" files, where "N" is usually 20. That's not a huge win in this case, because rm is small, but if the program you wanted to execute on every file was large or did a lot of processing on start up, it could make things a lot faster.
Maybe it's just me, but the old find & rm invocation does not work on my current config, a la:
find /data/bin/test -type d -mtime +10 -name "[0-9]*" -exec rm -rf {} \;
whereas the xargs solution does, a la:
find /data/bin/test -type d -mtime +10 -name '[0-9]*' -print | xargs rm -rf ;
No idea why, but I've updated my scriptLib so I don't spend another couple of hours beating my head on something so simple...
(running RHEL under kernel-2.6.18-194.11.3.el5)
EDIT: found out why - my RHEL distro defaults vi to inserting the dreaded CR at line breaks (which breaks the command) - following suggestions from nx5000 & jliagre at linuxquestions.org, I added the following to ~/.vimrc:
:set fileformat=unix
map <F4> :set fileformat=unix<CR>
map <F5> :set fileformat=dos<CR>
which lets the behavior pivot on F4/F5.
To check whether CRs are embedded in your file:
head -1 scriptFile.sh | od -c | head -1
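With DOS line endings you will see a \r just before the \n; a hypothetical script whose first line is #!/bin/sh would dump as:
0000000   #   !   /   b   i   n   /   s   h  \r  \n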
http://www.linuxquestions.org/questions/linux-general-1/bad-interpreter-no-such-file-or-directory-213617/
You can also use the svn command as follows:
svn export <url-to-repo> <dest-path>
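For example (the repository URL is hypothetical; svn export also accepts an existing working-copy path, and in both cases produces a clean tree with no .svn directories):
svn export http://svn.example.com/repo/trunk clean-copy
svn export /path/to/working-copy clean-copy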
See the svn export documentation for more info.
Try
find . -iname .svn -exec rm -rf {} \;
and that probably ought to work IIRC.
You can pass anything you want in quotes, with the following trick.
find . -iname .svn -exec bash -c 'rm -rf {}' \;
The -exec option is happy because it sees only an executable (bash) being called with arguments, yet the argument itself can contain a script of basically any size and shape.
find . -iname .svn -exec bash -c '
ls -l "{}" | wc -l
' \;
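One caution: splicing {} into the script means a filename containing quotes or $( ) could be executed as shell code. A safer variant passes the name as a positional argument instead:
find . -iname .svn -exec bash -c 'ls -l "$0" | wc -l' {} \;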
Related
I have a list of certain files that I see using the command below, but how can I copy those files listed into another folder, say ~/test?
find . -mtime 1 -exec du -hc {} +
Adding to Eric Jablow's answer, here is a possible solution (it worked for me on Linux Mint 14 "Nadia"):
find /path/to/search/ -type f -name "glob-to-find-files" | xargs cp -t /target/path/
You can refer to "How can I use xargs to copy files that have spaces and quotes in their names?" as well.
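If the names may contain spaces or quotes, a null-delimited pipeline (GNU find, xargs, and cp assumed) sidesteps the problem:
find /path/to/search/ -type f -name "glob-to-find-files" -print0 | xargs -0 cp -t /target/path/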
Actually, you can process the find command output in a copy command in two ways:
If the find command's output doesn't contain any spaces, i.e. if no filename has a space in it, then you can use:
Syntax:
find <Path> <Conditions> | xargs cp -t <copy file path>
Example:
find -mtime -1 -type f | xargs cp -t inner/
But our production data files might contain spaces, so most of the time this command is the safer choice:
Syntax:
find <path> <condition> -exec cp '{}' <copy path> \;
Example:
find -mtime -1 -type f -exec cp '{}' inner/ \;
In the second example, the trailing semicolon is itself part of the find command (it terminates -exec), so it must be escaped as \; to keep the shell from interpreting it. Otherwise you will get an error something like:
find: missing argument to `-exec'
find /PATH/TO/YOUR/FILES -name NAME.EXT -exec cp -rfp {} /DST_DIR \;
If you're using GNU find,
find . -mtime 1 -exec cp -t ~/test/ {} +
This works as well as piping the output into xargs while avoiding the pitfalls of doing so (it handles embedded spaces and newlines without having to use find ... -print0 | xargs -0 ...).
This is the best way for me:
while IFS= read -r FILENAME    # -r and IFS= keep backslashes and whitespace intact
do
    # -maxdepth is an option, so it goes before tests like -name
    sudo find /PATH_FROM/ -maxdepth 4 -name "$FILENAME" -exec cp '{}' /PATH_TO/ \;
done < filename.tsv
I basically want to add a string to all the files in a directory that are locked. I'm having trouble passing the filenames to a mv command:
find . -flags uchg -exec chflags nouchg "{}" | mv "{}" "{}"_LOCK \;
The above code obviously doesn't work, but I think it explains what I'm trying to do.
I'm facing two problems:
Adding a string to the end of a filename but before the extension (001_LOCK.jpg).
Passing the output of the find command twice. I need to do this because it won't let me change the names of the files while they are locked. So I need to unlock the file and then rename it.
Does anyone have any ideas?
This should be a good start.
I assume you do not mean to pipe chflags into mv, which doesn't make sense, but rather to rename the file if chflags fails. Processing the extension is trickier but certainly doable.
find . -flags uchg -exec sh -c "chflags nouchg \$0 || mv \$0 \$0_LOCK" {} \;
Edit: rename if chflags succeeds:
find . -flags uchg -exec sh -c "chflags nouchg \$0 && mv \$0 \$0_LOCK" {} \;
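For the extension part of the question, here is a sketch using shell parameter expansion, assuming every matched file has a single extension (a name with no dot would be mangled):
find . -flags uchg -exec sh -c 'chflags nouchg "$0" && mv "$0" "${0%.*}_LOCK.${0##*.}"' {} \;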
I have hundreds of directories and files in one directory.
What is the best way to delete only the directories (no matter whether they have anything in them or not; just delete them all)?
Currently I use ls -1 -d */, record the output in a file, massage it with sed, and then run it. That is a rather long way around, and I'm looking for a better way to delete only directories.
To delete all directories and subdirectories and leave only files in the working directory, I have found this concise command works for me:
rm -r */
It makes use of the shell glob */: a star followed by a slash matches only the directories in the current directory, and -r then removes each one together with its contents.
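Note that */ does not match hidden directories (names starting with a dot). In bash you can include them with the dotglob option; . and .. are still never matched:
shopt -s dotglob
rm -r */
shopt -u dotglob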
find . -maxdepth 1 -mindepth 1 -type d
then
find . -maxdepth 1 -mindepth 1 -type d -exec rm -rf '{}' \;
To add an explanation:
find starts in the current directory due to . and, with -maxdepth and -mindepth both set to 1, considers only the immediate contents of that directory. -type d tells find to match only directories.
find also has an -exec action that runs a command on each result, in this case rm. The '{}' \; part is how each result is handed over: {} is replaced by the matched path, and the escaped semicolon terminates the command.
First, run:
find /path -d -type d
to make sure the output looks sane, then:
find /path -d -type d -exec rm -rf '{}' \;
-type d looks only for directories, and -d makes find process child directories before their parents (depth-first), so find never tries to descend into a directory that has already been removed.
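On GNU find, -d is deprecated in favor of the POSIX spelling -depth, so the portable form is:
find /path -depth -type d -exec rm -rf '{}' \;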
Simple way:
rm -rf `ls -d */`
(Note that this breaks on directory names containing spaces, since the backquoted ls output is word-split.)
Using the find command only (it supports deletion itself):
find /path -depth -type d -delete
-type d looks only for directories, -depth makes find process child directories before their parents, and -delete removes each matched directory (in GNU find, -delete implies -depth anyway).
In one line:
rm -R `ls -1 -d */`
(backquotes)
I am using AIX.
When I try to copy all the files in a folder to another folder with the following command:
cp ./00012524/*.PDF ./dummy01
The shell complains:
ksh: /usr/bin/cp: 0403-027 The parameter list is too long.
How do I deal with this? My folder contains 8xxxx files; how can I copy them quickly? Each file is 4x KB to 1xx KB in size.
Use the find command on *nix:
find ./00012524 -type f -name "*.PDF" -exec cp {} ./dummy01/ \; -print
The limit is not in cp itself but in the size of the argument list the shell may pass to a command, which caps how many files you can copy in one invocation.
One workaround is to run the cp command multiple times with narrower patterns, like:
cp ./00012524/A*.PDF ./dummy01
cp ./00012524/B*.PDF ./dummy01
cp ./00012524/C*.PDF ./dummy01
...
cp ./00012524/*.PDF ./dummy01
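The underlying limit is the kernel's cap on the total size of a command's argument list; you can inspect it with:
getconf ARG_MAX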
You can also copy through the find command:
find ./00012524 -name "*.PDF" -exec cp {} ./dummy01/ \;
$ ( cd 00012524; ls | grep '\.PDF$' | xargs -I{} cp {} ../dummy01/ )
The -t flag to cp is useful here:
find ./00012524 -name \*.PDF -print | xargs cp -t ./dummy01
The best command for copying a large number of files from one directory to another:
find /path/to/source/ -name "*" -exec cp -ruf "{}" /path/to/destination/ \;
This helped me a lot.
You should be able to use a for loop; the glob is expanded inside the shell itself, so it never runs into the argument-list limit. E.g.
for f in ./00012524/*.pdf
do
    cp "$f" ./dummy01
done
I have no way of testing this, but it should work in theory.
You can do something like this to handle each entry in the directory:
# the -v in -rv makes cp verbose, so you can check the status of each copy
for i in /from_dir/*; do cp -rv "$i" /to_dir/; done
Does anyone have a good shell line for this?
I want to check the age of a directory. If I create multiple directories on a weekly basis, for example, I want to purge/delete them once they are more than 7 days old.
How would I do that?
This will let you do a dry run; remove the echo if you like the output:
find /path/to/toplevel -type d -mtime +7 -exec echo rm -rf {} +
Update
If you have an older version of find that doesn't comply with POSIX 2004 then use this instead:
find /path/to/toplevel -type d -mtime +7 -exec echo rm -rf {} \;
or
find /path/to/toplevel -type d -mtime +7 -print0 | xargs -0 echo rm -rf
The former, terminated by \;, calls rm once for each directory it finds; the latter, via xargs, calls rm as few times as possible by passing many directories per invocation, and is thus much faster. The xargs form also behaves identically to the first command terminated with a +.
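To see the batching yourself, compare the two dry runs: the \; form prints one rm command per directory, while the + form prints a single long rm command covering many directories:
find /path/to/toplevel -type d -mtime +7 -exec echo rm -rf {} \;
find /path/to/toplevel -type d -mtime +7 -exec echo rm -rf {} +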