I am trying to get cron to move my books into a book folder automatically, but it seems that it doesn't work recursively.
Here's the code:
find /mnt/Storage/Torrents/Completed -R -type f -name "*.epub" -exec mv {} /mnt/Storage/Books\;
Thanks!
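For reference, a corrected sketch of that command is below (untested here): find is already recursive, so there is no -R option, and the \; that terminates -exec has to be a separate word, so the missing space also matters.
find /mnt/Storage/Torrents/Completed -type f -name "*.epub" -exec mv {} /mnt/Storage/Books/ \;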
A site I now manage has been corrupted. I would like to keep the content in place, but copy all of the php, txt, and css files from a temporary WordPress installation to the corresponding locations using a script.
I don't know how to make a bash or shell script that does something like this:
#!/bin/bash
# rough, incomplete sketch of what I have in mind
pattern='.*\.\(php\|css\|ini\|txt\)'
find /temporary/WordPress/ -type f -regex "$pattern" > file-paths-in-temporary-wordpress
grep -Eo '[^/]+\.(php|css|ini|txt)$' file-paths-in-temporary-wordpress > file-names-of-temporary-WordPress-Installation
find /old/installation/WordPress -type f -regex "$pattern" > file-paths-to-use-as-reference
while read -r name; do
    # locate $name in file-paths-to-use-as-reference, then copy the matching file
    # from file-paths-in-temporary-wordpress to that location in the old WordPress installation
    :
done < file-names-of-temporary-WordPress-Installation
I am confused about how to get this to work. Obviously, this is sorely incomplete.
My desired outcome is to have all of the php, ini, css, and txt files from the fresh WordPress installation copied to the corresponding location at the old WordPress site.
I can use:
find /temporary/WordPress -type f -name '*.php' -exec cp -fvr {} /old/WordPress/Installation/ \;
find /temporary/WordPress -type f -name '*.css' -exec cp -fvr {} /old/WordPress/Installation/ \;
..etc.
Any thoughts?
Please help. Thank you!
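For what it's worth, those per-extension invocations can be combined into a single find call with -o; a sketch using the same paths (like the originals, this copies everything into one flat directory rather than the corresponding sub-paths):
find /temporary/WordPress -type f \( -name '*.php' -o -name '*.css' -o -name '*.ini' -o -name '*.txt' \) -exec cp -fv {} /old/WordPress/Installation/ \;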
Why can't you just search each directory and copy if there is a match?
cp /temporary/WordPress/*.php /new/directory/
cp /temporary/WordPress/*.css /new/directory/
...
You can copy everything first and remove things you do not need:
cp -r /temporary/WordPress/. /old/WordPress/
find /old/WordPress/ -type f ! -regex ".*\.\(php\|css\|ini\|txt\)" -exec rm {} \;
This might leave empty directories behind, and it is really just cleaning up after something that went wrong in the first place (copying files you do not want).
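If the leftover empty directories are a concern, something like this should clean them up afterwards (assuming a find that supports -empty and -delete, as GNU and BSD find do):
find /old/WordPress/ -type d -empty -delete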
So the right approach is to copy only the files you need. First cd to /temporary/WordPress so you do not need to cut that directory off the paths:
cd /temporary/WordPress
find . -type f -regex ".*\.\(php\|css\|ini\|txt\)" | while IFS= read -r file; do
    dir="/old/WordPress/${file%/*}"
    mkdir -p "${dir}"
    cp "${file}" "/old/WordPress/${file}"
done
(Sorry, not tested)
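An alternative sketch of the same idea, assuming GNU cp, whose --parents flag recreates the source's directory structure under the target (also untested):
cd /temporary/WordPress
find . -type f -regex ".*\.\(php\|css\|ini\|txt\)" -exec cp --parents -v {} /old/WordPress/ \;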
I had a directory with many files and sub-directories. To move only the sub-directories, I just learned you can use:
ls -d BASEDIR/*/ | xargs -n1 -I% mv % TARGETDIR/
I use the following:
$ mv ./*/ DirToMoveTo
For example:
Say I wanted to move all directories with "Old" in the name to a folder called "Old_Dirs" on /data.
The command would look like this:
mv ./*Old*/ /data/Old_Dirs/
Why not use find?
find . -mindepth 1 -maxdepth 1 -type d -exec mv '{}' /tmp \;
-maxdepth 1 makes sure find won't go deeper than the current directory, and -mindepth 1 keeps the current directory (.) itself out of the results
-type d tells find to only find directories
-exec executes a command on each result of the find, referenced by {}
In my opinion this is a cleaner solution, and it also works better than using xargs when you have file names with white space or tabs in them.
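If you have GNU coreutils, the same idea can also batch the moves instead of running one mv per directory; mv's -t flag names the target directory so that {} + can be used (a sketch):
find . -mindepth 1 -maxdepth 1 -type d -exec mv -t /tmp {} +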
With a file structure like this:
/dir2move2
/dir
    /subdir1
    /subdir2
    index.js
To move only the sub-directories and not the files, you could just do:
mv ./dir/*/ ./dir2move2
Possible solution:
find BASEDIR/ -maxdepth 1 -mindepth 1 -type d -exec mv '{}' TARGETDIR \;
You can simply use the same command as for moving a file, but put a slash after the name of the subdirectory:
sudo mv */ path/to/destination
sudo mv subdir/ path/to/subdirectory
My directory structure looks like this:
x
    /log
    /bin
I am running this command from directory x/bin:
find ../log -type f -name \*.log -mtime +90 -exec ls -l {} \;
(to find and list the .log files older than 90 days), and it doesn't display anything.
Whereas if I execute the equivalent command in directory x/log:
find . -type f -name \*.log -mtime +90 -exec ls -l {} \;
it gives me a list of files older than 90 days.
Can you please help?
Recall that paths are relative.
If you have a directory structure that looks like
x
    /log
    /bin
AND you're in x/bin, then you need to give the relative path to x/log, i.e.
pwd
x/bin
find ../x/log -type f -name \*.log -mtime +90 -exec ls -l {} \;
I hope this helps.
Two suggestions.
First, escape the * using \*. If you have any log files in the current directory, the shell will expand the pattern before the command is executed.
Second, I think you mean find ../x/log ...?
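One way to sidestep relative-path confusion entirely, especially in cron jobs, is to use an absolute path; a sketch, where /path/to/x is a hypothetical stand-in for wherever x actually lives:
find /path/to/x/log -type f -name '*.log' -mtime +90 -exec ls -l {} \;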
I have hundreds of directories and files in one directory.
What is the best way to delete only the directories (no matter whether they have anything in them or not; just delete them all)?
Currently I use ls -1 -d */, record the output in a file, massage it with sed, and then run it. That is a rather long way around. I'm looking for a better way to delete only the directories.
To delete all directories and subdirectories and leave only files in the working directory, I have found this concise command works for me:
rm -r */
It makes use of the bash wildcard */, where a star followed by a slash matches only directories; the -r then removes everything inside them.
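Since this is destructive, it can be worth previewing what the glob will expand to before running the rm, for example:
echo */    # prints exactly the expansion that rm -r */ would receive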
find . -maxdepth 1 -mindepth 1 -type d
then
find . -maxdepth 1 -mindepth 1 -type d -exec rm -rf '{}' \;
To add an explanation:
find starts in the current directory due to . and, with -maxdepth and -mindepth both set to 1, looks only at the immediate contents of that directory. -type d tells find to only match on things that are directories.
find also has an -exec flag that can pass its results to another command, in this case rm. The '{}' \; is the way these results are passed. See this answer for a more complete explanation of what {} and \; do.
First, run:
find /path -d -type d
to make sure the output looks sane, then:
find /path -d -type d -exec rm -rf '{}' \;
-type d looks only for directories, then -d makes sure to put child directories before the parent.
Simple way:
rm -rf `ls -d */`
Using the find command only (it supports deletion):
find /path -depth -type d -delete
-type d looks only for directories, then -depth makes sure child directories are processed before their parent, and -delete removes the matched directories.
In one line:
rm -R `ls -1 -d */`
(backquotes)
I'm trying to remove all the .svn directories from a working directory.
I thought I would just use find and rm like this:
find . -iname .svn -exec 'rm -rf {}' \;
But the result is:
find: rm -rf ./src/.svn: No such file or directory
Obviously the file exists, or find wouldn't find it... What am I missing?
You shouldn't put the rm -rf {} in single quotes.
As you've quoted it, find treats the whole string as the name of a single command rather than a command plus arguments, so it's looking for an executable called "rm -rf ./src/.svn" and not finding it.
Try:
find . -iname .svn -exec rm -rf {} \;
Just by-the-bye, you should probably get out of the habit of using -exec for things that can be done to multiple files at once. For instance, I would write that out of habit as
find . -iname .svn -print | xargs rm -rf
or, since I'm now using a Macintosh and more likely to encounter file or directory names with spaces in them
find . -iname .svn -print0 | xargs -0 rm -rf
"xargs" makes sure that "rm" is invoked only once every "N" files, where "N" is usually 20. That's not a huge win in this case, because rm is small, but if the program you wanted to execute on every file was large or did a lot of processing on start up, it could make things a lot faster.
Maybe it's just me, but the old find & rm script does not work on my current config, a la:
find /data/bin/test -type d -mtime +10 -name "[0-9]*" -exec rm -rf {} \;
whereas the xargs solution does, a la:
find /data/bin/test -type d -mtime +10 -name '[0-9]*' -print | xargs rm -rf ;
No idea why, but I've updated my scriptLib so I don't spend another couple of hours beating my head on something so simple....
(running RHEL under kernel-2.6.18-194.11.3.el5)
EDIT: found out why: my RHEL distro's vi defaults to inserting the dreaded CR into line breaks (which breaks the command). Following suggestions from nx5000 & jliagre at linuxquestions.org, I added the following to ~/.vimrc:
:set fileformat=unix
map <F4> :set fileformat=unix<CR>
map <F5> :set fileformat=dos<CR>
which allows the behavior to be toggled with F4/F5.
To check whether CRs are embedded in your file:
head -1 scriptFile.sh | od -c | head -1
http://www.linuxquestions.org/questions/linux-general-1/bad-interpreter-no-such-file-or-directory-213617/
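If CRs do turn up, one common fix is dos2unix, or GNU sed where that is not installed (a sketch, using the same scriptFile.sh):
dos2unix scriptFile.sh
# or:
sed -i 's/\r$//' scriptFile.sh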
You can also use the svn command as follows:
svn export <url-to-repo> <dest-path>
Look here for more info.
Try
find . -iname .svn -exec rm -rf {} \;
and that probably ought to work IIRC.
You can pass anything you want in quotes, with the following trick.
find . -iname .svn -exec bash -c 'rm -rf {}' \;
The exec option will be happy to see that you're simply calling an executable with an argument, but your argument will be able to contain a script of basically any size and shape.
find . -iname .svn -exec bash -c '
ls -l "{}" | wc -l
' \;
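One caveat: splicing {} straight into the bash -c string can misbehave on unusual file names. A slightly safer variant of the same trick passes the path as a positional argument instead; here the _ fills the $0 slot and the found path arrives as $1:
find . -iname .svn -exec bash -c 'rm -rf "$1"' _ {} \;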