I need to rm files from a Unix directory that belong only to my user ID. I tried building this command, but to no avail:
ls -la | grep 'myid' | awk ' { print $9 } ' | rm
My result: Usage: rm [-firRe] [--] File...
You were really close. Try:
rm `ls -la | grep 'myid' | awk ' { print $9 } '`
Note that those are backticks, not the single quotes that surrounded the first three segments of your original pipeline. Also, for me the filename column was $8, but if $9 is the right column for you, then that should do it.
find . -user myuser -print0 | xargs -0 rm
Put your own username (or numeric user ID) in for "myuser".
rm doesn't read from stdin.
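If you want to keep your original pipeline, one option (a sketch that assumes the filename really is in column 9 and that your filenames contain no spaces) is to hand the list to rm via xargs:
ls -la | grep 'myid' | awk '{ print $9 }' | xargs rm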
find -user $(whoami) -delete
Please always test without the -delete first.
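For example, a dry run that just lists what would be removed (the same command minus -delete):
find -user $(whoami)
# review the list, then re-run with -delete appended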
Try find, which can search for files belonging to a user and then delete them:
find . -user username -delete
More info: man find
rm does not accept a list of files to delete on stdin (which is what you are doing by passing it through the pipe).
Try this
find . -type f -user username -exec rm -f {} \;
Delete files belonging to user_name from the folder /tmp (replace this with your folder) that are older than 60 days. You can use any age here, but make sure you keep evidence in a deleted.txt file in user_name's home folder:
find /tmp -user user_name -mtime +60 -exec rm -rfv {} \; >> /home/user_name/deleted.txt
You could use find:
find . -maxdepth 1 -type f -user myid -print0 | xargs -0 rm -f
Drop the -maxdepth 1 if you want it to handle subdirectories as well.
find . -name '{fileNamePattern}*.bz2' | xargs -n 1 -P 3 bzgrep -H "{patternToSearch}"
I am using the command above to find, in a set of .bz2 files, the files that contain a pattern I am looking for. It does go through the files, because I can see the pattern I am trying to find being printed on the console, but I don't see the file name.
If you look at the bzgrep script (for example this version for OS X) you will see that it pipes the output from bzip2 through grep. That process loses the original filenames. grep never sees them so it cannot print them out (despite your -H flag).
Something like this should do; it's not exactly what you want, but similar. (You could get the prefix you were expecting by piping the output from bzgrep into sed/awk, but that's a slightly less simple command to write out; see the sketch after the next command.)
find . -name '{fileNamePattern}*.bz2' -printf '### %p\n' -exec bzgrep "{patternToSearch}" {} \;
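For reference, a sketch of the sed-based variant mentioned above (the glob and pattern are placeholders, just as in the question); it prefixes every match with the file name by hand:
find . -name '{fileNamePattern}*.bz2' -exec sh -c '
  for f; do
    bzgrep "{patternToSearch}" "$f" | sed "s|^|$f:|"
  done
' sh {} +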
I printed the file name using echo and xargs:
find . -name "*bz2" | parallel -j 128 echo -n {}\" \" | xargs bzgrep {pattern}
Etan is very close with his answer: grep indeed does not show the filename when dealing with only one file, so you can make grep believe it's looking at multiple files simply by adding the null file /dev/null, and the command becomes:
find . -name '{fileNamePattern}*.bz2' -printf '### %p\n' \
    -exec bzgrep "{patternToSearch}" {} /dev/null \;
(It's a dirty trick, but it has been serving me well for more than 15 years :-) )
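The same trick works with plain grep, which only prints file names when it believes it is searching more than one file:
grep "pattern" file.txt /dev/null   # forces the "file.txt:" prefix even for a single file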
The following find command finds multiple files and mails all of them:
find /home/cde -ctime -1 -name "Sum*pdf*" -exec uuencode {} {} \; | mailx -s "subject" abc@gmail.com
but I am getting attachments named like "homecdeSum123.pdf" and "homecdeSum324.pdf". How do I get the exact file names in my attachments? Please help me with this.
Trying to answer what seems to be at least part of your question:
but I am getting attachments like "homecdeSum123.pdf" and "homecdeSum324.pdf". How do I get the exact file names in my attachments?
The accepted answer to this question:
find: What's up with basename and dirname?
contains a lot of useful information in my opinion, but I am trying to extract what could be the answer to the above:
What you describe is that when you are doing
find /home/cde ....
you are getting file names like "homecdeSum123.pdf", so my take is that you only want the basename of the file, not the directory as well.
You can get that like this (as an example, only listing the names):
find `pwd` -name "*.png" -exec basename {} \;
A slight variation of this is to use -execdir instead of -exec, which (quoting the man page) works like this:
-execdir command {} +
Like -exec, but the specified command is run from the subdirectory containing the matched file, which is not normally the directory in which you started find.
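As a small sketch (reusing the paths and pattern from the question), -execdir makes {} expand to just ./name, so the directory part never shows up:
find /home/cde -ctime -1 -name "Sum*pdf*" -execdir echo {} \;
# prints e.g. ./Sum123.pdf instead of /home/cde/Sum123.pdf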
Does this help?
All attachments in a single mail:
find /home/cde -ctime -1 -name "Sum*pdf*" | while read name; do uuencode "$name" "${name##*/}"; done | mailx -s "subject" abc@example.com
To get a separate mail per file:
find /home/cde -ctime -1 -name "Sum*pdf*" | while read name; do uuencode "$name" "${name##*/}" | mailx -s "subject" abc@example.com; done
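The ${name##*/} expansion is what strips the directory part; a quick illustration:
name=/home/cde/Sum123.pdf
echo "${name##*/}"   # -> Sum123.pdf (removes everything up to the last /)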
I need to remove all the files in the current directory except one file, say abc.txt. Is there any command to rm all the other files in the directory except abc.txt?
If you're after a succinct command, then with extended globbing in bash, you should be able to use:
rm !(abc.txt)
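Note that extended globbing usually has to be switched on first (a minimal sketch; it's often off in scripts and in some interactive shells):
shopt -s extglob   # enable !(...), +(...), etc. in bash
rm !(abc.txt)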
There are however several caveats to this approach.
This will run rm on all entries in the directory (apart from "abc.txt") and this includes subdirectories. You will therefore end up with the "cannot remove directory" error if subdirs exist. If this is the case, use find instead:
find . -maxdepth 1 -type f \! -name "abc.txt" -exec rm {} \;
# omit -maxdepth 1 if you also want to delete files within subdirectories.
If !(abc.txt) returns a very long list of files, you will potentially get the infamous "argument list too long" error. Again, find would be the solution to this issue.
rm !(abc.txt) will fail if the directory is empty or if abc.txt is the only file. Example:
[me#home]$ ls
abc.txt
[me#home]$ rm !(abc.txt)
rm: cannot remove `!(abc.txt)': No such file or directory
You can work around this using nullglob, but it can often be cleaner to simply use find. To illustrate, a possible workaround would be:
shopt -s nullglob
F=(!(abc.txt)); if [ ${#F[*]} -gt 0 ]; then rm !(abc.txt); fi # not pretty
1)
mv abc.txt ~/saveplace
rm *
mv ~/saveplace/abc.txt .
2)
find . ! -name abc.txt -exec rm {} "+"
Try
find /your/dir/here -type f ! -name abc.txt -exec rm {} \;
Provided you don't have files with spaces in their names, you can use a for loop over the output of ls:
for FILE in `ls -1`
do
    if [[ "$FILE" != "abc.txt" ]]; then
        rm "$FILE"
    fi
done
You could write it as a script, or you can type it directly at the bash prompt: write the first line and press Enter, then write the other lines; bash will wait for you to type done before executing. Otherwise you can write it in a single line:
for FILE in `ls -1`; do if [[ "$FILE" != "abc.txt" ]]; then rm "$FILE"; fi; done
I have two Unix partitions under Debian which I would like to merge (disk space problems :/). What would be the easiest way to do it? I think it would be best to tar or copy the files from one partition to the other, delete one, and resize the other. I will use parted to resize, but how should I copy the files? There are links, permissions and devices which need to be moved without change.
You could run the following (as root) to copy the files. It works for symlinks, devices and ordinary files.
cd /partition2
tar cf - . | ( cd /partition1 && tar xf - )
Another way is to use cpio, but I never remember the correct syntax.
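For reference, a sketch of the cpio pass-through form (directory names assumed, GNU cpio options):
cd /partition2
find . -xdev -depth -print0 | cpio --null -pdm /partition1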
Since this is Debian with GNU fileutils, cp --archive should work fine.
cp --archive --sparse=always --verbose --one-file-system --target-directory=/TARGET /ORIGIN
If for some reason you’d want to go via GNU tar, you’d need to do something like this:
cd /origin
find . -xdev -depth -not -path ./lost+found -print0 \
| tar --create --atime-preserve=system --null --files-from=- \
--format=posix --no-recursion --sparse \
| { cd /target; tar --extract --overwrite --preserve-permissions --sparse; }
(I’ve done this so many times that I’ve got a file with all these command lines for quick reference.)
Warning: Using GNU "tar" will not copy POSIX ACLs; you'll need to use either the above "cp --archive" method or "bsdtar":
mkdir /target
cd /origin
find . -xdev -depth -not -path ./lost+found -print0 \
| bsdtar -c -n --null -T - --format pax \
| { cd /target; bsdtar -x -pS -f -; }
You can also use SquashFS to create a mirror of the partition and copy that over. After resizing your 2nd partition, mount the SquashFS image and copy over the necessary files. Keep in mind that your kernel will need SquashFS support to mount the image.
SquashFS
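A rough sketch of that approach (the image path and mount point are just examples):
mksquashfs /partition2 /tmp/partition2.sqsh        # build a compressed mirror of the source partition
mount -t squashfs -o loop /tmp/partition2.sqsh /mnt/p2
cp --archive /mnt/p2/. /partition1/                # copy contents preserving links, permissions, devices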
I'm trying to remove all the .svn directories from a working directory.
I thought I would just use find and rm like this:
find . -iname .svn -exec 'rm -rf {}' \;
But the result is:
find: rm -rf ./src/.svn: No such file or directory
Obviously the file exists, or find wouldn't find it... What am I missing?
You shouldn't put the rm -rf {} in single quotes.
As you've quoted it, find treats the whole quoted string (with {} substituted) as the name of a single command rather than a command plus arguments, so it's trying to execute a program literally called "rm -rf ./src/.svn" and not finding it.
Try:
find . -iname .svn -exec rm -rf {} \;
Just by-the-bye, you should probably get out of the habit of using -exec for things that can be done to multiple files at once. For instance, I would write that out of habit as
find . -iname .svn -print | xargs rm -rf
or, since I'm now using a Macintosh and more likely to encounter file or directory names with spaces in them
find . -iname .svn -print0 | xargs -0 rm -rf
"xargs" makes sure that "rm" is invoked only once every "N" files, where "N" is usually 20. That's not a huge win in this case, because rm is small, but if the program you wanted to execute on every file was large or did a lot of processing on start up, it could make things a lot faster.
Maybe it's just me, but the old find & rm script does not work on my current config, a la:
find /data/bin/test -type d -mtime +10 -name "[0-9]*" -exec rm -rf {} \;
whereas the xargs solution does, a la:
find /data/bin/test -type d -mtime +10 -name '[0-9]*' -print | xargs rm -rf ;
No idea why, but I've updated my scriptLib so I don't spend another couple of hours beating my head on something so simple....
(running RHEL under kernel-2.6.18-194.11.3.el5)
EDIT: found out why - my RHEL distro defaults vi to inserting the dreaded CR into line breaks (which breaks the command). Following suggestions from nx5000 & jliagre at linuxquestions.org, I added the following to ~/.vimrc:
:set fileformat=unix
map <F4> :set fileformat=unix<CR>
map <F5> :set fileformat=dos<CR>
which lets the behavior pivot on F4/F5.
To check whether CRs are embedded in your file:
head -1 scriptFile.sh | od -c | head -1
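If CRs do show up, a quick way to strip them (a sketch assuming GNU sed; dos2unix also works if it's installed):
sed -i 's/\r$//' scriptFile.sh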
http://www.linuxquestions.org/questions/linux-general-1/bad-interpreter-no-such-file-or-directory-213617/
You can also use the svn command as follows:
svn export <url-to-repo> <dest-path>
Look here for more info.
Try
find . -iname .svn -exec rm -rf {} \;
and that probably ought to work IIRC.
You can pass anything you want in quotes, with the following trick.
find . -iname .svn -exec bash -c 'rm -rf {}' \;
The exec option will be happy to see that you're simply calling an executable with an argument, but your argument will be able to contain a script of basically any size and shape.
find . -iname .svn -exec bash -c '
ls -l "{}" | wc -l
' \;