I'm trying to delete all files from a directory that contain "2" in their name.
This command works well:
ls | grep [*2*]
but when I try to pipe the output of that command to the rm command,
ls | grep [*2*] | rm
there is an error: "Try `rm --help' for more information."
Please help.
Why not use the shell's wildcarding directly?
e.g.
$ rm *2*
I don't think you need the ls or the grep. Your problem stems from the fact that you're piping output into the stdin of rm, whereas you want to supply command-line arguments to rm. rm doesn't read from stdin.
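For instance, compare the two invocations below (a minimal sketch with a hypothetical file name):
echo "file2.txt" | rm    # rm ignores its stdin, sees no operands, and prints its usage message
rm file2.txt             # the name is a command-line argument, which is what rm expects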
To pipe output from another command into rm, you must use the xargs command.
Try this:
ls | grep [*2*] | xargs rm
The output will be passed as arguments to the rm command.
You need to feed every line to the rm command as an argument. For this you need xargs along with the pipe.
So modify the command like this:
ls -1 | grep [*2*] | xargs rm -rf
Just to complement the other answers: instead of running ls and then grep, you could use find.
find . -name "*2*" | xargs rm
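If any of the names might contain spaces or other whitespace, a null-delimited variant is safer (assuming a find and xargs that support -print0 and -0, as the GNU and BSD tools do):
find . -name "*2*" -print0 | xargs -0 rm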
Using find, I create a file that contains all the files that use a specific keyword:
find . -type f | xargs grep -l 'foo' > foo.txt
I want to take that list in foo.txt and maybe run some commands using that list, i.e. run an ls command on the list contained within the file.
You don't need xargs to create foo.txt. Just execute the command with -exec like this:
find . -type f -exec grep -l 'foo' {} \; > foo.txt
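As a side note, terminating -exec with + instead of \; passes many files to each grep invocation, which is faster on large trees (POSIX find supports this):
find . -type f -exec grep -l 'foo' {} + > foo.txt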
Then you can run ls against each file by looping through the list:
while IFS= read -r file
do
ls "$file"
done < foo.txt
Maybe it is a little ugly, but this can also do it:
ls $(cat foo.txt)
You can use xargs like this:
xargs ls < foo.txt
The advantage of xargs is that it will execute the command with multiple arguments which is more efficient than executing the command once per argument using a loop, for example.
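You can watch that batching with echo (a harmless sketch; -n 2 caps each invocation at two arguments, so echo runs three times):
printf '%s\n' a b c d e | xargs -n 2 echo
which prints "a b", "c d", and "e" on separate lines.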
I would like to delete all files ending in .orig recursively from the current directory.
Will this do the trick?
ls -R | grep ".orig$" | rm
Are the results of grep passed implicitly as an argument to rm here?
How about something like:
find ./ -type f -name "*.orig" -exec rm "{}" \;
Seems to work for me, but it might be a good idea to test it with echo instead of rm first ;)
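For example, prefixing the command with echo prints each rm invocation instead of running it:
find ./ -type f -name "*.orig" -exec echo rm "{}" \;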
ls -R won't give quite the right output format to pass directly to rm (through grep), as it lists files separately for each directory, like:
.:
local1.orig local
./dir:
nested1.orig nested2.orig
If you wanted to do something similar using grep, you would need xargs to turn the matches into arguments, with something like find providing full paths:
find . | grep ".orig$" | xargs rm
No, they are not. But that is the purpose of xargs:
ls -R | grep ".orig$" | xargs rm -i
will do what you want. The -i is not necessary, but is a good idea to use the first time you run this. (It will prompt you to delete a file. If you are confident that the answer is always yes, abort and re-run without the -i.)
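One caveat: many xargs implementations do not connect the command's stdin to your terminal (GNU xargs attaches it to /dev/null), so rm -i may be unable to read your answers. With GNU xargs you can let xargs itself do the prompting instead:
ls -R | grep ".orig$" | xargs -p rm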
I have a directory that has one file with information (call it masterfile.inc) and several files that are empty (call them file1.inc through file20.inc).
I'm trying to formulate an xargs command that copies the contents of masterfile.inc into all of the empty files.
So far I have
ls -ltr | awk '{print $9}' | grep -v masterfile | xargs -I {} cat masterfile.inc > {}
Unfortunately, all this does is create a file called {} and print masterfile.inc into it N times.
Is there something I'm missing with the syntax here?
Thanks in advance
You can use this command to copy the file 20 times:
$ tee <masterfile.inc >/dev/null file{1..20}.inc
Note: file{1..20}.inc will expand to file1.inc, file2.inc, ..., file20.inc
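You can preview the brace expansion with echo:
$ echo file{1..3}.inc
file1.inc file2.inc file3.inc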
If your destination filenames are random:
$ shopt -s extglob
$ tee <masterfile.inc >/dev/null $(ls !(masterfile.inc))
Note: $(ls !(masterfile.inc)) will expand to all files in the current directory except masterfile.inc (please don't use spaces in filenames)
While the tee trick is really brilliant, you might be interested in a solution that is easier to adapt to other situations. Here is the same thing using GNU Parallel:
ls -ltr | awk '{print $9}' | grep -v masterfile | parallel "cat masterfile.inc > {}"
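If the destination names are as predictable as in the question, you can skip the ls/awk/grep pipeline and hand them to parallel directly (a sketch reusing the file1.inc-file20.inc names from the question):
parallel "cat masterfile.inc > {}" ::: file{1..20}.inc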
It takes literally 10 seconds to install GNU Parallel:
wget pi.dk/3 -qO - | sh -x
Watch the intro videos to learn more: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
I'm using tcsh, and I'm trying to grep a path from a file with several IDs. I'm doing:
grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o \$"ENV_CASTRO.*.asm"
that gets me:
$ENV_CASTRO/central/WS678/test_do_all.asm
but if I try
cp `grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o \$"ENV_CASTRO.*.asm"` .
it complains:
cp: cannot stat `$ENV_CASTRO/central/WS678/test_do_all.asm': No such file or directory
How do I tell tcsh that the output of grep contains a $ that means it is an environment variable and is not plain text?
Thanks in advance.
eval is your friend ....
eval cp `grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o \$"ENV_CASTRO.*.asm"` .
I don't have the time to create files to test this.
I hope this helps.
The problem is that the output of the grep command is not being evaluated by the shell, and so variable substitution is not happening.
One way to solve this would be to execute the desired command within another shell, for example,
sh -c "cp `grep I241149 $ENV_CASTRO/ALL_CMD_LINES.BAK | grep -o '$ENV_CASTRO.*.asm'` ."
I was given this syntax by user phi:
find . | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep "B206"
I would like to suppress the "grep: can't open ..." and "find: cannot open" lines from the results. Sample output to be ignored:
grep: can't open ./cisc/.xdbhist
find: cannot open ./cisc/.ssh
Have you tried redirecting stderr to /dev/null?
2>/dev/null
So the above redirects stream no. 2 (which is stderr) to /dev/null. That's shell-dependent, but it should work for most shells. Because find and grep are different processes, you may have to do it for both, or (perhaps) execute them in a subshell, e.g.
find ... 2>/dev/null | xargs grep ... 2>/dev/null
See the bash documentation on redirection for details. Unless you're using csh, this should work for most shells.
The grep -s option will suppress these messages for the grep command.
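Combining the two, the pipeline from the question with both error sources silenced might look like this (-s for grep, a stderr redirect for find):
find . 2>/dev/null | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep -s "B206"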