Suppress find & grep "cannot open" output - unix

I was given this syntax by user phi
find . | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep "B206"
I would like to suppress the grep: can't open ... and find: cannot open lines from the results. Sample output to be ignored:
grep: can't open ./cisc/.xdbhist
find: cannot open ./cisc/.ssh

Have you tried redirecting stderr to /dev/null?
2>/dev/null
The above redirects stream no. 2 (which is stderr) to /dev/null. The exact syntax is shell dependent, but this should work for most Bourne-style shells. Because find and grep are different processes, you may have to do it for both, or (perhaps) execute in a subshell, e.g.
find ... 2>/dev/null | xargs grep ... 2>/dev/null
Here's a reference to some documentation on bash redirection. Unless you're using csh, this should work for most shells.
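Applied to the original pipeline, that would look something like the following sketch (assuming a Bourne-compatible shell such as bash or ksh):
find . 2>/dev/null | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep "B206" 2>/dev/null
The first redirection silences find's "cannot open" complaints about unreadable directories; the second silences grep's "can't open" complaints about files it cannot read.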

The -s option flag will suppress these messages for the grep command.
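For instance, in the original pipeline that would be the following sketch (note that -s only quiets grep; find's own errors would still need the 2>/dev/null treatment described above):
find . 2>/dev/null | awk '!/((\.jpeg)|(\.jpg)|(\.png))$/ {print $0;}' | xargs grep -s "B206"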

Related

Omit "Is a directory" results while using find command in Unix

I use the following command to find a string recursively within a directory structure.
find . -exec grep -l samplestring {} \;
But when I run the command within a large directory structure, there will be a long list of
grep: ./xxxx/xxxxx_yy/eee: Is a directory
grep: ./xxxx/xxxxx_yy/eee/local: Is a directory
grep: ./xxxx/xxxxx_yy/eee/lib: Is a directory
I want to omit the results above and just get the names of the files containing the string. Can someone help?
grep -s or grep --no-messages
It is worth reading the portability notes in the GNU grep documentation if you are hoping to use this code in multiple places, though:
-s
--no-messages
Suppress error messages about nonexistent or unreadable files. Portability note: unlike GNU grep, 7th Edition Unix grep did not conform to POSIX, because it lacked -q and its -s option behaved like GNU grep’s -q option. USG-style grep also lacked -q but its -s option behaved like GNU grep’s. Portable shell scripts should avoid both -q and -s and should redirect standard and error output to /dev/null instead. (-s is specified by POSIX.)
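As a concrete illustration of that advice, the portable stand-ins look something like this (a sketch, not taken from the quoted manual):
grep samplestring somefile 2>/dev/null        # portable alternative to grep -s
grep samplestring somefile >/dev/null 2>&1    # portable alternative to grep -q; test the exit status
The first discards only the error messages; the second discards all output and leaves just the exit status for use in an if or &&.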
Whenever you say find ., the utility returns every element within your current directory structure: files, directories, links...
If you just want to find files, just say so!
find . -type f -exec grep -l samplestring {} \;
# ^^^^^^^
However, if you just want to find all the files containing a string, you can simply say:
grep -lR "samplestring"
Exclude directory warnings in grep with the --exclude-dir option:
grep --exclude-dir='*' 'search-term' *
(Quoting the pattern keeps the shell from expanding the * before grep sees it.)
Just look at the grep --help page:
--exclude-dir=PATTERN directories that match PATTERN will be skipped.
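For example, to search recursively while skipping one particular directory (here .git, picked arbitrarily for illustration):
grep -r --exclude-dir='.git' 'search-term' .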

Piping into rm command

I want to delete all files from a directory that contain "2" in their name.
This command works well:
ls | grep [*2*]
but when I try to pipe the output from that command to the rm command
ls | grep [*2*] | rm
there is an error: "Try `rm --help' for more information."
Please help.
Why not use the wildcarding in the shell directly ?
e.g.
$ rm *2*
I don't think you need the ls or the grep. Your above problem stems from the fact that you're piping output into the stdin of rm, whereas you want to supply command line arguments to rm. rm doesn't read from stdin.
To pipe output from another command to rm you must use the xargs command.
Try this:
ls | grep [*2*] | xargs rm
The output will be sent as arguments to the rm command.
You need to feed every line to the rm command as input. For this you need xargs along with the pipe, so modify the command like this: ls -1 | grep [*2*] | xargs rm -rf
Just complementing the other answers: instead of running ls and then grep, you could use find.
find . -name "*2*" | xargs rm
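If the filenames may contain spaces or other characters special to the shell, the null-delimited variants of these tools (available in GNU find and xargs) are a safer sketch:
find . -name "*2*" -print0 | xargs -0 rm
-print0 separates names with NUL bytes and -0 tells xargs to split on them, so embedded whitespace cannot break a name apart.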

Pipe Find command "stderr" to a command

Hi, I have a peculiar problem and I'm trying hard to find (pun intended) a solution for it.
$> find ./subdirectory -type f 2>>error.log
I get an error, something like, "find: ./subdirectory/noidea: Permission denied" from this command and this will be redirected to error.log.
Is there any way I can pipe the stderr to another command before the redirection to error.log?
I want to be able to do something like
$> find ./subdirectory -type f 2 | sed "s#\(.*\)#${PWD}\1#" >> error.log
where I want to pipe only the stderr to the sed command and get the whole path of the find command error.
I know piping doesn't work here and is probably not the right way to go about.
My problem is I need both the stdout and the stderr, and the two have to be processed through different things simultaneously.
EDIT:
Ok. A slight modification to my problem.
Now, I have a shell script, solve_problem.sh
In this shell script, I have the following code
ErrorFile="error.log"
for directories in `find ./subdirectory -type f 2>> $ErrorFile`
do
    field1=`echo $directories | cut -d / -f2`
    field2=`echo $directories | cut -d / -f3`
done
Same problem but inside a shell script. The "find: ./subdirectory/noidea: Permission denied" error should go into $ErrorFile and stdout should get assigned to the variable $directories.
Pipe stderr and stdout simultaneously - idea taken from this post:
(find /boot | sed s'/^/STDOUT:/' ) 3>&1 1>&2 2>&3 | sed 's/^/STDERR:/'
Sample output:
STDOUT:/boot/grub/usb_keyboard.mod
STDERR:find: `/boot/lost+found': Permission denied
Bash redirections like 3>&1 1>&2 2>&3 swap stderr and stdout.
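Read left to right, the three redirections work like this (a general note on dup-style redirection, spelled out here for clarity):
3>&1   # make fd 3 a copy of fd 1, saving where stdout currently points
1>&2   # point fd 1 (stdout) at whatever fd 2 (stderr) points to
2>&3   # point fd 2 (stderr) at the saved stdout on fd 3
After those three steps the two streams have traded places, so the command after the pipe sees what was originally stderr.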
I would modify your sample script to look like this:
#!/bin/bash
ErrorFile="error.log"
(find ./subdirectory -type f 3>&1 1>&2 2>&3 | sed "s#^#${PWD}: #" >> $ErrorFile) 3>&1 1>&2 2>&3 | while read line; do
    field1=$(echo "$line" | cut -d / -f2)
    ...
done
Notice that I swapped stdout & stderr twice.
A small additional comment: look at the -printf option in the find manual page. It might be useful to you.
If you need to redirect stderr to stdout so that the following command in the pipe gets it as its input, then you can use 2>&1.
For more information, please have a look at the All About Redirection how-to.
Edit: Even if you need to pass stdout further down the pipe, you can use sed to filter out the error messages and write them to a file:
$ find . -type f 2>&1 | sed '/^find:/{
> w error.log
> d
> }'
In this example:
in the find command, stderr is redirected to stdout
in the sed command, errors from find that match the regular expression are written to a file (w error.log) and removed from the output (d)
any command following sed in the pipeline will receive the remaining output from find
Note: This will work as long as all the error messages from find start with find:. Otherwise, the regular expression in sed should be modified to properly match all cases.
You can try this (on bash), which appears to work:
find ./subdirectory -type f 2> >(sed "s#\(.*\)#${PWD}\1#" >> error.log)
This does the following:
2> redirects stderr to
>(...) a process substitution (running sed, which appends to error.log)
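Combining that with the loop from the edited question might look like the following sketch (bash only, since process substitution is a bash feature; the field names come from the question):
#!/bin/bash
ErrorFile="error.log"
find ./subdirectory -type f 2> >(sed "s#\(.*\)#${PWD}\1#" >> "$ErrorFile") |
while read -r directories
do
    field1=$(echo "$directories" | cut -d / -f2)
    field2=$(echo "$directories" | cut -d / -f3)
done
Here stderr flows into the process substitution and on into error.log, while stdout flows through the pipe into the while loop. One caveat: the pipe runs the loop in a subshell, so field1 and field2 will not be visible after the loop ends.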

Command passed as argument to shell script

I want to pass a command to a shell script. This command is a grep command. While executing it I am getting the following errors; please help:
myscript.sh "egrep 'ERROR|FATAL' \*20100428\*.log | grep -v aString"
myscript.sh is a simple script:
#!/bin/ksh
cd log
$1
The errors are:
egrep: can't open |
egrep: can't open grep
egrep: can't open -v
egrep: can't open aString
The error occurs because egrep sees |, grep, -v and aString as file-name arguments rather than as parts of a pipeline.
Try this:
eval "$1"
You can call sh -c "$1" to invoke the first argument as a command in a new shell, so that the shell special characters will be expanded.
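Putting either fix into the script, a minimal revision of myscript.sh might look like this (the eval variant; quoting $1 keeps the command string intact until the shell re-parses it):
#!/bin/ksh
cd log
eval "$1"
It is then invoked exactly as in the question: myscript.sh "egrep 'ERROR|FATAL' \*20100428\*.log | grep -v aString"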

Filenames and linenumbers for the matches of cat and grep

My code
$ *.php | grep google
How can I print the filenames and linenumbers next to each match?
grep google *.php
If you want to span many directories:
find . -name \*.php -print0 | xargs -0 grep -n -H google
(as explained in comments, -H is useful if xargs comes up with only one remaining file)
You shouldn't be doing
$ *.php | grep
That means "run the first PHP file as a command, with the rest of the wildcard matches as its parameters, and then run grep on the output".
It should be:
$ grep -n -H "google" *.php
The -n flag tells it to print line numbers, and the -H flag tells it to display the filename even if there's only one file. Grep defaults to showing filenames when multiple files are searched, but you probably want the output to be consistent regardless of how many matching files there are.
grep -RH "google" *.php
Please take a look at ack at http://betterthangrep.com. The equivalent in ack of what you're trying is:
ack google --php
find ./*.php -exec grep -l 'google' {} \;
Use "man grep" to see other features.
for i in $(ls *.php); do grep -n --with-filename "google" $i; done;
find . -name "*.php" -print | xargs grep -n "searchstring"
