$ ./nmap-banners -sV -vvvvv localhost | grep Banner
The output is:
Banner on 127.0.0.1:22/tcp matches ssh: SSH-2.0-OpenSSH_4.3p2 Debian-5ubuntu1.
Banner on 127.0.0.1:25/tcp matches smtp: 220 localhost ESMTP Exim 4.62 Wed, 14 Nov 2007 21:06:10
The question is: how can I print only the IP addresses whose banner matches SSH-2.0-OpenSSH_4.3p2 Debian-5ubuntu1, using grep or awk?
If you are looking for lines that contain both the strings Banner and SSH-2.0-OpenSSH_4.3p2 Debian-5ubuntu1, then you can also use grep and cut.
grep version with cut:
nmap-banners -sV -vvvvv localhost|
grep 'Banner.*SSH-2.0-OpenSSH_4.3p2 Debian-5ubuntu1'|
cut -d" " -f3|
cut -d":" -f1
$ cat file
Banner on 127.0.0.1:22/tcp matches ssh: SSH-2.0-OpenSSH_4.3p2 Debian-5ubuntu1.
Banner on 127.0.0.1:25/tcp matches smtp: 220 localhost ESMTP Exim 4.62 Wed, 14 Nov 2007 21:06:10
$ awk -F'[ :]' '/SSH-2.0-OpenSSH_4.3p2/{print $3}' file
127.0.0.1
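Here -F'[ :]' tells awk to split fields on either a space or a colon, so the IP address lands in $3 and a single awk invocation replaces the grep-plus-cut pipeline above.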
Please help me to understand why I can't write to a file:
[root@192.168.1.11 ~]# echo "Hello World" >file.txt
-bash: file.txt: Permission denied
[root@192.168.1.11 ~]# ls -lah file.txt
-rw-r--r-- 1 root root 7 May 8 14:57 file.txt
[root@192.168.1.11 ~]# id
uid=0(root) gid=0(root) groups=0(root)
I have to check, using sed, which files have rwx user permissions. What I have done: ls -l | sed '/^-rwx/p', which gives me the following output:
-rwxr--r-- 1 myuser domain users 145 May 16 14:31 1.sh
-rwxr--r-- 1 myuser domain users 145 May 16 14:31 1.sh
-rwxr--r-- 1 myuser domain users 185 May 16 16:50 2.sh
-rwxr--r-- 1 myuser domain users 185 May 16 16:50 2.sh
-rw-r--r-- 1 myuser domain users 13 May 16 14:31 compiler.c
-rw-r--r-- 1 myuser domain users 2 May 16 14:28 s.txt
I'm assuming that both ls and sed are printing their output. With grep it works fine and returns only 1.sh and 2.sh, which is correct, but the exercise specifies it must be done with sed.
I think you can use sed's -n option:
-n, --quiet, --silent
suppress automatic printing of pattern space
So it would be ls -l | sed -n '/^-rwx/p'
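With -n suppressing the automatic printing, only the lines explicitly printed by the p command show up, matching what grep gave:
$ ls -l | sed -n '/^-rwx/p'
-rwxr--r-- 1 myuser domain users 145 May 16 14:31 1.sh
-rwxr--r-- 1 myuser domain users 185 May 16 16:50 2.sh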
I want to send mail using the mailx command, in a more specific way.
So I googled how to use the mailx command and searched some questions and answers on Stack Overflow; the common answer I found is:
mailx -r "fromAddr" -s "subject" toAddr
When I tried this command, it showed me nothing; instead it seemed to go into an infinite loop, I guess.
I am seeking information about which SMTP server is used, what authentication is used, or where this command takes its default values from.
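(A note on the apparent hang: invoked like that, mailx is not looping; it is reading the message body from stdin until end-of-file (Ctrl-D). Piping the body in, as the command below does, avoids the wait.)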
Then, after reading about these settings, I tried the following command and got an error (executed with the mailx -v option):
$ echo "This is the message body" | mailx -v \
> -r "abc#domain.com" \
> -s "hey, this is test" \
> -S smtp="192.168.XXX.XX:25" \
> -S smtp-use-starttls \
> -S smtp-auth=login \
> -S smtp-auth-user="abc#domain.com" \
> -S smtp-auth-password="xyz123" \
> -S ssl-verify=ignore \
> -S nss-config-dir=/etc/pki/nssdb/ \
> abc#domain.com
Resolving host 192.168.XXX.XX . . . done.
Connecting to 192.168.XXX.XX . . . connected.
220 something.domain.com Microsoft ESMTP MAIL Service ready at (time)
>>> EHLO localhost.localdomain
250-something.domain.com Hello [10.XX.XX.XXX]
250-SIZE 37748736
250-PIPELINING
250-DSN
250-ENHANCEDSTATUSCODES
250-STARTTLS
250-X-ANONYMOUSTLS
250-AUTH
250-X-EXPS GSSAPI NTLM
250-8BITMIME
250-BINARYMIME
250-CHUNKING
250 XRDST
>>> STARTTLS
220 2.0.0 SMTP server ready
Error in certificate: Issuer certificate is invalid.
Comparing DNS name: "something"
Comparing DNS name: "something.domain.com"
Comparing common name: "something"
host certificate does not match "192.168.XXX.XX"
SSL parameters: cipher=AES-128, keysize=128, secretkeysize=128,
issuer=CN=something
subject=CN=something
>>> EHLO localhost.localdomain
250-something.domain.com Hello [10.XX.XX.XXX]
250-SIZE 37748736
250-PIPELINING
250-DSN
250-ENHANCEDSTATUSCODES
250-AUTH LOGIN
250-X-EXPS GSSAPI NTLM
250-8BITMIME
250-BINARYMIME
250-CHUNKING
250 XRDST
>>> AUTH LOGIN
334 VXNlcm5hbWU6
>>> YXV0b21hdGlvbl90ZXN0QG5ld3Rlc3QuY29t
334 UGFzc3dvcmQ6
>>> SW5mb3N5czEyMw==
535 5.7.3 Authentication unsuccessful
smtp-server: 535 5.7.3 Authentication unsuccessful
"/home/user-group/user/dead.letter" 11/398
. . . message not sent.
I checked my credentials and the SMTP address multiple times; they are correct, but it still shows:
smtp-server: 535 5.7.3 Authentication unsuccessful
I am not the root user. Thanks in advance for any suggestions or ideas that will make this command work.
I got it: it was a certificate issue. I installed the SSL certificate for the domain and it works.
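A side note for anyone debugging a similar 535 response: the AUTH LOGIN prompts and replies in the -v transcript are plain base64, so they can be decoded to confirm exactly which credentials were sent:
$ printf 'VXNlcm5hbWU6' | base64 -d
Username:
$ printf 'UGFzc3dvcmQ6' | base64 -d
Password:
Encoding your own username the same way (here with the address from the command above) lets you compare it against the >>> line the client sends:
$ printf 'abc@domain.com' | base64
YWJjQGRvbWFpbi5jb20=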
GOAL: To fetch the list of files occupying the most space on a Unix system, using the command below:
ssh serverName du /folderName/* | grep -v 'cannot' | sort -nr | head -10
sort -nr treats the sizes as numeric and sorts them in reverse, so the entries occupying the most space come first.
grep -v 'cannot' is used because a few folders are not accessible, and those error lines must be ignored before sorting.
Below is the sample output:
624 /folder1/folder2/conf
16 /folder1/folder2/error/include
192 /folder1/folder2/error
284 /folder1/folder2/htdocs
264 /folder1/folder2/icons/small
du: cannot read directory `/folder1/folder2/file1': Permission denied
du: cannot read directory `/folder1/folder2/file3': Permission denied
I'm facing issues with the grep and sort commands, as the error messages are not getting filtered out.
You need to redirect stderr to stdout using 2>&1 so that you can grep out the error messages. You should also escape the wildcard so that it gets expanded on the remote machine, not on the local one.
ssh serverName du /folderName/\* 2>&1 | grep -v 'cannot' | sort -nr | head -10
You don't need the grep if you close stderr.
ssh serverName du /folderName/\* 2>&- | sort -nr | head -10
Note that the wildcard is escaped.
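Alternatively, quoting the remote command makes both the glob expansion and the redirection happen in the remote shell, so the local file descriptors are left alone (same host and path as above):
ssh serverName 'du /folderName/* 2>/dev/null' | sort -nr | head -10
Sending the errors to /dev/null is also a little safer than closing stderr with 2>&-, since some programs complain when writing to a closed descriptor.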
I'm trying to extract a line from wget's output but am having trouble with it.
This is my wget call:
$ wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html
Output:
--18:24:12-- http://xxx.xxxx.xxxx:15000/myhtml.html
=> `-'
Resolving xxx.xxxx.xxxx... xxx.xxxx.xxxx
Connecting to xxx.xxxx.xxxx|xxx.xxxx.xxxx|:15000... connected.
HTTP request sent, awaiting response...
HTTP/1.1 302 Found
Date: Tue, 18 Nov 2008 23:24:12 GMT
Server: IBM_HTTP_Server
Expires: Thu, 01 Dec 1994 16:00:00 GMT
Location: https://xxx.xxxx.xxxx/siteminderagent/...
Content-Length: 508
Keep-Alive: timeout=10, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1
Location: https://xxx.xxxx.xxxx//siteminderagent/...
--18:24:13-- https://xxx.xxxx.xxxx/siteminderagent/...
=> `-'
Resolving xxx.xxxx.xxxx... failed: Name or service not known.
If I do this:
$ wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html | egrep -i "302"
it doesn't return the line that contains the string. I just want to check whether the site or SiteMinder is up.
The output of wget you are looking for is written to stderr. You must redirect it:
$ wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 | egrep -i "302"
wget prints the headers to stderr, not to stdout. You can redirect stderr to stdout as follows:
wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 | egrep -i "302"
The "2>&1" part says to redirect ('>') file descriptor 2 (stderr) to file descriptor 1 (stdout).
A slightly enhanced version of the already-provided solution:
wget -SO- -T 1 -t 1 http://myurl.com:15000/myhtml.html 2>&1 >/dev/null | grep -c 302
2>&1 >/dev/null trims off the unneeded output: grep then parses only wget's stderr, which eliminates the possibility of matching strings containing 302 in stdout (where the HTML file itself is written, along with the download progress bar, byte counts, etc.). :)
grep -c counts the number of matching lines instead of printing them; knowing how many lines matched is enough here.
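Against the transcript in the question, this would print 1: the single HTTP/1.1 302 Found header line is the only stderr line containing 302.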
wget --server-response http://www.amazon.de/xyz 2>&1 | awk '/^  HTTP/{print $2}'
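This prints just the numeric status code of each response, one per redirect hop: wget indents the header lines with two spaces, so the status line's second field is the code itself.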
Just to explain a bit: the -S switch in the original question is shorthand for --server-response.
Also, I know the OP specified wget, but curl is similar and prints to stdout by default.
curl --head --silent $yourURL
or
curl -I -s $yourURL
The --silent switch (-s) is only needed for grep-ability: it turns off the progress meter.
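For parity with the wget-plus-grep approach above, using the URL from the question:
curl -I -s http://myurl.com:15000/myhtml.html | egrep -i "302"
No redirection is needed here, because curl writes the headers to stdout.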
I found this question while trying to scrape response codes for large lists of URLs, after finding curl very slow (5+ seconds per request).
Previously, I was using this:
curl -o /dev/null -I --silent --head --write-out %{http_code} https://example.com
Building off Piotr and Adam's answers, I came up with this:
wget -Sq -T 1 -t 1 --no-check-certificate --spider https://example.com 2>&1 | egrep 'HTTP/1.1 ' | cut -d ' ' -f 4
This has a few bugs (e.g. redirects return 302 200), but overall it's greased lightning in comparison.
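If only the final status matters, the redirect quirk can be sidestepped by keeping just the last hop, building on the command above:
wget -Sq -T 1 -t 1 --no-check-certificate --spider https://example.com 2>&1 | egrep 'HTTP/1.1 ' | cut -d ' ' -f 4 | tail -n 1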