How to get a Plink command's response into a VBScript variable?

I am checking the number of files I have in a Unix directory, and I am trying to get that number into a VBScript variable.
My code:
set oShell = CreateObject("WScript.Shell")
oShell.Run "C:\PLINK.EXE -ssh user#host -pw abc find /my/files -name '*333*' | wc -l > C:\files\res.txt"
set oShell = Nothing
The above code didn't write anything to the .txt file. My plan was to read the .txt file with VBScript to get the count.
Is there a direct way to get the count returned by wc -l into a VBScript variable?
Thank you.

You need a shell (%comspec%) to get shell features like redirection. So change
oShell.Run "C:\PLINK.EXE -ssh user#host -pw abc find /my/files -name '*333*' | wc -l > C:\files\res.txt"
to
oShell.Run "%comspec% /c C:\PLINK.EXE -ssh user#host -pw abc find /my/files -name '*333*' | wc -l > C:\files\res.txt"
after you have checked that
C:\PLINK.EXE -ssh user#host -pw abc find /my/files -name '*333*' | wc -l > C:\files\res.txt
'works' from a console.
In case of trouble, study the docs for .Run (parameters, return value) and .Exec (Stdout/Stderr capturing), and simplify (e.g. drop the | wc -l).
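For the direct route, here is a minimal sketch using .Exec to capture stdout (untested; user@host and the password are the placeholders from the question). Quoting the pipeline makes it run on the remote side, so no local shell features are needed:
Dim oShell, oExec, sCount
Set oShell = CreateObject("WScript.Shell")
' run the whole pipeline remotely and read plink's stdout directly
Set oExec = oShell.Exec("C:\PLINK.EXE -ssh user@host -pw abc ""find /my/files -name '*333*' | wc -l""")
sCount = oExec.StdOut.ReadLine   ' blocks until the first line (the count) arrives
WScript.Echo "Count: " & sCount
Set oShell = Nothing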

Related

redirecting output of command run under xargs separately per-file

I'm trying to dump a load of sqlite tables to .csv format from the cmd line using xargs, i.e.
find . -name "*.db" -print0 \
| xargs -0 -I {} sqlite3 -header -csv {} "select * from pulse_data;" > {}.csv
For some reason it's writing everything to a single file literally named {}.csv rather than creating one .csv per .db file in turn.
Is there a problem with using the {} notation more than once?
If I run it without the redirect it seems to work as expected, i.e.
find . -name "*.db" -print0 \
| xargs -0 -I {} sqlite3 -header -csv {} "select * from pulse_data;"
just prints all the .db tables to stdout in .csv format, as expected.
How can I get it to redirect to a suitably named file, i.e. <file-name>.db.csv?
xargs doesn't start a shell, so you can't do shell redirections from it (unless your command is explicitly xargs sh -c ...). In the code given in the question, the > {}.csv redirection is performed by the parent shell before xargs is even started -- xargs isn't aware of the redirection attempt at all.
Since you aren't doing anything here that significantly benefits from xargs (like parallelism), the cost of launching shells under it isn't justified. Just read the list from your outer shell.
while IFS= read -r -d '' filename; do
sqlite3 -header -csv "$filename" "select * from pulse_data;" >"$filename.csv"
done < <(find . -name '*.db' -print0) # See footnote 1
If you really want to use xargs for some reason:
find . -name '*.db' -print0 \
| xargs -0 sh -c '
for filename; do
sqlite3 -header -csv "$filename" "select * from pulse_data;" >"$filename.csv"
done' _
(The trailing _ becomes $0 inside the sh -c script, so the for loop, which iterates over the positional parameters by default, sees exactly the file names xargs appends.)
Footnote 1: Note that <() is a feature not present in POSIX sh -- but ksh, zsh and bash will all have it; if you need compatibility with /bin/sh, you can pipe from find to the while loop instead, but be aware that this comes with some caveats.
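For reference, the pipe-from-find variant mentioned in the footnote looks like this; the main caveat is that the while loop then runs in a subshell, so variables set inside it are lost when the loop ends:
find . -name '*.db' -print0 |
while IFS= read -r -d '' filename; do
  sqlite3 -header -csv "$filename" "select * from pulse_data;" >"$filename.csv"
done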

Line count in UNIX by reading file name from another file

I have a file FILE1.TXT. It contains only one file name FILE2.TXT.
How will I find the record count / line count of FILE2.TXT using only FILE1.TXT? What I have already tried is:
cat FILE1.TXT | wc -l
But the above command did not work.
Actually, I need to display the output as below:
File name is FILE2.TXT and the count is 2.
To produce that output, I tried the following statement inside a script file:
echo "File name is "`cat FILE1.TXT`" and the count is " `wc -l < $(cat FILE1.TXT)`
But the above command did not work and gave error
syntax error at line 1: `(' unexpected
Your first command counts the lines in FILE1.TXT itself (a single line, the name FILE2.TXT), not in the file it names. For a POSIX-compliant shell, use:
wc -l $(cat FILE1.txt)
or, with Bash:
wc -l $(<FILE1.txt)
These will both report the file name (and will still work if there are multiple file names in FILE1.txt). If you don't want the file name reported (but there's only one name in the file), you could use:
wc -l < $(cat FILE1.txt)
wc -l < $(<FILE1.txt)
Alternatively, extracting the name explicitly first:
file=$(cat FILE1.txt | grep -o "FILE2.txt")
cat "$file" | wc -l

Solaris unix, c-shell, redirecting xargs executed command output

I have no choice about the C shell; it's what we use here.
So I want to walk the current directory and all sub-directories looking for files of the form *.utv, and egrep each one for a specific account number.
I tried something like this:
egrep -l "ACCOUNT NO: +700 " `find . -name "*.utv" ` | more
but got "Too many words from `` " message.
So using xargs because apparently I'm getting too many file names passed back to egrep command-line.
When I do this:
find . -name "*.utv" | xargs -n1 egrep -i -l '"ACCOUNT NO: +700 "' {} >&! /home/me/output.txt
"ps -ef" command shows:
% ps -ef | egrep -i "myuserid"
myuserid 20791 22549 0 18:19:38 pts/20 0:00 find . -name *.utv
myuserid 20792 22549 0 18:19:38 pts/20 0:00 xargs -n1 egrep -i -l "ACCOUNT NO: +700 "
myuserid 22774 20792 1 18:21:13 pts/20 0:04 egrep -i -l "ACCOUNT NO: +700 " ./01/130104_reportfile.utv
%
But I get no output in the "output.txt" file.
If I run the egrep part by hand in the same directory, I get a list of file names containing the account 700 string.
I'm sure it's just a matter of grouping, quoting proper, and/or having the redirect in the right place, but after quite a lot of trial-and-error (and searching here) I'm still not getting anywhere.
Any suggestions?
You only need either single quotes or double quotes (but not both) around the search, as in your original command:
find . -name "*.utv" | xargs -n1 egrep -i -l "ACCOUNT NO: +700 " {} >&! /home/me/output.txt
find . -name "*.utv" | xargs -n1 egrep -i -l 'ACCOUNT NO: +700 ' {} >&! /home/me/output.txt
I'd also drop the -n1, the -i and the {} from the command line. A trick to always get file names listed is to pass /dev/null as an extra name, but the -l already does that job here:
find . -name "*.utv" | xargs egrep -l 'ACCOUNT NO: +700 ' >&! /home/me/output.txt
You also need to enlighten the powers that be that the C shell is not good for programming. You can always add exec /bin/bash -l to your .login script (or use /bin/ksh instead of /bin/bash). I simply wouldn't have any truck with "you cannot use a sane, civilized shell" rules.
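If you do add the exec line, a guarded sketch for ~/.login (the $?prompt test is an assumption, so non-interactive logins are left alone):
# at the end of ~/.login (csh syntax)
if ( $?prompt ) then
    exec /bin/bash -l
endif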

Problem with plink output

I'm using plink to run a command on a Unix remote machine.
The command is:
ls -1trd testegrep.txt |tail -1 |xargs tail -f| grep 's';
The way I'm sending this command is by using a file with a set of commands like:
plink.exe -ssh -t -l user -pw pwd tst.url.pt -m commands.out
When I run the command this way, plink produces nothing; it seems to be waiting for input.
But if I run:
plink.exe -ssh -t -l user -pw pwd tst.url.pt "ls -1trd testegrep.txt |tail -1 |xargs tail -f| grep 's';"
I get the expected result.
I'm not using plink with a command file by choice: the test-automation software I use to run tests on remote hosts works this way.
Any thoughts on what is going wrong?
I tested the command you provided and it worked without problems.
Maybe the problem is related to:
The server's host key is not cached in the registry.
The path to the file is not correct.
The file is empty.
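To rule out the host-key item, one manual run from a console caches it (answer "y" at the prompt); "exit" here is just a harmless remote command:
plink.exe -ssh -l user -pw pwd tst.url.pt exit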
Include the server host key. Most importantly, you need to include the Unix profile using the -m parameter. You can also keep all your commands in the same file as the profile:
$Output = ((plink.exe -hostkey hostkey -l UNAME -i SSHKEY -P 22 -ssh server -batch -m PROFILE) | ? {$_ -ne ""})
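As an illustration, the file passed with -m might look like this (the profile line is an assumption; plink -m simply sends the file's contents to the remote shell):
# pull in the login environment, then run the command
. "$HOME/.profile"
ls -1trd testegrep.txt | tail -1 | xargs tail -f | grep 's'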

Command passed as argument to shell script

I want to pass a command to a shell script. The command is a grep command. While executing, I am getting the following errors; please help:
myscript.sh "egrep 'ERROR|FATAL' \*20100428\*.log | grep -v aString"
myscript.sh is a simple script:
#!/bin/ksh
cd log
$1
The errors are:
egrep: can't open |
egrep: can't open grep
egrep: can't open -v
egrep: can't open aString
The error arises because egrep sees |, grep, -v and aString as file names to search.
Try this:
eval "$1"
You can also call sh -c "$1" to run the first argument in a new shell, so that shell special characters like | are interpreted.
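A minimal corrected myscript.sh using the eval approach (a sketch; quoting "$1" keeps the command as a single string):
#!/bin/ksh
# change into the log directory; bail out if it is missing
cd log || exit 1
# re-evaluate the argument so pipes, quotes and globs are interpreted
eval "$1"
Called as before, but the backslashes are no longer needed (the double quotes already keep the caller's shell from expanding the globs; eval expands them later, inside log):
myscript.sh "egrep 'ERROR|FATAL' *20100428*.log | grep -v aString"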
