How to fix error message in Tcl script having command [exec bjobs] when no jobs are running?

I am running a Tcl script that contains the following lines:
set V [exec bjobs ]
puts "bjobs= ${V}"
When jobs are present it works properly, but when no jobs are running it shows an error like this:
No unfinished job found
while executing
"exec bjobs "
invoked from within
"set V [exec bjobs ]"
How can I avoid this kind of error?

It sounds to me like the bjobs program has a non-zero exit code in this case. The exec manual page includes this example in a subsection WORKING WITH NON-ZERO RESULTS:
To execute a program that can return a non-zero result, you should wrap
the call to exec in catch and check the contents of the -errorcode
return option if you have an error:
set status 0
if {[catch {exec grep foo bar.txt} results options]} {
    set details [dict get $options -errorcode]
    if {[lindex $details 0] eq "CHILDSTATUS"} {
        set status [lindex $details 2]
    } else {
        # Some other error; regenerate it to let caller handle
        return -options $options -level 0 $results
    }
}
This is more easily written using the try command, as that makes it
simpler to trap specific types of errors. This is done using code like
this:
try {
    set results [exec grep foo bar.txt]
    set status 0
} trap CHILDSTATUS {results options} {
    set status [lindex [dict get $options -errorcode] 2]
}
I think you could write this as:
try {
    set V [exec bjobs]
} trap CHILDSTATUS {message} {
    # Not sure how you want to handle the case where there's nothing...
    set V $message
}
puts "bjobs= ${V}"

if {[catch {exec bjobs} result]} {
    puts "bjobs has some issues. Reason : $result"
} else {
    puts "bjobs executed successfully. Result : $result"
}
Reference: the catch manual page.

Note carefully in the exec man page:
If any of the commands in the pipeline exit abnormally or are killed or suspended, then exec will return an error [...] If any of the commands writes to its standard error file and that standard error is not redirected and -ignorestderr is not specified, then exec will return an error.
So if bjobs exits non-zero or prints to stderr when there are no jobs, the exec call needs to be wrapped in catch or try, as Donal writes.
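A minimal sketch along those lines (untested, and assuming bjobs signals the empty case only through a non-zero exit and/or a message on stderr): merge the child's stderr into the captured output with 2>@1 and treat a CHILDSTATUS failure as "no jobs" rather than a fatal error.
set V ""
if {[catch {exec bjobs 2>@1} result options]} {
    # 2>@1 folds the child's stderr into the captured result, so a bare
    # "No unfinished job found" on stderr no longer raises an error by itself.
    if {[lindex [dict get $options -errorcode] 0] eq "CHILDSTATUS"} {
        # Non-zero exit: assume it just means "no jobs" and keep the message.
        set V $result
    } else {
        # Anything else (e.g. bjobs not found): rethrow for the caller.
        return -options $options -level 0 $result
    }
} else {
    set V $result
}
puts "bjobs= ${V}"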

Related

zsh: Do I need to close file descriptors?

I use the following code to both output something to stdout, and pipe it to a program:
function example() {
    local fd1
    {
        exec {fd1}>&1
        { echo hi >&$fd1 } | true
    } always { exec {fd1}>&- }
}
I am wondering if I can safely drop always { exec {fd1}>&- }. fd1 goes out of scope after the function finishes anyways.
You need to keep always { exec {fd1}>&- }. If you get rid of that, the variable containing the file descriptor will go out of scope, but the file descriptor won't be closed, resulting in leaking it. You can see this by doing ls -l /proc/$$/fd before and after running your function without that line. Each run of the function will permanently add another FD to that list. Eventually, you'll run out of file descriptors and won't be able to open any new ones, which will break things.
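As an illustration only (this sketch is not from the original answer, and the /proc check is Linux-specific), the leak becomes visible once the always block is dropped:
function example_leaky() {
    # Same as example(), but without the always block that closes the fd.
    local fd1
    exec {fd1}>&1
    { echo hi >&$fd1 } | true
    # The variable fd1 goes out of scope here, but the descriptor stays open.
}
ls -l /proc/$$/fd    # baseline list of open descriptors
example_leaky
ls -l /proc/$$/fd    # one extra descriptor now, and another after every call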

How to get exit status of R script run in shell script

Suppose I'm running an Rscript from inside this shell script:
#!/bin/bash
RES=$(./abc.R 100)
r_status=`echo $?`
There is some code in abc.R which stops its execution
#!/usr/bin/env Rscript
...
...
if (nrow(status) == 0) {
    stop("The list id is not present in requests table. Please check.")
} else if (status != 'COMPLETED') {
    stop("The list is not in COMPLETED state. Please check.")
}
...
...
I am not able to capture the exit status of abc.R in my shell script. It stops the R execution and even quits from the shell script back to the prompt.
Is there any way I can capture R's exit status?
Just run the script you want and make sure it returns the correct exit status when it finishes.
This should work:
#!/bin/bash
./abc.R 100
if [ $? -eq 0 ]; then
    echo "Your script exited with exit status 0"
    exit 0
fi
see more here:
http://tldp.org/LDP/Bash-Beginners-Guide/html/sect_08_02.html
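A hedged variant of the question's own snippet (untested): capture the output with command substitution and read $? immediately afterwards, before anything else overwrites it.
#!/bin/bash
RES=$(./abc.R 100)    # capture whatever abc.R prints on stdout
r_status=$?           # $? right after the assignment is abc.R's exit status
if [ "$r_status" -ne 0 ]; then
    echo "abc.R failed with exit status $r_status: $RES" >&2
    exit "$r_status"
fi
echo "abc.R succeeded: $RES"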

Capture and check bteq return code in Unix

I'm developing a script which in turn invokes several other scripts (.ksh). Basically, when one of them fails, the script shouldn't proceed to the next one. So I tried checking the return code in one script that involves a bteq (Basic Teradata Query) session. Please find the scenario below:
bteq <<EOF!
.run file ${TGTRUNFILEN} ;
.maxerror 1;
.set width 245;
...
...
sel * from table ;
.if ACTIVITYCOUNT <> 0 then .GOTO QUIT
.os mail command "error msg"
exit 1;
.LABEL QUIT
.quit;
EOF!
rcode=$?
echo $rcode
if [[ $rcode != 0 ]]
then
    echo "$0: Insufficient Perm Space : username " >&2
    exit 4
fi
Here, the script fails and I can see the log saying it failed with return code 1, but why isn't the text "$0: Insufficient Perm Space : Username" displayed? I think it exits the entire script, but I need this fixed somehow.
Can someone kindly help me on this?
Thanks a ton for responding. I found a way to overcome this: I just wrapped the bteq call in 'set' commands like this (presumably the script runs under set -e, so the shell was exiting as soon as bteq returned non-zero, before the check could run):
set +e
bteq <<EOF!
...
...
EOF!
rcode=$?
set -e
Works fine for me.
Cheers

Time command equivalent in PowerShell

What is the flow of execution of the time command in detail?
I have a user-created function in PowerShell which computes the execution time of a command in the following way:
It opens a new PowerShell window.
It executes the command.
It closes the PowerShell window.
It gets the different execution times using the GetProcessTimes function.
Is the time command in Unix also calculated in the same way?
The Measure-Command cmdlet is your friend.
PS> Measure-Command -Expression {dir}
You could also get execution time from the command history (last executed command in this example):
$h = Get-History -Count 1
$h.EndExecutionTime - $h.StartExecutionTime
I've been doing this:
Time {npm --version ; node --version}
With this function, which you can put in your $profile file:
function Time([scriptblock]$scriptblock, $name)
{
<#
.SYNOPSIS
    Run the given scriptblock, and say how long it took at the end.
.PARAMETER scriptBlock
    The script block to run and time.
.PARAMETER name
    Use this for long script blocks to avoid quoting the entire script block in the final output line.
.EXAMPLE
    time { ls -recurse }
.EXAMPLE
    time { ls -recurse } "All the things"
#>
    if (!$stopWatch)
    {
        $script:stopWatch = new-object System.Diagnostics.StopWatch
    }
    $stopWatch.Reset()
    $stopWatch.Start()
    . $scriptblock
    $stopWatch.Stop()
    if ($name -eq $null) {
        $name = "$scriptblock"
    }
    "Execution time: $($stopWatch.ElapsedMilliseconds) ms for $name"
}
Measure-Command works, but it swallows the stdout of the command being run. (Also see Timing a command's execution in PowerShell)
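One common workaround (a sketch, not part of the original answer) is to pipe to Out-Default inside the measured block, so the command's output still reaches the console while Measure-Command returns the elapsed time:
# dir's output is rendered to the host; Measure-Command still returns the TimeSpan.
Measure-Command { dir | Out-Default }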
If you need to measure the time taken by something, you can follow this blog entry.
Basically, it suggests using the .NET StopWatch class:
$sw = [System.Diagnostics.StopWatch]::startNew()
# The code you measure
$sw.Stop()
Write-Host $sw.Elapsed

Boost-build/BJam language - checking the value of a flag

I need to edit a .jam file used by boost-build for a specific kind of project. The official manual on the BJAM language says:
One of the toolsets that cares about DEF files is msvc. The following line should be added to it:
flags msvc.link DEF_FILE ;
Since the DEF_FILE variable is not used by the msvc.link action, we need to modify it to be:
actions link bind DEF_FILE
{
    $(.LD) .... /DEF:$(DEF_FILE) ....
}
Note the bind DEF_FILE part. It tells bjam to translate the internal target name in DEF_FILE to a corresponding filename in the link action.
So apparently just printing DEF_FILE with ECHO wouldn't work. How can it be expanded to a string variable or something that can actually be checked?
What I need to do is to print an error message and abort the build in case the flag is not set. I tried:
if ! $(DEF_FILE)
{
    errors.user-error "file not found" ;
    EXIT ;
}
but this "if" is always true
I also tried putting "if ! $_DEF_FILE {...}" inside the "actions" contained but apparently it is ignored.
I am not sure I understand the overall task you have. However, if you want to add a check for a non-empty DEF_FILE, expanding on the documentation bit you quote, you need to add the check in the msvc.link function.
If you have a command line pattern (specified with 'actions'), its content is what is passed to the OS for execution. But you can also have a function with the same name, which will be called before the actions are generated. For example, here's what the current codebase has:
rule link.dll ( targets + : sources * : properties * )
{
    DEPENDS $(<) : [ on $(<) return $(DEF_FILE) ] ;
    if <embed-manifest>on in $(properties)
    {
        msvc.manifest.dll $(targets) : $(sources) : $(properties) ;
    }
}
You can modify this code to additionally check:
if ! [ on $(<) return $(DEF_FILE) ] {
    ECHO "error" ;
}
