Autosys - Get Jobs list on a specific computer - autosys

I am trying to get the list of all the jobs running on a specific client.
When I run autorep -J ALL -q I get the following output:
/* ----------------- ### ----------------- */
insert_job: ### job_type: CMD
box_name: ###
command: ###
machine: MACHINE X
owner: autosys
permission:
date_conditions: ###
condition: ###
description: ###
box_terminator: 1
alarm_if_fail: 0
application: ###
/* ----------------- ### ----------------- */
insert_job: ### job_type: CMD
box_name: ###
command: ###
machine: MACHINE Y
owner: autosys
permission:
date_conditions: ###
condition: ###
description: ###
box_terminator: 1
alarm_if_fail: 0
application: ###
...
As you can see, this displays the list of ALL the jobs for ALL the clients.
What I expect to get is an output like the following:
/* ----------------- ### ----------------- */
insert_job: ### job_type: CMD
box_name: ###
command: ###
machine: MACHINE X
owner: autosys
permission:
date_conditions: ###
condition: ###
description: ###
box_terminator: 1
alarm_if_fail: 0
application: ###
/* ----------------- ### ----------------- */
insert_job: ### job_type: CMD
box_name: ###
command: ###
machine: MACHINE X
owner: autosys
permission:
date_conditions: ###
condition: ###
description: ###
box_terminator: 1
alarm_if_fail: 0
application: ###
...
Unfortunately the command autorep -J ALL -q -m MACHINE X doesn't do what I want; it gives me the following output:
/* ----------------- MACHINE X ----------------- */
insert_machine: MACHINE X
type: a
factor: 1.00
port: 7520
node_name: MACHINE X
agent_name: WA_AGENT
encryption_type: DEFAULT
opsys: windows
character_code: ASCII
I guess that this is the JIL definition used to add a machine, so not what I expect to get.
Do you know if what I am trying to do is possible using Autosys commands only, or if I have to parse the first output through some regex to finally obtain what I want?

I have created an Autosys JIL parser; using it you can get the required info into a CSV file, and after that you can filter out the information you need.
First you must have a file which contains the required columns to be extracted from the JIL, like:
$]cat Jill_Columns
insert_job
job_type
box_name
watch_file
watch_interval
command
date_conditions
machine
owner
permission
condition
days_of_week
exclude_calendar
start_times
description
std_out_file
std_err_file
alarm_if_fail
profile
application
and the JIL file which needs to be parsed to CSV.
Here is the script.
Command to run in a Unix box:
sh JillToCsv.sh Your_Jill_Columns Your_Jill_file
#!/bin/bash
Usage()
{
echo "----------------------------------------------------------------------------"
echo " "
echo " "
echo " "
echo "This script takes two parameter "
echo "First is the file which conatins column names like below for Jill information "
echo "cat Jill_ColumnOrder.txt"
echo "insert_job
job_type
box_name
command
date_conditions
machine
owner
permission
condition
days_of_week
exclude_calendar
start_times
description
std_out_file
std_err_file
alarm_if_fail
profile
application"
echo " "
echo " "
echo " "
echo "Second is the Jill File which needs to be parsed as csv "
echo " "
echo " "
echo " "
echo "----------------------------------------------------------------------------"
}
if [ $# -eq 2 ]
then
echo "File which contains column names : $1"
echo "Input Jill File : $2"
else
echo "Please pass manadatory parameters"
Usage
exit 1
fi
Jill_ColumnOrder=$1
tfile=$2
dos2unix $tfile
final=${tfile}_Parse.csv
rm -f $final
# Build the CSV header line, using #Akhilesh# as a temporary field delimiter
heading=`cat $Jill_ColumnOrder|sed 's/$/#Akhilesh#/g'|sed '$ s/#Akhilesh#$//g'|tr -d '\n'`
# Build an "empty row" template with the right number of delimiters
line=
filler=1
no_columns=`cat $Jill_ColumnOrder|wc -l`
while [ $filler -lt $no_columns ]
do
filler=`expr $filler + 1`
line=$line`echo -n "#Akhilesh#"`
#echo $line
done
#echo "$heading"
#echo " "
#echo "$line"
echo "$heading">$final
count=1
# Put job_type on its own line, drop blank lines, squeeze whitespace, then read attribute by attribute
cat $tfile|sed -r 's/job_type/\njob_type/g'|sed '/^$/d'|sed -r 's/\s+/ /g'|while read kk
do
echo "$kk"|sed -r 's/^[ ]+//g'|grep '\/\* -----------------' # >/dev/null
if [ $? -eq 0 ]
then
count=`expr $count + 1`
echo "$line">>$final
else
PP=`echo $kk|cut -d ':' -f1|tr -d ' '`
#echo "PP : $PP"
val=`echo $kk|cut -d ':' -f2-|sed -r 's/^[ ]+//g'|tr -d '"'|tr -d '\n'`
#echo $val
val=\"$val\"
field=`grep -nw $PP $Jill_ColumnOrder|cut -d':' -f1|tr -d '\n'`
#echo "field : $field"
if [ -z "$field" ]
then
echo "$PP not there in column list"
else
# Write the value into the matching column of the current CSV row
awk -F'#Akhilesh#' -v OFS='#Akhilesh#' -v var="$val" -v var1="$count" '{if(NR== var1) $'$field' = var;print}' $final > tmp.parse && mv tmp.parse $final
fi
fi
done
cat $final|sed 's/#Akhilesh#/,/g' > tmp.parse && mv tmp.parse $final
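Once the CSV exists you can filter it on the machine column. For example (a sketch: the input file name alljobs.jil and the machine name are illustrative, the machine column is number 8 per the Jill_Columns order shown above, and this assumes no commas inside the values):
sh JillToCsv.sh Jill_Columns alljobs.jil
awk -F',' '$8 == "\"MACHINE X\""' alljobs.jil_Parse.csv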

autorep does not support that kind of filtering as of yet. You will need to parse the output yourself.
Dave
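For instance, one way to do that parsing (a sketch, not an Autosys feature, it just post-processes the autorep output; the machine name is illustrative) is to let awk buffer each JIL block and print only the blocks whose machine: attribute matches:
autorep -J ALL -q | awk -v m="MACHINE X" '
    /^\/\* ---/ {                                 # a "/* ----- job ----- */" separator starts a new block
        if (keep) printf "%s", block
        block = ""; keep = 0
    }
    { block = block $0 "\n" }
    index($0, "machine: " m) == 1 { keep = 1 }    # remember that this block runs on the wanted machine
    END { if (keep) printf "%s", block }
'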

I have finally created a script that does it for me:
#!/bin/ksh
# Dump every job definition, split the JIL into one file per job, then keep
# only the blocks whose machine: attribute matches the first script argument.
autorep -J ALL -q > alljobs.jil
mkdir -p /tmp/autosystemp
i=0
while IFS= read -r line; do
    # each "/* ----- job ----- */" separator starts a new per-job file
    if [ `echo "$line" | grep -c '^/\* ---'` -eq 1 ]; then
        i=$((i+1))
    fi
    echo "$line" >> "/tmp/autosystemp/f$i"
done < alljobs.jil
rm -f /tmp/autosystemp/f0          # anything that appeared before the first separator
for file in /tmp/autosystemp/*; do
    while read -r line; do
        if [ `echo "$line" | grep -c "machine: $1"` -eq 1 ]; then
            cat "$file" >> "jobs-on-$1.jil"
        fi
    done < "$file"
done
rm -rf /tmp/autosystemp
To use it, just pass the machine name as an argument to the script.
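For example, if the script is saved as jobs_by_machine.ksh (the file name here is hypothetical):
ksh jobs_by_machine.ksh "MACHINE X"
which leaves the matching job definitions in jobs-on-MACHINE X.jil.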

Related

top: failed tty get in Linux

#!/bin/bash
echo """"""""
now=`date "+%m/%d/%y:%H:%M:%S"`
echo -e "\n\n $now ---CPU/MEM output--- \n\n"
cat /proc/cpuinfo
echo -e "\n\n"
cat /proc/meminfo
echo -e "\n\n"
echo `ulimit -a`
now=`date "+%m/%d/%y:%H:%M:%S"`
echo -e "\n\n $now ---top output--- \n\n"
export TERM=xterm
top -n 1 -U username
top -H -n 1 -U username
echo -e "\n END OF Script"
I tried using
export TERM
but I am still getting the error "failed tty get".
Please advise.
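One thing worth trying (a sketch, not a verified fix for this exact environment): when the script runs without a controlling terminal, for example from cron or nohup, top cannot initialise a tty, so run it in batch mode instead:
top -b -n 1 -U username      # -b (batch mode) needs no tty
top -b -H -n 1 -U username   # same, with the per-thread view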

Using the sed command to write back to the same file

I have the code below, which adds a LOGGER.info line after every function definition; I need to run it on a Python script, which is the requirement.
The only question is that this has to be written back to the same file, so that the new file has all these LOGGER.info statements below each function definition.
e.g. the file abc.py currently has the code below:
def run_func(sql_query):
    return run_func(sql_query)
and the script should recreate the same abc.py file but with all the LOGGER.info lines added to it:
def run_func(sql_query):
    LOGGER.info('MIPY_INVOKING run_func function for abc file in directory')
    return run_func(sql_query)
I am not able to get the sed to write to the new file (with the same file name) so that the original file gets replaced and ends up with all the LOGGER.info statements in it.
for i in $(find * -name '*.py');
do echo "#############################################" | tee -a auto_logger.log
echo "File Name : $i" | tee -a auto_logger.log
echo "Listing the python files in the current script $i" | tee -a auto_logger.log
for j in $(grep "def " $i | awk '{print $2}' | awk -F"(" '{print $1}');
do
echo "Function name : $j" | tee -a auto_logger.log
echo "Writing the INVOKING statements for $j function definition" | tee -a auto_logger.log
grep "def " $i |sed '/):/w a LOGGER.info (''INVOKING1 '"$j"' function for '"$i"' file in sam_utilities'')'
if [ $? -ne 0 ] ; then
echo " Auto Logger for $i filename - Not Executed Successfully" | tee -a auto_logger.log
else
echo "Auto Logger for $i filename - Executed Successfully" | tee -a auto_logger.log
fi
done
done
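A minimal sketch of one way to do the in-place edit, assuming GNU sed (for -i and \n in the replacement) and function definitions that sit on a single un-indented line ending in "):"; the log message text is illustrative, not the exact wording required:
for i in $(find . -name '*.py'); do
    for j in $(grep "^def " "$i" | awk '{print $2}' | awk -F'(' '{print $1}'); do
        # -i rewrites the file in place; \1 re-emits the matched def line and \n starts the inserted line
        sed -i "s|^\(def $j(.*):\)\$|\1\n    LOGGER.info('INVOKING $j function for $i file')|" "$i"
    done
done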

Unix loop, if condition and exit command

I am facing an issue: I have to delete files from some folders given in Path.lst.
The entire script is working fine, but when a wrong path is given in Path.lst the script exits the loop and performs no operation on the next paths.
However, the last line
echo -e "\n ENDING SCRIPT SUCCESSFULLY ON `date` " >> $LOG_FILE
still gets executed, because exit 1 is not working in this part:
if [ ! -d $path ]
then
echo -e "\nERROR :$path IS INVALID." >> $LOG_FILE
echo -e "\nENDING SCRIPT WITH ERRORS ON `date`" >> $LOG_FILE
exit 1
---------------------------------------------------------------------------------------
THE SCRIPT IS LIKE :
echo -e "\nSTARTING SCRIPT ON `date`">> $LOG_FILE
if [ $1 -gt 0 ]
then
DAYS_BFOR="$1"
else
echo -e "\nERROR :Please pass a single positive integer to the script" >>$LOG_FILE
echo -e "\nENDING SCRIPT WITH ERRORS ON `date` " >> $LOG_FILE
exit
fi
cat Path.lis | sed 's|^PATH[0-9]*=||g' | while read path
do
if [ ! -d $path ]
then
echo -e "\nERROR :$path IS INVALID." >> $LOG_FILE
echo -e "\n ENDING SCRIPT WITH ERRORS ON `date` " >> $LOG_FILE
exit 1
else
echo -e "\nFILES DELETED FROM THE "$path" DIRECTORY --" >> $LOG_FILE
find $path -type f -mtime +$DAYS_BFOR -printf "%TY-%Tm-%Td %kKB %p\n" | column -t | sed "s|"$path"||g" >> $LOG_FILE 2>&1
file_count=`find $path -type f -mtime +$DAYS_BFOR | wc -l`
if [ $file_count -ge 1 ]
then
find $path -type f -mtime +$DAYS_BFOR | xargs rm 2>>$LOG_FILE 2>&1
fi
fi
done
echo Exit Status : $?
echo -e "\n ENDING SCRIPT SUCCESSFULLY ON `date`" >> $LOG_FILE
Please help and explain the reason as well.
If you only want the "ENDING SCRIPT SUCCESSFULLY" message to appear when files were successfully deleted, and not when an invalid path was given, you could just move the last two echo lines up to the end of the else branch, like this:
else
echo -e "\nFILES DELETED FROM THE "$path" DIRECTORY --" >> $LOG_FILE
find $path -type f -mtime +$DAYS_BFOR -printf "%TY-%Tm-%Td %kKB %p\n" | column -t | sed "s|"$path"||g" >> $LOG_FILE 2>&1
file_count=`find $path -type f -mtime +$DAYS_BFOR | wc -l`
if [ $file_count -ge 1 ]
then
find $path -type f -mtime +$DAYS_BFOR | xargs rm 2>>$LOG_FILE 2>&1
fi
echo Exit Status : $?
echo -e "\n--------------------------- ENDING SCRIPT SUCCESSFULLY ON `date` ----------------------------------" >> $LOG_FILE
fi
done
If you just want to skip to the next item in the Path.lis file, then remove the exit statement from the loop. That way the script will continue to execute until all the lines in the file have been read, and it will just log an error when the current entry is not a valid path.
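As for the reason: cat Path.lis | while read path runs the while loop in a subshell (the right-hand side of a pipeline), so exit 1 only terminates that subshell and the parent script carries on to the final "ENDING SCRIPT SUCCESSFULLY" echo. A minimal sketch of a loop that can really stop the script, assuming bash (for the process substitution) and reusing the file and variable names from the question:
while read -r path; do
    if [ ! -d "$path" ]; then
        echo -e "\nERROR :$path IS INVALID." >> "$LOG_FILE"
        echo -e "\nENDING SCRIPT WITH ERRORS ON `date`" >> "$LOG_FILE"
        exit 1    # the loop runs in the current shell now, so this ends the whole script
    fi
    # ... existing deletion logic here ...
done < <(sed 's|^PATH[0-9]*=||' Path.lis)
echo -e "\n ENDING SCRIPT SUCCESSFULLY ON `date`" >> "$LOG_FILE"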

Function to list directories and files doesn't work

#!/bin/bash

a=$(ls $@)
echo $a
for var in $a
do
    if [ -d $var ]
    then
        echo "$var is a directory"
    elif [ -f $var ]
    then
        echo "$var is a file"
    fi
done
The name of the script is test2. If I type "sh test2 ." in the shell, it shows which entries are files and which are directories in the current directory. However, if I type "sh test2 ~" it doesn't classify anything, it just lists the names. Why does it not show the files and directories in the home directory?
I'm not sure what the actual problem with your script is; it's likely that ls is not giving output that can be reliably parsed. But here's a version you might want:
#!/bin/bash
shopt -s nullglob ## Prevents patterns from expanding to themselves if no match is found.
shopt -s dotglob  ## Includes files starting with .
files=()
if [[ $# -eq 0 ]]; then
    files=(*)
else
    for arg; do
        files+=("$arg"/*)
    done
fi
echo --------------------
printf '%s\n' "${files[@]}"
echo --------------------
for file in "${files[@]}"; do
    if [[ -d $file ]]; then
        echo "$file is a directory."
    elif [[ -f $file ]]; then
        echo "$file is a file."
    fi
done
echo --------------------
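For what it's worth, the likely reason the original script misbehaves with "sh test2 ~": ls ~ prints bare names relative to your home directory, and the -d / -f tests then look those names up relative to the current working directory, where they usually don't exist. You can see the effect directly (Documents is just an illustrative name):
ls ~ | head -3                  # bare names such as Documents
[ -d Documents ]; echo $?       # tested against the current directory, so usually 1 (false)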

SAS Unix Shell Script - Print Contents of Table or Macro Variables

I figured it out.
GREPOUT=`grep "NOTE: Table $TABLE created," $LOGFILE | awk '{print $6}'`
NIW=`grep "SYMBOLGEN: Macro variable NIW resolves to" $LOGFILE | awk '{print $0}'`
if [ "$GREPOUT" -gt "0" ]; then
echo "$NIW" |\
$MAILX -s "SUCCESSFUL BATCH RUN: $PROG $RPTDATE" $MAILLIST
fi
From the body of the sent email:
SYMBOLGEN: Macro variable NIW resolves to 8
My script runs a SAS program and sends out an email after it completes.
I'm looking to print the contents of a table or a list of macro variables in the email.
The SAS code has a %put _all_; statement at the end, so all the macro variables are listed in the log.
Thanks.
#If it's gotten this far, we can safely grab the number of rows
#of output from $LOGFILE.
GREPOUT=`grep "NOTE: Table $TABLE created," $LOGFILE | awk '{print $6}'`
NIW=`grep "GLOBAL NIW" $LOGFILE | awk '{print $6}'`
if [ "$GREPOUT" -gt "0" ]; then
#echo "$GREPOUT rows found in $TABLE." |\
echo "$NIW NIW" |\
$MAILX -s "SUCCESSFUL BATCH RUN: $PROG $RPTDATE" $MAILLIST
else
echo "$GREPOUT rows found in $TABLE." |\
$MAILX -s "SUCCESSFUL BATCH RUN: $PROG $RPTDATE" $MAILLIST
fi
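If you want every macro variable in the email body rather than just NIW, one option (a sketch; ALLVARS is a new name, the other variables are reused from the snippets above, and %put _all_; writes lines of the form "GLOBAL NAME value" to the log) is to grab all the GLOBAL lines at once:
ALLVARS=`grep "^GLOBAL " $LOGFILE`
echo "$ALLVARS" | $MAILX -s "SUCCESSFUL BATCH RUN: $PROG $RPTDATE" $MAILLIST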
