Autosys command to retrieve job names and status only, using job_depends - autosys

I'm using job_depends to retrieve a forecast of jobs during a time period; I'm looking to find only the jobs that are ON_ICE or ON_HOLD during that period.
current query:
job_depends -c -j %2 -F %3 -T %4 >> %FileName%
This brings back information I don't need in this case, such as the dependencies, conditions, etc.:
                                       Start  Dependent
Job Name                 Status   Date Cond?  Cond?  Jobs?
--------                 ------   ----------  -----  -----
1CCS.UATQA.ACE_EXTRACT.C ON_HOLD  Met         No     Yes

Dependent Job Name       Condition
------------------       ---------
1CCS.UATQA.ACE_EXPORT.C  SUCCESS(1CCS.UATQA.ACE_EXTRACT.C)
All I need is the job name and current status, for 'ON_ICE' and 'ON_HOLD' jobs only.

I used the find and findstr commands to remove the unwanted lines.
Add the required words to the command below according to your requirements. It also removes blank lines.
type Temp.txt | find /V "ON_ICE" | find /V "Condition" | find /V "-------" | find /V "success" | find /V "______"| find /V ".FW"| find /V "(" |FINDSTR /V /R /C:"^$">FilteredFile.log
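Rather than deleting every unwanted line, it can be simpler to keep only the wanted ones. On Windows that would be `findstr /C:"ON_ICE" /C:"ON_HOLD" Temp.txt`; a portable sketch of the same idea with awk is below (the sample job names and column layout are assumptions based on the output shown above):

```shell
# Simulated job_depends output; the real layout may differ slightly.
sample='1CCS.UATQA.ACE_EXTRACT.C ON_HOLD Met No Yes
1CCS.UATQA.ACE_EXPORT.C SUCCESS(1CCS.UATQA.ACE_EXTRACT.C)
1CCS.UATQA.ACE_LOAD.C ON_ICE Met No No'

# Keep only lines whose second field is ON_ICE or ON_HOLD,
# printing just the job name and its status.
printf '%s\n' "$sample" |
  awk '$2 == "ON_ICE" || $2 == "ON_HOLD" { print $1, $2 }'
```

This inverts the filtering logic: a whitelist of the two statuses you care about, instead of a growing blacklist of everything else.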

msys / mingw command line requires or rejects exe extension based on whether stdout is redirected with ConEmu console

The underlying cause of this problem is described elsewhere, with partial workarounds provided.
For example: stdout is not a tty and stdin is not a tty
An example of a command line I'm having problems with in MSYS2 or MINGW64 environments is this:
# psql -c '\d' | grep public
stdout is not a tty
Here's the output, if issued without piping to grep:
# psql -c '\d'
List of relations
Schema | Name | Type | Owner
--------+----------------+----------+----------
public | tagname | table | postgres
public | tagname_id_seq | sequence | postgres
(2 rows)
In order to redirect stdout, it's apparently necessary to edit the command line, changing psql to psql.exe. This modification does the trick:
# psql.exe -c '\d' | grep public
public | tagname | table | postgres
public | tagname_id_seq | sequence | postgres
This version works, whether or not stdout is redirected:
# psql.exe -c '\d'
List of relations
Schema | Name | Type | Owner
--------+----------------+----------+----------
public | tagname | table | postgres
public | tagname_id_seq | sequence | postgres
(2 rows)
Note that the problem only exists for some programs. For example, /usr/bin/find is indifferent to whether the .exe extension is specified. Also, the cygwin version of psql.exe does not suffer from this limitation.
The workaround of appending .exe could be hidden with an alias if you could always call psql as psql.exe, but there are problems with interactive sessions when called with the extension. (in ConEmu terminal under MSYS_NT-10.0-19042, for example)
So my question is this: is it possible to create a wrapper program (for example, in golang or C, or a shell script) to hide this problem? The goal would be to support optionally redirectable command lines without requiring the .exe extension.
Assuming the wrapper is named "fixtty", it would spawn a command line that could be redirected or not. In other words, neither of these command lines would fail with an error message:
# fixtty psql -c '\d'
# fixtty psql -c '\d' | grep public
A wrapper script (call it fixtty) that hides the problem:
#!/bin/bash
if [ -t 1 ]; then
  # stdout is a terminal
  "$@"
else
  # stdout is a file or pipe
  PROG=$1 ; shift
  set -- "$PROG.exe" "$@"   # append .exe suffix
  "$@"
fi
Usage:
$ fixtty psql -c '\d' # unfiltered `stdout`
$ fixtty psql -c '\d' | grep public # filtered
$ fixtty psql # launch interactive session
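The suffix-append decision can also be factored into a small helper that additionally checks that the .exe binary actually exists on PATH, so commands without a Windows counterpart run unchanged. This is a sketch, not part of the original answer; the tty state is passed in as a parameter so the logic can be exercised directly:

```shell
# resolve_prog NAME IS_TTY -- decide which binary name to invoke.
# Appends .exe only when stdout is NOT a tty and NAME.exe is on PATH.
resolve_prog() {
  name=$1
  is_tty=$2   # pass 1 when stdout is a terminal, 0 otherwise
  if [ "$is_tty" != 1 ] && command -v "$name.exe" >/dev/null 2>&1; then
    printf '%s\n' "$name.exe"
  else
    printf '%s\n' "$name"
  fi
}

# In the real wrapper you would then do something like:
#   prog=$(resolve_prog "$1" "$([ -t 1 ] && echo 1 || echo 0)"); shift
#   exec "$prog" "$@"
```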

Autosys command to identify jobs which are ON ICE | ON HOLD

As part of a migration process, we need to identify the jobs/boxes that are kept ON HOLD/ON ICE, since we need to delete those unused jobs.
Please let me know the AutoSys command to list these unused jobs.
Thanks
If you're doing this manually, then a simple grep/find should work for you, for example:
in Unix:
autorep -wj ALL | grep OI
autorep -wj ALL | grep OH
or in a Windows agent CMD:
autorep -wj ALL | find "OI"
(use "OH" instead for on-hold jobs)
However, if you plan to automate this, you can write a simple program that uses the above commands as a base, then parse the output.
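A sketch of what that parsing could look like, run here against simulated autorep output (the column layout below is an assumption; check your agent's actual output, where the status sits in a fixed column near the end of each row):

```shell
# Simulated 'autorep -wj ALL' output; real column widths may differ.
report='Job Name        Last Start        Last End          ST Run
--------        ----------        --------          -- ---
JOB_A           01/01/2024 10:00  01/01/2024 10:05  SU 123/1
JOB_B           -----             -----             OI 0/0
BOX_C           -----             -----             OH 0/0'

# Print job name and status for ON_ICE (OI) / ON_HOLD (OH) entries.
# Status is assumed to be the second-to-last whitespace field.
printf '%s\n' "$report" |
  awk '$(NF-1) == "OI" || $(NF-1) == "OH" { print $1, $(NF-1) }'
```

In a real automation you would replace the `printf` with `autorep -wj ALL` and feed the result into whatever deletes or reports the jobs.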

grep a log file for a specific word that occurred today

I have to search how many times we have received a specific exception on the current date. I am using the command below, but it doesn't work.
This command shows every ClassCastException or NumberFormatException that has occurred until now.
I just want to know how many times ClassCastException or NumberFormatException occurred on today's date only.
grep $(date +"%Y-%m-%d") /*.* |grep -ioh "ClasscastException\|NumberFormatException" /logs/*.* | sort | uniq -c | sort -r
grep -ioh "ClasscastException\|NumberFormatException" /logs/*.* | sort | uniq -c | sort -r
Above command gave me no of count for ClassCastException and NumberFormatException in log file for all dates. I just want for today's date count.
The first command should work after removing the /logs/*.* argument from the second grep:
grep $(date +"%Y-%m-%d") /logs/*.* | grep -ioh "ClasscastException\|NumberFormatException" | sort | uniq -c | sort -r
grep works on the files that are given as argument, otherwise, it defaults to stdin.
Since files are supplied as arguments to the second grep in this pipeline, it discards the output from the first grep and searches for the pattern in all the files under the log directory again.

Concatenating input to svn list command with output, then pass it to grep

I currently have the following shell command which is only partially working:
svn list $myrepo/libs/ |
xargs -P 10 -L 1 -I {} echo $myrepo/libs/ {} trunk |
sed 's/ //g' |
xargs -P 20 -L 1 svn list --depth infinity |
grep .xlsx
where $myrepo corresponds to the svn server address.
The libs folder contains a number of subfolders (currently about 30, though eventually up to 100), each of which contains a number of tags, branches and a trunk. I wish to get a list of the xlsx files contained only within the trunk folder of each of these subfolders. The command above works fine; however, it only returns the path relative to $myrepo/libs/subfolder/trunk/, so I get this back:
1/2/3/file.xlsx
Because of the potentially large number of files I would have to search through, I am performing this in two parallel steps using xargs -P (I do not have, and cannot use, GNU parallel). I am also trying to do this in one command so it can be used from php/perl/etc. and avoid multiple system calls.
What I would like to do is concatenate the input to this part of the command:
xargs -P 20 -L 1 svn list --depth infinity
with the output from it, to give the following:
$myrepo/libs/subfolder/trunk/1/2/3/file.xlsx
Then pass this to the grep to find the xlsx files.
I appreciate any assistance that could be provided.
If I manage to correctly divine your intention, something like this might work for you.
svn list "$myrepo/libs/" |
xargs -P 20 -n 1 sh -c 'svn list -R "$0/trunk/$1" |
sed -n "s%.*\.xlsx$%$0/trunk/$1/&%p"' "$myrepo"
Briefly, we postprocess the output from the inner svn list to filter to just .xslx files and tack the full SVN path back on at the same time. This way, the processing happens where the repo path is still known.
We hack things a bit by passing in "$myrepo" as "$0" to the subordinate sh so we don't have to export this variable. The input from the outer svn list comes as $1.
(The repos I have access to have a slightly different layout so there could be a copy/paste error somewhere.)
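The "$0" trick is easy to see in isolation: with `sh -c 'script' ARG0 ARG1`, ARG0 becomes $0 inside the script and ARG1 becomes $1. The repo URL below is hypothetical; in the real pipeline $1 is supplied per line by xargs:

```shell
# The first argument after the script body fills $0, the next fills $1,
# so a value can be threaded into the child shell without exporting it.
sh -c 'printf "%s\n" "$0/trunk/$1"' "https://svn.example.com/repo" "subfolder"
```

This avoids both `export` and the quoting headaches of interpolating "$myrepo" directly into the single-quoted script.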

how to delete all files except the latest three in a folder

I have a folder which contains some subversion revision checkouts (these are checked out when running a capistrano deployment recipe).
What I really want is to keep the latest 3 revisions that the capistrano script checks out and delete the older ones, so I am planning to run a command on the terminal via run. Capistrano actually has nothing to do here; it is just a unix command.
I was trying to run a command to get a list of files excluding the latest three and delete the rest. I could get the list of files using the following command:
(ls -t /var/path/to/folder |head -n 3; ls /var/path/to/folder)|sort|uniq -u|xargs
Now if I add rm -Rf to the end of this command, it returns "file not found" errors. That is expected, because the command returns only the names of the folders, not their full paths.
Is there any way to delete these files/folders using one unix command?
Alright, there are a few things wrong with your script.
First, and most problematically, is this line:
ls -t /var/path/to/folder |head -n 3;
ls -t will return a list of files in order of their last modification time, starting with the most recently modified. head -n 3 says to only list the first three lines. So what this is saying is "give me a list of only the three most recently modified files", which I don't think is what you want.
I'm not really sure what you're doing with the second ls command, but I'm pretty sure that's just going to concatenate all the files in the directory into your list. That means when it gets sorted and uniq'ed, you'll just be left with an alphabetical list of all the files in that directory. When this gets passed to something like xargs rm, you'll wipe out everything in that directory.
Next, a plain sort | uniq doesn't need the separate uniq; sort -u does the same. (Note that uniq -u, as used here, is different: it prints only the lines that are not duplicated at all, which is what makes the set-difference trick work.) You don't need this part anyway.
Finally, the actual removal of the directory. On that part, you had it right in your question: just use rm -r
Here's the easiest way I can think to do this:
ls -t1 /var/path/to/folder | tail -n +4 | xargs rm -r
Here's what's happening here:
ls -t1 is printing a list, one file/directory per line, of all files in /var/path/to/folder, ordering by the most recent modification date.
tail -n +4 is printing all lines in the output of ls -t1 starting with the fourth line (i.e. the three most recently modified files won't be listed)
xargs rm -r says to delete any file output from the tail. The -r means to recursively delete files, so if it encounters a directory, it will delete everything in that directory, then delete the directory itself.
Note that I'm not sorting anything or removing any duplicates. That's because:
ls only reports a file once, so there are no duplicates to remove
You're deleting every file passed anyway, so it doesn't matter in what order they're deleted.
Does all of that make sense?
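The whole thing can be verified in a scratch directory (created here with mktemp so nothing real is touched); five files get distinct timestamps, and the pipeline keeps only the three newest. GNU xargs also has a `-r` flag to skip running rm when the list is empty, omitted here for portability:

```shell
# Set up a demo directory with five files of increasing mtime.
dir=$(mktemp -d)
cd "$dir"
touch -t 202401010000 old1
touch -t 202401020000 old2
touch -t 202401030000 keep1
touch -t 202401040000 keep2
touch -t 202401050000 keep3

# Newest-first listing; drop the first three lines; remove the rest.
# The '--' guards against filenames that begin with a dash.
ls -t1 | tail -n +4 | xargs rm -r --
ls
```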
Edit:
Since I was wrong about ls specifying the full path when passed an absolute directory, and since you might not be able to perform a cd, perhaps you could use tail instead.
For example:
ls -t1 /var/path/to/folder | tail -n +4 | xargs -I{} find /var/path/to/folder -name "{}" | xargs rm -r
Here is another way of doing the task:
for Linux and HP-UX:
ls -t1 | tail -n +50 | xargs rm -r # to leave latest 50 files/directories.
for SunOS:
rm `(ls -t |head -n 100; ls)|sort|uniq -u`
I found a way to do this using the shell's && operator, so the command looks like this:
cd /var/path/to/folder && ls -t1 /var/path/to/folder | tail -n +4 | xargs rm -r
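One caveat with the cd && form: when run in the current shell it leaves you inside the target folder afterwards. Wrapping it in a subshell keeps the caller's working directory untouched; a self-contained sketch using a throwaway demo directory (four files, oldest removed):

```shell
# Demo directory with four files of increasing mtime.
dir=$(mktemp -d)
touch -t 202401010000 "$dir/a"
touch -t 202401020000 "$dir/b"
touch -t 202401030000 "$dir/c"
touch -t 202401040000 "$dir/d"

# The ( ... ) subshell confines the cd, so $PWD is unchanged after it.
( cd "$dir" && ls -t1 | tail -n +4 | xargs rm -r -- )
```

Since the relative names produced by ls are resolved inside the subshell's cd, there is no need to repeat the folder path in the pipeline.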