Stopping an Autosys file watcher job at a specific time

I am working on the below requirement to set up Autosys jobs.
Requirement:
Continuously monitor a location for file arrival, checking every 3 minutes, between 8 AM and 8 PM
Trigger the import job when the file is received, and keep monitoring for further files until 8 PM
File monitoring needs to stop at 8 PM
What I implemented:
BOX with run_window: 8:00 - 20:00, start_mins: "05,10,15,20,25,30,35,40,45,50,55"
File watcher job with term_run_time: 4, box_terminator: y
Import job: runs if the file watcher job completes successfully
With the above setup, I am able to achieve the following:
When the file arrives, the file watcher job runs to success and triggers the import job, and the box completes green. The same jobs are loaded again at the box's next scheduled start time, which is every 5 minutes in this case.
If the file does not arrive, the file watcher job gets auto-terminated and terminates the box as well, and the jobs are loaded again at the box's next scheduled start time. (But since I am terminating the file watcher job and the box, this might cause unnecessary alerts in Prod.)
If I don't terminate the file watcher job after every 4 minutes, it won't stop at 8 PM at night, which I do not want.
So I am wondering whether there is any solution that stops my file watcher job at 8 PM, so that I don't have to terminate the file watcher job/box.
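For reference, a minimal JIL sketch of the setup described above; the job names, machine, and file path are hypothetical placeholders:

    insert_job: feed_watch_box
    job_type: BOX
    date_conditions: 1
    days_of_week: all
    run_window: "8:00 - 20:00"
    start_mins: 05,10,15,20,25,30,35,40,45,50,55

    insert_job: feed_watch_fw
    job_type: FW
    box_name: feed_watch_box
    machine: host01
    watch_file: /data/incoming/feed.csv
    /* watch_interval is in seconds: 180 = 3 minutes */
    watch_interval: 180
    /* term_run_time is in minutes */
    term_run_time: 4
    box_terminator: y

    insert_job: feed_import
    job_type: CMD
    box_name: feed_watch_box
    machine: host01
    command: /apps/scripts/import_feed.sh
    condition: s(feed_watch_fw)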

The termination after 4 minutes of runtime is the easiest solution.
I would probably take the FT outside of the box so that your box is not running, and on the success of the FT start the box that contains the import.
That would be a better solution.
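As a hedged JIL sketch of that layout (names hypothetical; the start_times and term_run_time values are assumptions added here to enforce the 8 AM start and 8 PM stop):

    insert_job: feed_watch_fw
    job_type: FW
    machine: host01
    watch_file: /data/incoming/feed.csv
    /* poll every 3 minutes (value is in seconds) */
    watch_interval: 180
    date_conditions: 1
    days_of_week: all
    start_times: "08:00"
    /* assumption: 720 minutes = 12 hours, so the watcher stops at 8 PM */
    term_run_time: 720

    insert_job: feed_import_box
    job_type: BOX
    condition: s(feed_watch_fw)

    insert_job: feed_import
    job_type: CMD
    box_name: feed_import_box
    machine: host01
    command: /apps/scripts/import_feed.sh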

Related

AutoSys box success condition - Successful as long as sub-box was not a failure

AutoSys 11.3
I have a top-level box (Box-A) that runs every 15 minutes. It has multiple jobs plus Box-B.
Box-B (a member of Box-A), contains a file watcher job, and a script job.
Box-B will terminate after 14 minutes if it is still running (i.e., the file watcher didn't find the file).
I would like Box-A to show Success as long as Box-B doesn't fail. Put another way: if Box-B succeeds or is terminated, AND all the other jobs succeed, Box-A should report Success.
Is this possible?
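One documented JIL attribute that seems to fit is box_success, which overrides the default "all members succeeded" rule for a box; a hedged sketch, with hypothetical member job names:

    update_job: Box-A
    /* Box-A succeeds if its own jobs succeed and Box-B either succeeds or was terminated */
    box_success: s(job_1) & s(job_2) & (s(Box-B) | t(Box-B))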

Airflow schedule single dag for multiple schedules throughout the day

I want to have a single DAG to download data from an FTP site. I don't need all the data on the FTP, just certain files. The files are uploaded daily at certain times throughout the day, and I want to retrieve these files shortly after they become available on the FTP site.
Ex FTP schedule:
/Data/US/XASE/yyyymmdd.csv #uploaded daily at 9:00 PM UTC
/Data/EU/TRWB/yyyymmdd.csv #uploaded daily at 1:00 PM UTC
...
/Data/EU/XEUR/yyyymmdd.csv #uploaded daily at 11:00 AM UTC
How can I set the schedule in the DAG so that I can copy the data from the FTP site as the files are uploaded, without having a separate DAG for each upload time?
I think you have three options for scheduling here.
Option 1
You run exactly at 11 AM, 1 PM, and 9 PM UTC with the following schedule: 0 11,13,21 * * *. Or maybe 5 minutes after the full hour to add some buffer (5 11,13,21 * * *).
Option 2
You run the DAG more regularly and check within the task whether the files are available, downloading them if so. This makes sense if there is a higher chance that the file upload is delayed (a sketch of such a check follows below).
For example, */10 10-22 * * * would run every 10 minutes between 10:00 and 22:50.
Option 3
You schedule the DAG once per day (e.g. @daily) and then work with TimeDeltaSensor. I think this option is the least preferable, as you have a lot of tasks just "waiting", which can block the execution of other Airflow tasks.
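A rough sketch of the Option 2 availability check with plain ftplib; the host, credentials, and paths are hypothetical:

    from ftplib import FTP, error_perm

    def file_available(path: str) -> bool:
        """Return True if the remote file already exists on the FTP site."""
        with FTP("ftp.example.com") as ftp:    # hypothetical host
            ftp.login("user", "password")      # hypothetical credentials
            ftp.voidcmd("TYPE I")              # some servers reject SIZE in ASCII mode
            try:
                return ftp.size(path) is not None   # SIZE fails for missing files
            except error_perm:
                return False

    # e.g. guard the download with:
    # if file_available("/Data/EU/XEUR/20230103.csv"): ...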
Besides that, it also depends heavily on how you want to handle the download from the FTP itself.
I guess you could create one download task per file per day and put a task based on BranchPythonOperator in front of them, to avoid trying to download the same file multiple times.
Or you could put the whole logic into a PythonOperator, including logic that downloads only certain files based on execution_date.
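As a minimal sketch of that last variant, assuming Airflow 2.2+; the FILES mapping and the download step are hypothetical, and it keys off data_interval_end because execution_date refers to the previous schedule tick, not the current fire time:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical mapping: FTP path template -> expected upload hour (UTC)
    FILES = {
        "/Data/US/XASE/{ymd}.csv": 21,
        "/Data/EU/TRWB/{ymd}.csv": 13,
        "/Data/EU/XEUR/{ymd}.csv": 11,
    }

    def download_due_files(data_interval_end=None, **_):
        # Fetch only the files whose upload hour matches this run's fire time.
        for template, upload_hour in FILES.items():
            if data_interval_end.hour == upload_hour:
                path = template.format(ymd=data_interval_end.strftime("%Y%m%d"))
                print(f"would download {path}")  # replace with the real FTP fetch

    with DAG(
        dag_id="ftp_downloads",
        start_date=datetime(2023, 1, 1),
        schedule_interval="0 11,13,21 * * *",  # the Option 1 cron from above
        catchup=False,
    ) as dag:
        PythonOperator(task_id="download_due_files",
                       python_callable=download_due_files)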

How to Build Dynamic Queues using Apache Airflow?

I have just started to explore Apache Airflow.
Is there any way to run a job that will look at the running DAGs and move tasks from those DAGs into a newly created DAG?
For example: DAG A has four tasks, and the 4th one has been waiting 7 hours to start. The goal is to create a new DAG and move that task to it automatically.
Scenario: We have around 40 VMs, and each job's runtime varies by instance. For example, Task A may take 2 hours today but might take 12 hours tomorrow in the same DAG. What I need is to move a task to another DAG, so it runs on another VM immediately, whenever its waiting time exceeds a certain threshold.
The main benefit is to keep every task's waiting time as low as possible by building dynamic DAGs.

How to create a wait job in Informatica

My requirement is to create a job in Informatica which will run every 15 minutes and look at a status column in the abc table. If the status is "Approved", it will exit and kick off the rest of the jobs.
If the status is not approved, it will do nothing and run again after 15 minutes. This process will continue until we have an approved status.
So, no matter which of the above two scenarios occurs, this process will run every 15 minutes.
I have implemented the same requirement in Unix using loops and conditional statements, but I am not sure how this can be achieved in Informatica. Could you please help me with this?
Regards,
Karthik
I would try adding a scheduler that runs every 15 minutes. The best way that I've found to "loop" sessions in Informatica is:
run the session once, and check whether it failed using conditional links;
if it did fail, run a Timer task for some amount of time (a minute, an hour, whatever);
then try to run the same session again by copying and pasting it ahead of the Timer task, repeating as many times as necessary.
So if you added a scheduler into the mix, you could set it to run the workflow every 15 minutes and have the Timer tasks halt the workflow for 4 or 5 minutes each. Then you could use the SESSSTARTTIME variable in a pre/post-session task to determine when the scheduler will fire again, and simply abort the workflow before that time.

TFS2010 Team build - waiting for an "InvokeProcess" step to complete

I am performing a database restore as part of our TFS 2010 Team Build. Since a number of databases are being restored, I am using a batch file, which is invoked via the InvokeProcess activity.
I have a number of issues that I am uncertain about:
1. Does TFS wait for all the commands in the batch file to complete, or does it move to the next activity as soon as it kicks off the InvokeProcess?
2. Is there a way to have the build process wait for successful completion of the batch command?
I am using it as follows:
The FileName property of InvokeProcess is set to "c:\windows\system32\cmd.exe".
The Arguments property has the full path of my batch file.
Yes, InvokeProcess will wait for the external command to finish.
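One detail worth checking (an assumption on my part, not from the original post): when FileName is cmd.exe, the Arguments value usually needs the /c switch in front of the batch path, e.g. /c "C:\Builds\RestoreDatabases.bat" (hypothetical path), so that cmd runs the script and then exits. Ending the batch with exit /b %ERRORLEVEL% lets a failed restore surface through the activity's Result property, which you can then test in the build workflow.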
