We have a monthly process where we receive input files from the source systems and create an extract for an external system. The problem is that we don't know on which day of the month we will receive the input files. The process has three jobs:
Job A: reads data from the input files and loads the staging environment.
Job B: loads data from staging into the extract tables. This job should load the extract tables ONLY when a token file ("Stage ready file") is present, which signifies that the data in the staging environment is "clean".
Job C: copies data from the extract tables into the external system. It needs to run only when there is data to be copied in the extract tables.
So the sequence would be:
Job A --> Job B --> Job C
Please note that no scripting is allowed in Job C.
This is how I have planned it:
Job A would run every day looking for the source files and would load the staging environment. Job B would have a TIDAL file dependency to kick off only when the token file ("Stage ready file") is available. Job C would have a TIDAL dependency on Job B completing normally.
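To make the plan concrete, the daily logic of Job A can be sketched roughly as below. Everything here is illustrative: the folder names, file pattern, and token name are invented placeholders (mktemp sandboxes so the sketch actually runs), and the `cp` stands in for the real staging load.

```shell
#!/bin/sh
# Sketch of Job A's daily run: look for the month's source files; if present,
# "load" staging and drop the token file that Job B's file dependency watches.
INBOX=$(mktemp -d)    # stands in for the source-system drop folder
STAGE=$(mktemp -d)    # stands in for the staging environment
TOKEN="$STAGE/stage_ready.tok"

touch "$INBOX/source_20xx.dat"   # simulate the monthly file arriving today

if ls "$INBOX"/source_*.dat >/dev/null 2>&1; then
    cp "$INBOX"/source_*.dat "$STAGE"/   # placeholder for the real staging load
    touch "$TOKEN"                       # "Stage ready file" for Job B's dependency
    echo "staging loaded, token created at $TOKEN"
else
    echo "no source files yet; exit normally so the job reruns tomorrow"
fi
```

In the real job the token would only be written after the staging data is verified "clean", not unconditionally as in this sketch.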
1. The Stage ready file would be available only once a month, but all the jobs would be kicked off daily. Thus, on Day 1 Job B would sit in "Waiting on dependencies" status looking for the "Stage ready file" for a time window; beyond this window the job would time out. Do we have some pseudo script/sample indicative commands that can update the job's TIDAL status from "Timed Out" to "Completed Normally"?
2. Do you have any other ideas to implement this scenario?
Define a job action that is just a call to the tesmcmd job (see example above) to set the status. Then define a job event with the trigger "Job completed abnormally", or whatever condition suits; for example, we use "Job rerun would exceed max reruns" for some of ours. Just pick from the drop-down in the Job Event definition window. Select the "set status" action you just defined from the list of associated actions, and then associate the job.
All done. When the condition is met, i.e. the job times out or hits its rerun limit, the event triggers the action, which calls the job to set the status. You need to feed it the job ID as a variable so it sets the status of the correct job, of course.
The way I approached this scenario is very similar to the one you describe above. Although most of our processes are on a set schedule, we do have a few that run once the files show up.
We have file events set up to run 24/7 watching for certain files in certain folders on our SFTP and when the files show up, they're automatically moved to a separate file staging folder on the SFTP and trigger an email notification.
We schedule some of our processes to run daily, but to have a 'File Dependency' where they will only run once the correct file shows up in the file staging folder on the SFTP.
Once that file shows up, the process will start automatically (download files, load files to stage, load files to data store, and in some cases extract data from our data store and upload back to the SFTP).
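Stripped of the Tidal specifics, the file-event behaviour described above boils down to a watch-and-move step. A minimal sketch, with invented folder names (mktemp sandboxes) and an echo standing in for the email notification Tidal would send:

```shell
#!/bin/sh
# Sketch of what the file event does: when a matching file appears in the
# SFTP landing folder, move it to the file-staging folder and notify.
LANDING=$(mktemp -d)    # stands in for the SFTP landing folder
FILESTAGE=$(mktemp -d)  # stands in for the file-staging folder

touch "$LANDING/extract_input.csv"   # simulate the file showing up

for f in "$LANDING"/*.csv; do
    [ -e "$f" ] || continue
    mv "$f" "$FILESTAGE"/
    # the real setup sends an email here; echo stands in for it
    echo "moved $(basename "$f") to file staging; notification sent"
done
```

The jobs with the 'File Dependency' then watch `$FILESTAGE`, so nothing downstream starts until the file has landed and been moved.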
If you're 100% sure that your process will run once a month and you don't want to worry about timeout issues, I would schedule your process to run on the first day of every month and, in each individual job, uncheck the box for 'Disable carryover' on the 'Options' tab of the Job Definition.
This will cause your job to carry over every day from the first day of the month until it completes successfully, and it will only re-run on the first of the next month.
There are a number of ways to handle this scenario. There are some great suggestions above, and you would have to decide which method works best for your scheduling operations.
Some alternatives I can think of as well:
To make it event-driven, you could just use a file event to insert the job group ad hoc, with A + B + C in it, when the file shows up.
You could keep what you have but change the timeout handling to an event + action that sets the status to "Skipped" instead of "Timed Out" (Skipped could be considered an "OK" status) when "job not finished by the end of its time window" fires.
You may need additional trap jobs to make sure you get at least one file a month...
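The trap-job idea in the last line could be as simple as an end-of-month file test. A sketch, with invented paths (mktemp sandbox) and the token's arrival simulated so the happy path runs; in the real job, a non-OK result would drive a Tidal event/action that alerts the support team:

```shell
#!/bin/sh
# Sketch of a monthly "trap" check: did the "Stage ready file" ever arrive?
STAGE=$(mktemp -d)             # stands in for the staging area
TOKEN="$STAGE/stage_ready.tok"

touch "$TOKEN"                 # simulate the token having arrived this month;
                               # remove this line to exercise the alert path

if [ -f "$TOKEN" ]; then
    STATUS=ok                  # file arrived at some point; nothing to do
else
    STATUS=alert               # no file all month; fail the job so an
                               # event/action can raise the alarm
fi
echo "monthly file check: $STATUS"
```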