I couldn't understand the usage of the cmi.exit data model element.
Scenario A: Suppose an activity sets its cmi.success_status = passed and cmi.exit = suspend and then triggers an Exit All navigation request. In this case, what is the value of the Objective Satisfied Status of the activity's primary objective in the next sequencing session?
Scenario B: Suppose an activity sets its cmi.success_status = passed and cmi.exit = normal and then triggers an Exit All navigation request. In this case, what is the value of the Objective Satisfied Status of the activity's primary objective in the next sequencing session?
In both scenarios, the next sequencing session will start with fresh data because of the "Exit All" navigation request. If a SCO sets the value of "cmi.exit" to "suspend", that SCO is suspended while the sequencing session is in progress. If you want to retain all data for the whole course, you should use "Suspend All" instead of "Exit All". Notice that "cmi.exit" is related only to the SCO which triggers it; it does not affect other SCOs in the course.
I am working on "Buy/Sell Item" functionality for an RPG game. If I am correct, the following actions need to happen in a single transaction:
1. Get the player from the "Players" table based on playerID
2. Check if the player has enough gold; if so, create a new item object
3. Write the new item object to the "Items" table, where the owner is based on playerID
4. Update the "Players" table row where the owner is playerID, deducting gold based on the item cost
If the player's gold changes while the transaction is running, or if steps 3 and 4 do not execute atomically, the transaction should fail.
I've read the docs and can see that DynamoDB has TransactWriteItems and TransactGetItems which seem to be perfect for this, but are separate. Can I somehow use them in a single transaction?
Once you have your playerID, you should be able to merge step 4 into step 2 and then do steps 2-3 within one TransactWriteItems:
Update the value of the gold conditionally: if enough gold is present, deduct the item cost immediately; otherwise the condition fails, and so does the TransactWriteItems.
If you made it here, there was enough gold, and you can create and write the new object to the Items table.
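As a minimal sketch of those two steps (table, key, and attribute names here are assumptions based on the question, not verified against your schema), the conditional gold update and the item write can be built as a single TransactWriteItems request:

```python
def build_buy_item_transaction(player_id: str, item_id: str, cost: int) -> list:
    """Build the TransactItems list for a conditional buy:
    deduct gold only if the player can afford it, and create the item
    in the same all-or-nothing transaction."""
    return [
        {
            # Steps 2 and 4 merged: deduct gold only if enough is present.
            "Update": {
                "TableName": "Players",
                "Key": {"playerID": {"S": player_id}},
                "UpdateExpression": "SET gold = gold - :cost",
                "ConditionExpression": "gold >= :cost",
                "ExpressionAttributeValues": {":cost": {"N": str(cost)}},
            }
        },
        {
            # Step 3: write the new item; only committed if the update succeeds.
            "Put": {
                "TableName": "Items",
                "Item": {
                    "itemID": {"S": item_id},
                    "owner": {"S": player_id},
                },
            }
        },
    ]

# Usage (requires boto3 and AWS credentials):
# boto3.client("dynamodb").transact_write_items(
#     TransactItems=build_buy_item_transaction("player-42", "sword-1", 50)
# )
```

If the ConditionExpression on the first action fails (not enough gold), DynamoDB cancels the whole transaction, so the Put never takes effect.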
Edit for explanation as to why this works:
TransactWriteItems groups the actions "update gold" and "put item" into a single all-or-nothing operation: either they both succeed or they both fail. And if another operation interferes, they also both fail (no race condition).
For example:
If there is not enough gold, the update fails, so the add item also fails.
If adding the item fails, the changes made to the gold will not be kept (the update will "fail")
If another operation is modifying the gold amount, both fail.
Essentially, it is impossible to either:
subtract the gold, but not add the item
add the item but not subtract the gold
Because they are both within the TransactWriteItems operation, either both happen, or neither does.
You can find an example in Java here. (For your use case, you can simply drop the customer validation and change "product status update" to "player gold update" and "add order" to "add item".)
For more details, see TransactWriteItems documentation here.
I am reading a queue and using an Action stage to "Get Item Data" from the "Work Queue" business object. The purpose of my process is to prepare a report on the status of the queue items. The "Get Item Data" action expects one input, the queue item ID. A number of output items are produced, such as Key, Status, Completed DateTime, Exception DateTime, etc.
I generated Data Items for all of the outputs of the "Get Item Data" Action stage. I then created a loop to go over all the queue records, populate the generated data items, and then use the information in the data items to capture the details for my reporting.
The issue that I am having is that when the loop goes to the next item in the queue, it does not entirely reset the data items. For example, if the first record in the queue was in completed status, the "Completed DateTime" data item is populated with that date and time. If the next record in the queue is an exception, it populates the "Exception DateTime" data item, which is good, but it doesn't overwrite the "Completed DateTime" data item with a blank value. It keeps the date from the previous record.
In my process, I check "Completed DateTime" and "Exception DateTime" in order to determine the status of the record and update my report. The solution I thought of is to add a Calculation stage to reset the data items, but I can't seem to reset a DateTime data item; it does not like the empty quotes "". Any suggestions would be greatly appreciated!
FYI, one of the output items is called "Status", but it is not populated with any information. Otherwise, this would have been very easy.
Disclaimer: This may not be the ideal solution, but it'll work!
Use a Calculation stage at the end of the loop, but since you cannot set a DateTime data item to 'empty', how about setting them to an odd date? E.g. 01-01-4000 00:00:00.
After you finish your initial loop to populate the report (I assume something similar to Excel), create another loop over your report and replace all of the odd dates with empty cells. Alternatively, write a macro to get rid of them all at once without the need to loop.
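The second pass can be sketched as follows (a minimal Python illustration of the idea, assuming the report rows are simple lists of strings; the sentinel date value is the one suggested above):

```python
SENTINEL = "01-01-4000 00:00:00"  # the "odd date" used to reset DateTime items

def clear_sentinel_dates(rows):
    """Replace the sentinel placeholder date with an empty cell,
    mirroring the cleanup pass over the finished report."""
    return [
        ["" if cell == SENTINEL else cell for cell in row]
        for row in rows
    ]

report = [
    ["item-1", "Completed", "2023-05-01 10:00:00", SENTINEL],
    ["item-2", "Exception", SENTINEL, "2023-05-02 11:30:00"],
]
cleaned = clear_sentinel_dates(report)
```

The same replace-the-sentinel logic works whether you loop in the process itself or in a macro over the finished spreadsheet.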
The best solution of course would be to properly populate the Status column in your queue, but this requires access to the code and permission to alter it (and time to do so).
Is it possible to dynamically create Control-M jobs?
Here's what I want to do:
I want to create two jobs. First one I call a discovery job, the second one I call a template job.
The discovery job runs against some database and comes back with an array of parameters. I then want to start the template job for each element in the returned array passing in that element as a parameter. So if the discovery job returned [a1,a2,a3] I want to start the template job 3 times, first one with parameter a1, second with parameter a2 and third one with parameter a3.
Only when each of the template jobs finish successfully should the discovery job show as completed successfully. If one of the template job instances fails I should be able to manually retry that one instance and when it succeeds the Discovery job should become successful.
Is this possible? And if so, how should it be done?
This is possible using the various components of Control-M.
The originating job will have an On/Do tab; this can perform subsequent actions based on the output of the first job. This can be set to work in various ways, but it basically works on the principle of "do x if y happens". The 'y' can be job status (ok or not), exit code (0 or not), or a text string in the standard output (e.g. "system wants you to run 3 more jobs"). The 'x' can be a whole list of things too: demand in a job, add a specific condition, set variables.
You should check out the AutoEdit variables (I think they've been renamed in the latest versions); these are your user-defined variables (use the ctmvar utility to define or alter them). The variables can be defined for a specific job only or across your whole system.
If you don't get the degree of control you want then the next step would be to use the ctmcreate utility - this allows full on-the-fly job definition.
You can do it, and the way I found that worked was to loop through a create script which plugs in your variable name from the look-up. You can do the same for the job name by using a counter to generate names such as adhoc0001, adhoc0002, etc. What I have done is create the n ad-hoc jobs required by the query, order them into a new group, and then, once the group is complete, send the downstream conditions on. If one fails, you can re-run it as normal. I use ctmcreate with -input_file, which works a treat.
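The generation loop above can be sketched like this (a hypothetical illustration: the job-name pattern and group name follow the answer, but the definition-file naming and everything beyond the -input_file flag are assumptions, not verified Control-M syntax):

```python
def build_adhoc_jobs(parameters, group="ADHOC_GROUP"):
    """For each discovered parameter, generate a counter-based job name
    like adhoc0001 and the ctmcreate command that would order it into a
    common group via a per-job definition file."""
    jobs = []
    for i, param in enumerate(parameters, start=1):
        job_name = f"adhoc{i:04d}"
        # Each generated definition file would carry the discovered
        # parameter as a job variable and place the job in the group.
        def_file = f"{job_name}.def"
        jobs.append({
            "name": job_name,
            "group": group,
            "param": param,
            "command": f"ctmcreate -input_file {def_file}",
        })
    return jobs

# Discovery returned [a1, a2, a3] -> three ad-hoc jobs in one group
for job in build_adhoc_jobs(["a1", "a2", "a3"]):
    print(job["name"], job["param"], job["command"])
```

Grouping the generated jobs means the downstream condition fires only when the whole group completes, and a single failed instance can be re-run on its own.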
Suppose that the Activity Tree has two activities (Activity 1 and Activity 2). Activity 1's cmi.exit is set to an empty character string by default, while Activity 2 sets cmi.exit to "suspend" and adl.nav.request to "exitAll", and then Activity 2 calls Terminate(""). I want to know whether the current information in the Run-Time Environment data model of Activity 2 is accessible in the next sequencing session, or whether this data is discarded.
"exitAll" terminates the sequencing session without saving any data at all, so the next sequencing session will start with fresh data. To be able to retrieve data in the next sequencing session, you need to call "suspendAll" instead. When "cmi.exit" is set to "suspend", the current state of the suspended activity is saved and remains available during the same sequencing session, but it will not be available in the next sequencing session unless you call "suspendAll".
I'm creating a SharePoint 2010 application called HR Learning And Development. Basically, it's an application to manage all employee trainings, much like a college environment.
I have two SharePoint lists named Training and Training Session. One training can have multiple sessions.
The Training Session list has Trainer, Start-date and Status columns, among other things.
An employee who wants to enroll in a training has to subscribe to it.
The Training Session list should behave as follows:
Status = New -> when a training session has just been created
Status = In Progress -> when Start-date <= DateTime.Now (and the session has not yet ended)
Status = Completed -> when End-date <= DateTime.Now
Could you guys help me work out how to solve this?
Your help is very much appreciated.
Thanks
I would create a content type for the training session documents. Then create a workflow either for that list or for that content type, whichever you prefer. The workflow can be set to execute on both the creation and modification of a document. You can then implement the time logic you specified within the workflow and update the status accordingly.
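The time logic inside such a workflow can be sketched as follows (a minimal Python illustration only, assuming "In Progress" means the start date has passed but the end date has not; the real implementation would live in the SharePoint workflow):

```python
from datetime import datetime

def session_status(start_date: datetime, end_date: datetime,
                   now: datetime) -> str:
    """Derive a training session's status from its dates:
    Completed once the end date has passed, In Progress once the
    start date has passed, otherwise New."""
    if end_date <= now:
        return "Completed"
    if start_date <= now:
        return "In Progress"
    return "New"

# Example: a session running from Jan 10 to Jan 12, checked on Jan 11
status = session_status(datetime(2023, 1, 10), datetime(2023, 1, 12),
                        now=datetime(2023, 1, 11))
```

Running the workflow on both create and modify events keeps the Status column in step with these rules as time moves past each boundary.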