I have a big project coming up where, among other things, I'm connecting an authoritative user and group store in SQL to AD. I have implemented and understand the multivalued-attribute SQL tables, but I have not done much with SQL delta tables. If I implement delta tables and then clear them after a successful import, there is a short window between the agent run and the clearing of the delta table: any rows written in that window will be erased before they can be processed. How do people normally deal with this? Can you do anything with processed flags or timestamps, perhaps?
- I run the following (pre-processing) SQL on a cycle:
if not exists (select 1 from <myDeltaTable> where processingStatus = 'processing')
update <myDeltaTable>
set processingStatus = 'processing'
where processingStatus is null
(Note the "is null" test: a "= Null" comparison never matches in SQL.)
- I run my ILM delta import using a delta view which is effectively "select * from <myDeltaTable> where processingStatus = 'processing'". Note that when I am using our Event Broker service for ILM, I can run this on, say, a 10-second interval, and only continue to this step if my existence test above returns true. If you are not using this service, you have to run your delta import on whatever schedule you've decided to employ, but you need something like the MIIS Toolkit's Scheduler tool to run your pre-processing step as a DOS batch file, VB script, or the like.
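The claim step above is what closes the race window from the original question: rows that arrive after the claim keep a null status and simply wait for the next cycle. A minimal sketch with Python's sqlite3 standing in for the SQL store (table and column names here are illustrative, not from ILM or any product):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""create table deltaTable (
    id integer primary key,
    accountName text,
    processIndicator text,   -- D/I/U for delete/insert/update
    processingStatus text)   -- NULL -> 'processing' -> 'processed'
""")
con.execute("insert into deltaTable (accountName, processIndicator) values ('jbloggs', 'I')")

# Pre-processing: claim only the rows that exist at this instant.
# Note 'is null' -- an '= null' comparison never matches in SQL.
con.execute("update deltaTable set processingStatus = 'processing'"
            " where processingStatus is null")

# A row arriving AFTER the claim keeps a NULL status ...
con.execute("insert into deltaTable (accountName, processIndicator) values ('asmith', 'U')")

# ... so the delta view sees only the claimed batch.
batch = con.execute("select accountName from deltaTable"
                    " where processingStatus = 'processing'").fetchall()
print(batch)   # [('jbloggs',)] -- 'asmith' waits for the next cycle
```

Nothing is ever deleted at this point, so the late-arriving row cannot be lost.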
- I then only run the following ‘post processing’ step if the ILM run profile returns a success (or equivalent) status:
update <myDeltaTable>
set processingStatus = 'processed'
where processingStatus = 'processing'
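Gating this step on a successful run status is the second half of the safety net: if the import fails, the batch stays at 'processing' and the same delta view re-presents it next time. A small sqlite3 sketch of that behaviour (schema and the run_succeeded flag are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table deltaTable (id integer primary key, processingStatus text)")
con.executemany("insert into deltaTable (processingStatus) values (?)",
                [("processing",), ("processing",), (None,)])

def post_process(con, run_succeeded):
    # Promote the claimed batch only when the ILM run profile reported
    # success; on failure the rows stay at 'processing' and will be
    # picked up again by the same delta view.
    if run_succeeded:
        con.execute("update deltaTable set processingStatus = 'processed'"
                    " where processingStatus = 'processing'")

post_process(con, run_succeeded=False)   # failed run: nothing promoted
in_flight = con.execute("select count(*) from deltaTable"
                        " where processingStatus = 'processing'").fetchone()[0]
print(in_flight)   # 2 -- the batch still awaits a successful run

post_process(con, run_succeeded=True)    # now the batch is promoted
```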
- So that I can prevent performance degradation over time, I often archive the processed rows into a <myHistoryTable> as an extra step, e.g.
insert into <myHistoryTable>
select * from <myDeltaTable>
where processingStatus = 'processed'

delete from <myDeltaTable>
where processingStatus = 'processed'
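One refinement worth making to the archive step: wrap the copy and the delete in a single transaction, so a failure between the two statements can neither lose nor duplicate rows. A sketch with sqlite3 (table names are illustrative):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table deltaTable (id integer primary key, accountName text, processingStatus text)")
con.execute("create table historyTable (id integer primary key, accountName text, processingStatus text)")
con.executemany("insert into deltaTable (accountName, processingStatus) values (?, ?)",
                [("jbloggs", "processed"), ("asmith", "processing")])

# Archive: copy-then-delete inside one transaction ("with con" commits
# on success and rolls back on error), so the two statements succeed
# or fail together.
with con:
    con.execute("insert into historyTable select * from deltaTable"
                " where processingStatus = 'processed'")
    con.execute("delete from deltaTable where processingStatus = 'processed'")

print(con.execute("select count(*) from deltaTable").fetchone()[0])    # 1 -- only the in-flight row remains
print(con.execute("select count(*) from historyTable").fetchone()[0])  # 1 -- the archived row
```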
With the above, my delta table generally has a "processIndicator" attribute with a value of D/I/U for delete/insert/update … although the above instructions are not about how to construct a delta table, but about how to use one to construct a delta view. In other words, I am adopting a "tri-state" approach with my "processingStatus" attribute, and it is this same attribute which is the key to running your ILM SQL MA delta imports on demand rather than on a schedule … here's how:
- I use the Event Broker "SQL Changes" plugin to detect the presence of records with a null processingStatus, and at this point update the processingStatus to "processing" (using a pre-processing SQL script or stored procedure). As with all Event Broker change-detection plugins, the plugin returns TRUE if there is something to do, and FALSE otherwise.
- When the plugin returns TRUE (which will happen immediately after some delta records have been updated to a status of 'processing', and will continue to happen until those 'processing' records are updated to 'processed'), only then do I initiate what is termed an Event Broker "outgoing OpList". The first of the standard two steps for this OpList is my ILM delta import/delta sync. This in turn can set up pending exports in any number of other MAs, and Event Broker is designed to fire the corresponding export run profiles as and when these pending exports are set up. This leads to more pending (confirming) imports … you get the idea.
- The second of the outgoing OpList steps is my "post-processing" step, which updates the 'processing' records to a processingStatus of 'processed'; I generally follow this up in the same SQL script with the archiving of the processed records, to keep the process from degrading in performance over time.
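The whole Event Broker loop above reduces to a poll-claim-run pattern. Here is a sketch in Python with sqlite3 standing in for both the "SQL Changes" check and the two-step OpList; the function names and the simulated import are my own for illustration, not Event Broker's actual API:

```python
import sqlite3
import time

def sql_changes_check(con):
    # In the spirit of the "SQL Changes" plugin: claim any NULL-status
    # rows, then report TRUE if there is a claimed batch to process.
    con.execute("update deltaTable set processingStatus = 'processing'"
                " where processingStatus is null")
    have_work = con.execute("select exists(select 1 from deltaTable"
                            " where processingStatus = 'processing')").fetchone()[0]
    return bool(have_work)

def run_oplist(con):
    # OpList step 1 would be the ILM delta import/delta sync; here we
    # simulate a successful run. Step 2 is the post-processing update.
    con.execute("update deltaTable set processingStatus = 'processed'"
                " where processingStatus = 'processing'")

con = sqlite3.connect(":memory:")
con.execute("create table deltaTable (id integer primary key, processingStatus text)")
con.execute("insert into deltaTable (processingStatus) values (null)")

# Poll on a short interval (10 seconds in the article; shortened here).
for _ in range(3):
    if sql_changes_check(con):
        run_oplist(con)
    time.sleep(0.01)
```

After the first iteration the claimed row is promoted to 'processed', and subsequent polls return FALSE until new rows arrive.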
There you have it. I hope to follow this up soon with an installment on how to manage other ILM MAs using corresponding Event Broker plugins, such as the AD/ADLDS Changes plugin and File Changes plugin.