Operating ILM in (near) Real Time for SQL Data Sources

One of the MMSUG members posted a question tonight on the topic of managing deltas from a SQL source, and I thought it worth blogging about, since it is a topic close to my heart and part of solving the puzzle of how to make your ILM solution "event aware" and operate it in (near) real time.
 
Steve asked the following question:
 

I have a big project coming where I'm connecting, amongst other things, an authoritative user and group store on SQL into AD. I have implemented and understand the multivalued attribute SQL tables, but have not done much with the SQL delta tables. If I implement SQL delta tables and then clear the tables after a successful import, there is a short time frame between the running of the agent and the clearing of the delta table. If data gets put in there between them, it will get erased before it can be processed. How do people normally deal with this? Can you do anything with processed flags or timestamps, perhaps?

In the process of responding to the above, I realized that Steve's question was exactly the way to introduce the idea of how to use Event Broker with a SQL delta source.  However, before I do, let me explain my approach to handling delta views/tables with a SQL MA.  Here's my basic approach (an extension of the "Generating Delta Views Using Triggers" approach documented on TechNet):
  1. I run the following (pre-processing) SQL on a cycle:
    if not exists (select 1 from <myDeltaTable> where processingStatus = 'processing')
    begin
      update <myDeltaTable>
      set processingStatus = 'processing'
      where processingStatus is null
    end
  2. I run my ILM delta import using a delta view which is effectively "select * from <myDeltaTable> where processingStatus = 'processing'".  Note that when I am using our Event Broker service for ILM, I can run this on, say, a 10-second interval, and only continue to this step if my existence test above returns true.  If you are not using this service, you have to run your delta import on whatever schedule you've decided to employ, but you need to use something like the MIIS Toolkit's Scheduler tool to run your pre-processing step as a DOS batch file, VB script, or suchlike.
  3. I then only run the following "post-processing" step if the ILM run profile returns a success (or equivalent) status:
      update <myDeltaTable>
      set processingStatus = 'processed'
      where processingStatus = 'processing'
  4. To prevent performance degradation over time, I often archive processed rows into a <myHistoryTable> as an extra step, e.g.
    insert into <myHistoryTable>
    select * from <myDeltaTable>
    where processingStatus = 'processed'
    go
    delete from <myDeltaTable>
    where processingStatus = 'processed'

With the above, my delta table generally has a "processIndicator" attribute with a value of D/I/U for delete/insert/update.  Note that the above instructions are not how to construct a delta table, but how to use one to construct a delta view.  In other words, I am adopting a "tri-state" approach with my "processingStatus" attribute, and it is this same attribute which is the key to running your ILM SQL MA delta imports on demand rather than on a schedule.  Here's how:
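For completeness, the delta table itself is populated by triggers on the source table, per the TechNet approach referenced above. This is a hedged Python/sqlite3 sketch of that idea (the real thing would be T-SQL triggers on SQL Server); the person/person_delta names are hypothetical. One trigger per DML verb writes a delta row carrying the D/I/U processIndicator, with processingStatus starting out null:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    create table person (id integer primary key, displayName text);
    create table person_delta (id integer, processIndicator text, processingStatus text);

    -- one trigger per DML verb; processingStatus starts null ("unclaimed")
    create trigger person_ins after insert on person begin
        insert into person_delta values (new.id, 'I', null);
    end;
    create trigger person_upd after update on person begin
        insert into person_delta values (new.id, 'U', null);
    end;
    create trigger person_del after delete on person begin
        insert into person_delta values (old.id, 'D', null);
    end;
""")

con.execute("insert into person values (1, 'Steve')")
con.execute("update person set displayName = 'Steven' where id = 1")
con.execute("delete from person where id = 1")
print(con.execute("select id, processIndicator from person_delta").fetchall())
# [(1, 'I'), (1, 'U'), (1, 'D')]
```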

  1. I use the Event Broker "SQL Changes" plugin to detect the presence of records with a null processingStatus, and at this point update the processingStatus to "processing" (using a pre-processing SQL script/stored proc).  As with all Event Broker change detection plugins, the plugin returns TRUE if there is something to do, and FALSE otherwise.
  2. Only when the plugin returns TRUE (which will happen immediately after some delta records have been updated to a status of 'processing', and continue to happen until the 'processing' records are updated to 'processed') do I initiate what is termed an Event Broker "outgoing OpList".  The first of the standard two steps for this OpList is my ILM delta import/delta sync.  This in turn can set up pending exports in any number of other MAs, and Event Broker is designed to fire the corresponding export run profiles as and when these pending exports are set up.  This leads to more pending (confirming) imports … you get the idea.
  3. The second of the outgoing OpList steps is my "post-processing" step, which updates the "processing" records to a processingStatus of "processed"; I generally follow this up in the same SQL script by archiving the processed records, to keep the process from degrading in performance over time.
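The polling cycle above can be sketched as follows — again a hypothetical Python/sqlite3 simulation, not Event Broker's actual API. `sql_changes_check` stands in for the "SQL Changes" plugin, and `run_import` for the ILM delta import/delta sync run profile; note that post-processing only runs on success, so a failed run leaves rows at 'processing' to be retried on the next poll:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    create table delta_table (id integer, processIndicator text, processingStatus text);
    insert into delta_table values (1, 'I', null);
""")

def sql_changes_check(con):
    # Pre-process: claim any unclaimed rows (unless a batch is mid-flight),
    # then answer the plugin's question: "is there anything to do?"
    if con.execute("select 1 from delta_table "
                   "where processingStatus = 'processing'").fetchone() is None:
        con.execute("update delta_table set processingStatus = 'processing' "
                    "where processingStatus is null")
    return con.execute("select 1 from delta_table "
                       "where processingStatus = 'processing'").fetchone() is not None

def poll_once(con, run_import):
    if not sql_changes_check(con):       # plugin returned FALSE: nothing to do
        return False
    if run_import():                     # OpList step 1: delta import/delta sync
        # OpList step 2: post-process and archive, only on success.
        con.execute("update delta_table set processingStatus = 'processed' "
                    "where processingStatus = 'processing'")
        con.execute("delete from delta_table where processingStatus = 'processed'")
    return True

poll_once(con, run_import=lambda: True)
print(con.execute("select count(*) from delta_table").fetchone()[0])  # 0
```

Run on a short interval (say every 10 seconds), this gives the "on demand rather than on a schedule" behaviour: the OpList only fires when the check returns TRUE.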

There you have it.  I hope to follow this up soon with an installment on how to manage other ILM MAs using corresponding Event Broker plugins, such as the AD/ADLDS Changes plugin and the File Changes plugin.

 
 

About bobbradley1967

Microsoft Identity and Access Professional with 2 decades of successful IAM implementations in APAC, specialising in MIM and its predecessors (FIM/ILM/MIIS) and now with SoftwareIDM. A Microsoft IAM MVP prior to that with a background in MS.Net applications development/SI. Now with a particular interest how Identity and HyperSync Panel provide the Identity and Access orchestration presently missing in the Azure Entra Suite to effectively enforce Zero Trust on the M365 platform.
This entry was posted in Event Broker for FIM 2010, FIM (ForeFront Identity Manager) 2010. Bookmark the permalink.
