Managing Office 365 Licenses with #MIM2016 and #AzMan – Part 2

In my last post I introduced the concept of using Windows Authorization Manager (AzMan) to manage the automation of Office 365 licenses.  In this post I will go into detail on how the solution hangs together.

Complementing AAD Connect with MIM 2016

The main components of my solution are as follows:

  • AADConnect (OOTB identity bridge between AD and AAD for user/group/contact sync)
  • MIM 2016 (to complement AADConnect)
  • AzMan (SQL database storage option)
  • 2 MIM 2016 management agents (e.g. PowerShell)
    • User License Entitlements (sync source for user.assignedLicenses)
    • User License Assignment (sync target for user.assignedLicenses)

As many of you will know, it is technically possible to extend AADConnect with additional management agents and negate the need for MIM2016; however, this was not my recommended approach for several reasons, including the following:

  • AADConnect extensions will invariably require extended Microsoft support over and above the standard AADConnect support options;
  • MIM2016 synchronisation remains a cost-effective option for any enterprise IAM architecture on the Microsoft platform; and
  • The MIM synchronisation configuration for this solution requires no rules extensions (only direct attribute flows)

Aside from the AzMan approach itself, the benefits of using the MIM 2016 sync engine (or FIM 2010 R2) in lieu of the traditional scripted approach should not be surprising for any of us, and include the following:

  • State based approach – provide ongoing reconciliation between expected (entitlements) and actual (assignments), avoiding the need for report-based options such as this one;
  • Timeliness of changes – delta sync cycles can be used to process only the changes since the last sync cycle;
  • Maintainability – maintaining a centralised authoritative source of rules/roles governing license entitlements ensures ongoing ease of maintenance; and
  • Auditability – enforcing a centralised authoritative source of rules/roles governing license entitlements ensures license accountability and compliance.


Creating the AzMan Store

Setting up an AzMan instance itself is straightforward given this is a standard feature maintainable via a simple Windows MMC snap-in.

Of the 3 options for hosting my AzMan store, I chose SQL over XML and AD – a decision made easier by the fact that I could reuse a SQL alias I already had in place for my MIM Sync service.  For more information on the choice of store, see Managing Authorization Stores.

  • Click on the Authorization Manager node in the LHS tree view
  • From the Action menu, select Options…
  • Click on Developer mode
  • From the Action menu, select New Authorization Store…
  • Select the Microsoft SQL store type and Schema version 2.0, then paste the following string into the Store name field:
    mssql://Driver={SQL Server};Server=MIM01;/AzManStore/O365.Licensing

    … and in the Description enter:

    Authorization store defining QBE O365 licensing entitlements.
    * role definitions = SKUs and disabled plans
    * task definitions = plans
    * role assignments = AD group memberships required to fulfill user license entitlements
    For more details, including a full list of indexed service plans (tasks) see


  • Click OK
  • Check that the (initialized) Licensing store is correctly displayed as follows:
  • Select the O365.Licensing node (store) and select New Application… from the Action menu, then enter the following details:
    • Name = O365
    • Description = Office 365 Licensing
    • Version = 2
  • Click OK

Note: The AzMan store must remain in Developer mode from this point so that configuration can continue via the MMC snap-in UI.
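If you would rather script the store creation, the MMC steps above can be driven via the AzRoles COM API.  The following is a minimal sketch only, assuming the same SQL alias and store path used above (flag 1 = AZ_AZSTORE_FLAG_CREATE):

```powershell
# Create the AzMan store programmatically (mirrors the MMC steps above).
$AzStore = New-Object -ComObject AzRoles.AzAuthorizationStore
$AzStore.Initialize(1, "mssql://Driver={SQL Server};Server=MIM01;/AzManStore/O365.Licensing")
$AzStore.Description = "Authorization store defining QBE O365 licensing entitlements."
$AzStore.Submit()

# Create the O365 application within the store.
$app = $AzStore.CreateApplication("O365")
$app.Description = "Office 365 Licensing"
$app.ApplicationVersion = "2"
$app.Submit()
```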

Configuring the AzMan Store for Office 365 Licensing

Initially I set up my AzMan store data manually via the MMC snap-in.  You may choose to do the same, but I ultimately scripted this entirely in PowerShell to assist in the deployment phase, as well as to allow my customer to maintain the master data in the way they were most comfortable with (initially at least) – yes, in a Microsoft Excel spreadsheet.

Set up Task Definitions

In my AzMan model a “task” represents a service plan, or specifically a “disabled plan”.  When an SKU is assigned to a user, service plans that are to be excluded for that user are listed as disabled plans.  To get a list of all possible disabled plans I have used the latest Service Plan table from this TechNet article to come up with the following:

When entitlement details are extracted for FIM processing, both the Name and the Description (the “Seq” index) are read, but only when the task is associated with a role definition that has at least one disabled plan.  Plans must always be extracted/listed in index order.
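Loading the task definitions can likewise be scripted against the application object.  A sketch, assuming $qbeApp is the application object opened as in the connector script later in this post, and an illustrative hashtable of plan names to their “Seq” index (your master data would come from the spreadsheet):

```powershell
# Illustrative master data only: plan name -> "Seq" index (stored in Description).
$plans = @{ 'EXCHANGE_S_ENTERPRISE' = 1; 'MCOSTANDARD' = 2; 'SHAREPOINTENTERPRISE' = 3 }

foreach ($planName in $plans.Keys) {
    $task = $qbeApp.CreateTask($planName)
    $task.Description = [string]$plans[$planName]   # the index, as per the data model
    $task.Submit()
}
```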

Set up Role Definitions

For my model a “role” represents an SKU and zero or more disabled plans.  How these are set up will most likely be specific to your enterprise’s unique requirements, but the following is an example of how this might look.  Note that the SKU name appears in the Description property – this is an important part of the data model because this SKU name must be able to be extracted for each (arbitrary but unique) role name.

The role definition for “Office 365 Enterprise E3 – no Exchange nor Skype” is highlighted because this best illustrates the concept of optional disabled plans.  Opening the properties dialog you can see this is managed like so:

When entitlement details are extracted for FIM processing, the Description is all that is read from any group-assigned role, along with the Name and Description (Seq) of any associated disabled plans.
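A role definition can be created in the same scripted fashion – a sketch based on the highlighted example, where the SKU part number lives in the Description and the disabled plans are linked tasks (the SKU and plan names here are illustrative):

```powershell
# Role definition = SKU (Description) + zero or more disabled plans (linked tasks).
$roleDef = $qbeApp.CreateTask('Office 365 Enterprise E3 - no Exchange nor Skype')
$roleDef.IsRoleDefinition = 1
$roleDef.Description = 'ENTERPRISEPACK'      # SKU part number
$roleDef.AddTask('EXCHANGE_S_ENTERPRISE')    # disabled plan
$roleDef.AddTask('MCOSTANDARD')              # disabled plan
$roleDef.Submit()
```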

Set up Role Assignments

This is where the real power of the model comes into play.  By assigning one or many AD groups to the roles constructed above, a collection of groups can be built, each mapped to a string of the following form:

SubscribedSku.skuPartNumber : ServicePlanInfo.servicePlanName ; ServicePlanInfo.servicePlanName ; ...
Plan name : disabled plan name ; disabled plan name ; ...

The following is a dummy role assignment to illustrate the concept – but this screen would normally have at least one entry per defined role, and multiple AD groups assigned to each:
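Scripted, a role assignment might look like the sketch below.  The group and assignment names are hypothetical, and the CreateRoleAssignment/AddRoleDefinition calls assume the IAzApplication3 interface available from Server 2008 onwards.  For the example role above, the extract script later in this post would then render ENTERPRISEPACK:EXCHANGE_S_ENTERPRISE;MCOSTANDARD for each assigned group:

```powershell
# Assign one or more AD groups to a role definition.
$ra = $qbeApp.CreateRoleAssignment('E3 - no Exchange nor Skype')
$ra.AddRoleDefinition('Office 365 Enterprise E3 - no Exchange nor Skype')
$ra.AddMemberName('DOMAIN\SG-O365-E3-NoExoNoSfB')
$ra.Submit()
```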

Bringing it all together

The following PowerShell script takes whatever data you have in the above store and renders it in the following table format:

  • groupDN
    Single-valued string property, retrieved from AD using the group name read from the AzMan role assignment; this can then be joined to the user.memberOf DN
  • assignedLicenses
    Multi-valued string property, constructed from the role and task data linked to the role assignment – this is the data which can be directly synchronised to Azure AD via the Graph REST API.

Note that the script makes use of the “MIM” SQL alias I already have set up for my MIM SQL connection – allowing me to host the script on the MIM synchronisation host server.

# -- BEGIN PowerShell Connector Script

# Requires the ActiveDirectory module for Get-ADGroup.
Import-Module ActiveDirectory

# Create the AzAuthorizationStore COM object and initialize it read-only (flag 0).
# Note the "MIM" SQL alias already set up for the MIM SQL connection.
$AzStore = New-Object -ComObject AzRoles.AzAuthorizationStore
$AzStore.Initialize(0, "mssql://Driver={SQL Server};Server=MIM;/AzManStore/O365.Licensing")

# Open the application object in the store.
$qbeApp = $AzStore.OpenApplication("O365")

# HashTable of all group/assignedLicenses associations.  This acts as a
# lookup table to join on each user's "memberOf" property - thereby
# allowing for multiple licenses per user.
$licenseAssociations = @{}

# Loop through all role assignments and process each member group.
foreach ($assignment in $qbeApp.RoleAssignments) {
    foreach ($member in $assignment.MembersName) {
        # $member is in "Domain\Group" form - resolve the AD group DN to join on.
        $groupDN = (Get-ADGroup -Identity ($member.Split('\')[1])).DistinguishedName

        # Open the role behind the assignment.
        $role = $qbeApp.OpenRole($assignment.Name)

        foreach ($def in $role.RoleDefinitions) {
            # Collect the disabled plans (tasks), keyed on the integer
            # "Seq" index stored in each task's Description.
            $tasks = foreach ($task in $def.Tasks) {
                [PSCustomObject]@{
                    Name = $task
                    Seq  = [int]($qbeApp.Tasks |
                        Where-Object { $_.IsRoleDefinition -ne 1 -and $_.Name -eq $task }).Description
                }
            }

            # Order the disabled plans by index and join them with ';'.
            $tasksAsString = (@($tasks) | Sort-Object Seq | ForEach-Object { $_.Name }) -join ';'

            # Build "SKU:plan;plan;..." - the SKU alone when no plans are disabled.
            $assignedLicense = $def.Description
            if ($tasksAsString.Length -gt 0) {
                $assignedLicense += ":$tasksAsString"
            }

            # Accumulate the multi-valued assignedLicenses entry for this group.
            if (-not $licenseAssociations.ContainsKey($groupDN)) {
                $licenseAssociations[$groupDN] = @()
            }
            $licenseAssociations[$groupDN] += $assignedLicense
        }
    }
}

# Render the lookup table to the console: one row per group.
foreach ($groupDN in $licenseAssociations.Keys) {
    [PSCustomObject]@{
        groupDN          = $groupDN
        assignedLicenses = $licenseAssociations[$groupDN] -join '|'
    }
}

# -- END PowerShell Connector Script


Note that the script above outputs the groupDN and assignedLicenses properties to the console – for your own MIM connector implementation you will need to take this to the next step yourself, i.e. merging the group-keyed data with user-keyed data, joining on user.memberOf, and presenting a consolidated data set to MIM in the form of a User License Entitlements MA.

Important: The script now takes care to order tasks as “disabled plans” according to the (integer) index stored in the task description.  This ensures that the assignedLicenses values are reflected back in the order that they were sent (vital for a sync configuration with MIM).


While I won’t go into the specifics of the MIM side of the implementation – mainly because the preferences for this will vary from place to place – I trust that I have presented a repeatable approach that can get us away from the ongoing pitfalls of either a scripted approach, or any other sync approach which does not provide the same levels of extensibility and manageability.

In my case user volume levels meant that a MIM PowerShell MA was not going to suffice, and I needed to use UNIFY Identity Broker to provide the scalability and throughput to meet target SLAs, others may find that a PowerShell MA approach is adequate – at least for now.  If anyone is interested in exactly how I took this idea to the “next level” in this way, I would be more than happy to elaborate – but for now I felt that it was important to share the above approach as a superior way forward to Automatically Assign Licenses to Your Office 365 Users.


Some of you will be aware that AzMan has been marked as deprecated as of Server 2012 R2, which means that this will be the last version of the OS which will incorporate this feature.  In case you’re wanting more info on this, read this MS blog post.

However, rest assured I took this into account when investing energy into the above solution – noting that it will be “… well into 2023 before we see the last of AzMan” and by then I am expecting there to be an even better way of providing role/rule-based license allocation for Office 365.  Regardless, I needed something which:

  • Supported the O365 licensing data structure I was modelling;
  • Had a “free” UI which allowed me to link to AD groups; and
  • Had an API which I could use to firstly read the data the way I needed to, as well as load the data from a master source.

… and this definitely ticked all the boxes for me – noting that I wasn’t actually using the main feature AzMan was designed for, namely its access checking API for buttons/panels on forms.  If I can get 6 years of use out of this between now and 2023 then great :).

Posted in FIM (ForeFront Identity Manager) 2010, MIM (Microsoft Identity Manager) 2016, Windows AzMan | 9 Comments

Managing Office 365 Licenses with #MIM2016 and #AzMan – Part 1

One of the things we Microsoft FIM/MIM folks find ourselves doing of late is finding ways of automating Office 365 license assignment for our “hybrid” (AD+AAD) customers, initially as part of provisioning the initial Exchange Online mailbox, which requires an initial Enterprise license.

Microsoft’s AADConnect “identity bridge” solution superseded DirSync, and more recently AADSync, as the preferred solution for hybrid enterprises, bringing with it support for the many multi-forest/single-tenant scenarios which were once the domain of FIM and the Windows Azure Active Directory (WAAD) connector.  There was hope that with this would come an automation option for license management, but to date no answer has been forthcoming.  Consequently most have adopted a PowerShell-scripted approach utilising the Office 365 PowerShell API, assigning licenses in a post-AAD-sync step, and perhaps even automating this based on on-premises AD group membership.

Until now a certain client’s FIM 2010 solution has performed just this function.  While it performs its role admirably, it presently only goes as far as assigning the initial E3 license applicable for an active user and revoking all licenses on termination; it does not extend to managing the licenses in between those two events.  While there is some extensibility in the model, configuration has been a little messy – requiring ongoing maintenance of the relationship between groups and their associated O365 SKU (and optional “disabled plans”).  The FIM 2010 configuration consisted of:

  • The standard WAAD and AD connectors that formed the original DirSync configuration, but with a custom Metaverse design;
  • An LDIF MA to import user.memberOf (not an available attribute for the OOTB AD MA) for each O365 license group for which an entitled user is a member; and
  • A PowerShell MA to set the “isLicensed” property to true/false for active/inactive users respectively, and at the same time assign the correct SKU/disabled plan combination applicable for newly provisioned AAD user accounts.

While the automated management of the group membership in this particular instance was already taken care of, I needed to find an alternative approach that was not only more extensible, but also

  • more flexible
  • more granular in its entitlement management, and
  • more administrator-friendly.

While an Office Premium feature (still in preview at the time of writing) goes some of the way to achieving this through the use of role groups, it was not able to tick all the boxes the customer was looking for.  Furthermore, the FIM 2010 configuration which originally replaced DirSync now needed to be swapped out with AADConnect, and in order to keep the AADConnect design as close to OOTB as possible, I needed to relocate the license management to another MIM instance which was performing a separate identity synchronisation role for the same AD user base.  This is where I thought of that rather unloved component of the Windows Server OS – Microsoft Authorization Manager, or AzMan.

What is AzMan and how can it help?

Microsoft Authorization Manager (AzMan) is a component of the Windows operating system which allows a consistent model to be used for driving application access.

“AzMan is a role-based access control (RBAC) framework that provides an administrative tool to manage authorization policy and a runtime that allows applications to perform access checks against that policy. The AzMan administration tool (AzMan.msc) is supplied as a Microsoft Management Console (MMC) snap-in.”

As a result of recent enhancements to the framework and its API, this lends itself quite nicely to modelling roles and groups for O365 licensing and license assignment.

The AzMan model and UI (MMC snap-in) allows groups of users (either LDAP or built-in groups) to be mapped to roles, which in turn are made up of sub-roles, tasks, sub-tasks and operations.  With the AD license groups already taken care of, this provides an easy way of matching existing groups to roles.  Now all that had to be done was to

  1. design a suitable AzMan data structure to allow unambiguous O365 enterprise user license assignment; and
  2. design a way to dynamically translate changes in the AD group memberships to corresponding O365 license assignments in AAD using MIM synchronisation.

What will the new solution look like?

The following is a conceptual overview of the architecture to get you thinking:


In my next post I will describe my AzMan data model, share a simple script to extract the data in a way that MIM can consume it, as well as the MIM extensions (custom MAs) which deliver the desired outcomes.

Posted in Active Directory, FIM (ForeFront Identity Manager) 2010, MIM (Microsoft Identity Manager) 2016, Windows AzMan | 1 Comment

Building in #MIM2016 Solution Resilience

My company UNIFY is well into its 12th year of existence (as is my tenure), and our “application-driven identity management” mantra has been a core principle in our solution approach from the beginning.  With the advent of cloud and so-called “hybrid” (off/on-premise) identity management I have not wavered from the belief that nothing changes in this regard.  That is despite the popular misconception from certain quarters that identity starts and ends with your on-premise directory.  Let’s just say your AD makes a lousy source of truth (SoT) for identity!

The benefits of aligning enterprise identity lifecycle to its HR platform are many, and at UNIFY we like to focus not just on the various sources of truth (e.g. students vs. staff in an education context), but on the events that should trigger change to an identity profile.  When it comes to harnessing these events, which may be in HR, the on-premise or cloud directory, or even in an LOB application such as a CRM, we are confident our Broker approach is second to none in driving timely identity synchronisation on platforms such as Microsoft Identity Manager (MIM) or any of its predecessors.  Our investment over more than a decade in a common application directory platform for driving IAM solutions is now paying dividends in the form of application sources such as HR systems being presented to an IAM platform such as MIM as an LDAP directory in its own right.

Yet behind the LDAP layer, not all HR systems are created equal … or rather, “some are more equal than others” … but more importantly, not all HR processes, be they BAU (business-as-usual) or EOM (end-of-month), are going to enable the HR platform to lend itself to becoming a proxy identity source from the outset.  Put another way … it’s unlikely you will find any HR manager’s KPIs relate to driving an IAM solution.  Here are some questions that might come to mind for the HR system owner:

  • Did anyone ask me if I should allow my HR system to become the primary authoritative ‘source of truth’ for all enterprise directory employee user profiles?
  • Where are all my extra staff going to come from to allow me to meet these new SLAs?
  • Why didn’t anyone think to tell me that entering the backlog of new staff records the day before the monthly pay cycle isn’t going to cut it any more?
  • I wonder if anyone has thought about what should happen when a contractor (or a bunch of them) take on a permanent role?

What the HR manager is NOT likely to ask, however, is the following:

  • How might the IAM platform perform during nightly batch processes?
  • I wonder when would be the best time to take the HR system offline for scheduled maintenance and backups?
  • What if I forget to tell anyone when I upgrade the HR platform or extend the schema?
  • What happens if I want to set up some new test processes in my production environment?
  • Do you think it would be OK for me to delete and reload the entire employee table each night (don’t laugh – I’ve seen this one!)?

Lately I’ve been revisiting the fundamental application-driven principles I’ve taken for granted as being the basis for all good IAM solutions, and asking this question:

“What if (unexpected) sh*t happens?”

The question kind of answered itself, and quickly became this:

“Given that it is inevitable that (unexpected) sh*t will happen, who’s responsible for dealing with it?”

Just quietly, I may have been guilty in the past of taking the high ground (subconsciously at least) by thinking to myself (perhaps even out loud) “That’s not a matter for the IAM solution to deal with – all problems must surely be addressed at the root!”.  Yet most enterprise environments are always in a state of flux, and it doesn’t help anyone to duck the problem when it might turn out that you are best equipped to make a difference in this regard – with just a bit of planning and lateral thinking.  This is not to say that the source systems are absolved from responsibility – far from it – but rather that it is better to be pro-active and take preventative action to avert a problem that has possibly yet to be seriously considered.

At this month’s (April 2016) MIMTeam User Group Skype Meeting I am presenting the topic “Watch out for that Iceberg!”.  In this session (which includes a demo of a repeatable MIM approach I wish to share with you), I will be asking what we (as IAM consultants, solution architects and implementers) can do to protect our customers or our own companies from unwanted SoT changes.  In particular, how can we be prepared for when unwanted changes happen in large volumes and wreak havoc on the unsuspecting systems and processes you’ve painstakingly aligned with that SoT?  What does the term “resilience” mean when used in the context of your IAM solution?

Please join me on the call (see when this is in my timezone) – looking forward to sharing some thoughts and ideas on this topic.

Posted in Active Directory, FIM (ForeFront Identity Manager) 2010, MIM (Microsoft Identity Manager) 2016, Uncategorized | Tagged , , | 2 Comments

Using .Where instead of | Where-Object

I’ve been fighting a problem today whereby the PowerShell Where-Object cmdlet was returning results of varying object types from the same XML document.  Specifically, I was trying to check the numbers of adds/deletes/updates from a CSExport xml file, where I had isolated the deltas to the following:

$deltas = $csdata.SelectNodes("//cs-objects/cs-object[@object-type='$addObjectClass']/pending-import/delta")

If I then used the Where-Object construct as follows:

$deletes = $deltas | Where-Object {$_."operation" -eq "delete"}

… I would get back an object of type either System.Xml.XmlLinkedNode (a single match) or System.Xml.XmlNodeList.  Because I simply wanted to access the Count property of the collection, this was failing (returning nothing) when the result was a single System.Xml.XmlLinkedNode.  In looking at the available methods of my $deltas System.Xml.XmlNodeList variable I noticed the Where method … another piece of pure gold!  The syntax is slightly different if I use this:

$deletes = $deltas.Where({$_."operation" -eq "delete"})

… BUT, the result is always a collection (System.Object[]), no matter how many elements match.

This means that regardless of what XML I am confronted with, the most reliable means of counting the number of changes is to use the “Where” method that PowerShell exposes on the System.Xml.XmlNodeList collection.
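To make the difference concrete, here is a minimal side-by-side (the file path and object type are placeholders):

```powershell
[xml]$csdata = Get-Content -Path 'C:\Drops\CSExport.xml' -ReadCount 0
$deltas = $csdata.SelectNodes("//cs-objects/cs-object[@object-type='user']/pending-import/delta")

# Pipeline version: a single match comes back as a bare XmlLinkedNode,
# so Count is unreliable.
$deletes = $deltas | Where-Object { $_."operation" -eq "delete" }

# Intrinsic .Where() method: always returns a collection, so Count is safe.
$deletes = $deltas.Where({ $_."operation" -eq "delete" })
$deletes.Count
```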

One step closer to a reliable means of consistently counting Pending Import and Pending Export changes from either my CSExport or Run Profile audit drop (xml) files :).

Posted in Event Broker for FIM 2010, FIM (ForeFront Identity Manager) 2010, MIM (Microsoft Identity Manager) 2016, Uncategorized, XML Programming | 1 Comment

Using -ReadCount 0 with Get-Content

Not that I’m an expert in PowerShell by any means, but here’s my tip of the day … use the -ReadCount parameter with Get-Content!

From the Get-Content page on TechNet …

Specifies how many lines of content are sent through the pipeline at a time. The default value is 1. A value of 0 (zero) sends all of the content at one time.
This parameter does not change the content displayed, but it does affect the time it takes to display the content. As the value of ReadCount increases, the time it takes to return the first line increases, but the total time for the operation decreases. This can make a perceptible difference in very large items.

The line in bold above for me was pure gold!  Right now I am working with parsing very large #FIM2010/#MIM2016 audit drop files to check change thresholds, and PowerShell is my tool of choice, not just because it integrates natively with MIM Event Broker (this idea will be the subject of another post a bit later).

Example for me just now with a 100 MB XML file:

  • 5 mins 17 seconds to load WITHOUT specifying any -ReadCount, with a memory footprint growth to 3.5 GB
  • 39 seconds with -ReadCount 0, and a memory footprint growth of only 750 MB.

Importantly, be aware that this setting is not appropriate while your script is still under construction, when the value should remain 1 (the default).  In my case I found that stepping through my script in the debugger with this set to 0 resulted in painful delays reloading large XML variables.  I am now thinking of using a $debug variable to toggle between 0 and 1 as appropriate – it should definitely be 0 once the script has been deployed.
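The toggle might look something like this (file path hypothetical):

```powershell
# Use the fast path in production, line-by-line while debugging.
$debug = $false
$readCount = if ($debug) { 1 } else { 0 }
[xml]$csdata = Get-Content -Path 'C:\Drops\RunProfileAudit.xml' -ReadCount $readCount
```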

So when loading large XML files in future I will certainly be using -ReadCount 0 unless I have a good reason not to – and it would have to be a good one to justify taking 8 times longer and using 5 times the memory :).

Posted in Event Broker for FIM 2010, FIM (ForeFront Identity Manager) 2010, MIM (Microsoft Identity Manager) 2016, Uncategorized | Leave a comment

#FIM2010 MIISActivate – FIM Sync service terminated with service-specific error %%-2146234334

Just posted by Peter Geelen – thought this worthy of a reblog for the #FIM2010 and #MIM2016 community.

Identity Underground

This article has been posted on TNWiki at: FIM2010 Troubleshooting: MIISActivate – FIM Sync service terminated with service-specific error %%-2146234334.


Failing over a FIM Sync Server to the standby FIM sync server using MIISActivate.

After successfully using MIISActivate, the FIM Sync service fails to start and logs an error in the event viewer.


You’ll see 2 error messages in the event viewer: error 7024 and error 6324.

Error 7024


This error is pretty similar or exactly like the error described in the following Wiki article:

FIM2010 Troubleshooting: FIM Sync service terminated with service-specific error %%-2146234334.


Error message Text

Log Name: System
Source: Service Control Manager
Date: 3/02/2016 15:08:59
Event ID: 7024
Task Category: None
Level: Error
Keywords: Classic
User: N/A
Computer: servername.domain.customer
The Forefront Identity Manager Synchronization Service service terminated with service-specific error %%-2146234334.
Event Xml:
<Event xmlns="">
<Provider Name="Service Control Manager" Guid="{555908d1-a6d7-4695-8e1e-26931d2012f4}" EventSourceName="Service Control Manager"…

View original post 735 more words

Posted in FIM (ForeFront Identity Manager) 2010, ILM (Identity Lifecycle Manager) 2007 | Leave a comment

The (#FIM2010) service account cannot access SQL Server …

Ran into this old chestnut just now and thought that it was worth re-visiting the outcome of an old forum post on the subject.

Before I get to the point, by way of background I always start out the installation process with a quick sanity check:

  1. Create a UDL file on the FIM Sync server desktop
  2. Configure the UDL file to connect to the SQL instance you are targeting
  3. Test for connectivity success

The above will ensure you can at least get to “first base” with SQL connectivity, negotiating firewall and network issues.
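If you prefer a scripted version of the same sanity check (server name illustrative), a raw SqlConnection does the same job as the UDL file:

```powershell
# Open and close a connection to the target instance using Windows auth.
$conn = New-Object System.Data.SqlClient.SqlConnection("Server=SQL01;Integrated Security=SSPI")
try {
    $conn.Open()
    "Connection succeeded"
}
catch {
    "Connection failed: $($_.Exception.Message)"
}
finally {
    $conn.Dispose()
}
```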

When installing the FIM Sync service any number of connectivity issues can prevent you progressing through the installer wizard.  For instance, if you’ve got a remote SQL database and you’ve forgotten to install the appropriate SQL Native Client then you will be stuck on the page configuring the SQL connection.

Once you get past this problem it’s generally onto the next … the configuration of the FIM Sync service account.  The full text of the error you might run into is this:

The service account cannot access SQL server. Ensure that the server is accesible, the service account is not a local account being used with a remote SQL server, and that the account doesn’t already have a SQL login.

The error text can be quite misleading – because (as was the case with the linked thread) the problem can be the installer access itself.  The installer account (not the service account itself) MUST be a member of the SQL sysadmin role to have any hope of progressing beyond this point.  Generally you will want to (or be asked to!) remove this access after a successful install.

Thanks to those who bother contributing answers to the TechNet forums – they are incredible time savers, often long after the threads are closed.

Posted in Uncategorized | 3 Comments