Cleaning and validating input data

The HR data source that I currently receive person data from has historically had data quality issues. Things are much better than they were in the past, but a few issues still crop up.

When I attended FIM training at OCG, I raised the issue of data cleanliness and was told in simple terms – make sure the input data is clean! If only life were so simple…

Back in reality, I have had to add code to my Advanced Flows to clean up and validate the input data.

A nice example follows – importing Surname from HR – dealing with:

  • Just plain bad data (null as a string value)
  • Validation (characters that should not be present – via regex replace)
  • Clean up (removing spaces from around hyphens in double-barrelled names) – there is also a bit of trimming to remove any spaces before or after the string value
  • Surname missing!
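The actual flow code isn't reproduced here, but the cleaning logic is easy to sketch in PowerShell. This is an illustration only – the character whitelist in the regex is my assumption, not the production rule:

    # Sketch of the surname clean-up logic (illustration only)
    function Repair-Surname {
        param([string]$Surname)

        # Plain bad data: empty, whitespace, or the literal string "null"
        if ([string]::IsNullOrWhiteSpace($Surname) -or $Surname -eq 'null') {
            throw 'Surname is missing or invalid'
        }

        # Validation: strip characters that should not be present (whitelist is an assumption)
        $Surname = $Surname -replace "[^A-Za-z '\-]", ''

        # Clean up: remove spaces from around hyphens in double-barrelled names
        $Surname = $Surname -replace '\s*-\s*', '-'

        # Trim any spaces before or after the string value
        return $Surname.Trim()
    }

    Repair-Surname 'Smith - Jones '   # returns 'Smith-Jones'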

Things like this remind me why “Codeless Provisioning” was something I fought (for too long) to get working, but ultimately had to abandon in favour of using code for almost everything. Doing so has been a real cure-all for the rules and other funnies that I have had to accommodate.

Note: I made a little edit – I was not checking for the presence of AccountName before raising errors – should that attribute have been missing (highly unlikely, but not unknown to occur), that would have raised an error in itself. The edited code is a little more robust!

Cancelling FIM Requests

A few days ago, I made a mistake while messing with my test set. I was getting bored of dropping myself in and out of a set (using the GUI) by toggling my AccountName between its real value and the real value plus an extra character. I was doing this to test my PowerShell workflow.

I was looking to use the Lithnet PowerShell cmdlets to add and remove myself as a manual member. While trying to make this work, I removed my AccountName from the criteria-based membership – and thus the mistake was made!

I saw rather a lot of PowerShell workflows kicking off. Initially, I didn’t understand what was happening, until I looked at the requests and saw that each request was for a different person. By removing my AccountName from the criteria-based membership, I had opened up the membership of my Test Set to “All Users”, so every user fell into the set and kicked off a PS workflow. Bugger!

I found this: http://www.integrationtrench.com/2013/09/use-powershell-to-cancel-bunch-of-fim.html, which I’ll leave here for future reference, but in the end I decided to just let the workflows run their course; it was late in the day anyway, and they would be done by the morning. A quick test of the commands in the referenced post indicated that it was going to take an age to gather all of those requests that were in a Post Processing state.
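For the record, the Lithnet RMA makes the first half of that job – finding the stuck requests – trivial. A quick sketch, with the service address being a placeholder:

    # Sketch: enumerate requests still in a Post Processing state (Lithnet RMA)
    Import-Module LithnetRMA
    Set-ResourceManagementClient -BaseAddress 'http://fimservice:5725'   # placeholder host

    $pending = Search-Resources -XPath "/Request[RequestStatus = 'PostProcessing']" -ExpectedObjectType Request
    $pending.Count   # how big is the backlog?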

Of course, by the time it had gathered them all, many would probably have been processed anyway…

Lesson learnt: change the scope of a set carefully!

Bulk disabling MPRs with Lithnet RMA

Now that my notifications are ready to go and there are a few real recipients (testers), I was considering what would happen when the FIMMA jobs were run – lots of emails!

The FIMMA had not been run properly for a few days – this is still a system being built – so there were a lot of attribute changes to export. Also, I’d had to put an MA back in temporarily to fix up some bad end dates – I needed to lower its precedence – so quite a few end dates had changed in the intervening period. That MA will be removed again shortly…

So, I needed to quickly disable all of the MPRs that would trigger sending the emails. Again, the Lithnet RMA provided a great and quick solution!

Comment out/uncomment the true/false lines as appropriate. I’ll need to re-enable all of these once the FIMMA has run through.
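The script itself isn’t shown here, but its shape is simple enough to sketch – the DisplayName filter is an assumption, matching however your notification MPRs are named:

    # Sketch: bulk disable (or re-enable) notification MPRs with the Lithnet RMA
    Import-Module LithnetRMA
    Set-ResourceManagementClient -BaseAddress 'http://fimservice:5725'   # placeholder host

    $mprs = Search-Resources -XPath "/ManagementPolicyRule[starts-with(DisplayName, 'Notify')]" -ExpectedObjectType ManagementPolicyRule

    foreach ($mpr in $mprs) {
        $mpr.Disabled = $true    # flip to $false when the FIMMA has run through
        Save-Resource $mpr
    }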

Using the Lithnet RMA to make the changes from the last post

After creating all of the PowerShell workflow scripts and their respective HTML files, I started looking at what I’d need to change in my Lithnet XML configuration file.

Wouldn’t it be nice if I could use that to update the configuration? What do I need to change? The MPRs can remain and the sets are still valid, so there are only two changes per process/collection – remove the email template, as it is now superfluous, and update the workflow definition.

What does a PowerShell XOML look like? I have no idea!

But using the Lithnet RMA, I can get it from my development workflow:
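Something along these lines – the workflow display name is a placeholder:

    # Sketch: fetch the development workflow and inspect its XOML (Lithnet RMA)
    $wf = Search-Resources -XPath "/WorkflowDefinition[DisplayName = 'PowerShell Workflow - Dev']" -ExpectedObjectType WorkflowDefinition
    $wf.XOML   # the raw XOML, ready to drop into the configuration file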

Great, so now I only need to comment out (or delete) the “Out” email templates and modify the existing XOML files for the “Out” operations to reflect the PowerShell workflow – pointing at the correct script – easy. Then run the Import-RMConfig cmdlet again to import those changes. I’ll just delete those few mail templates manually.
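Running with -Preview first shows what would change without committing anything; the file name below is a placeholder:

    # Preview the configuration changes, then apply them for real
    Import-RMConfig -File .\MailNotifications.xml -Preview -Verbose
    Import-RMConfig -File .\MailNotifications.xml -Verbose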

Mail notification process continued…

After all of the previous work defining the pieces to get the mail notifications working, I noted that I was still getting mails that had blanks for the user’s details – where the attributes were referenced like [//Target/AccountName] in the email template.

I understood the “why” – the accounts had been deleted – thus they were a transition out, but the object going through the workflow no longer had any attributes! Now to find a solution….

I put a post on the Technet forums: https://social.technet.microsoft.com/Forums/en-US/b9bf96b9-ac1f-4039-889d-8063fe45058f/mail-notification-with-blank-targetaccountname?forum=ilm2, to which Brian offered some useful suggestions.

I thought that a custom workflow would be the most elegant solution and would give me a good opportunity to learn; although it might have been a little overkill for the issue, it would be a simple workflow. Oh, how wrong I was 🙂

I started trawling around looking for examples to get me started, mostly in C# – which I’m not particularly comfortable with! Of particular note was http://www.fimspecialist.com/fim-portal/custom-workflow-examples/ and this https://social.technet.microsoft.com/Forums/en-US/66924b6d-8331-4389-a34e-8e45ab9cc765/custom-workflow-activity?forum=ilm2, which helped with the method to get the AccountName attribute.

So, after a fair few failed attempts with mysterious compilation errors, I think I finally have it: a workflow that takes the current request and looks for the AccountName attribute – it even builds successfully (wow)! So let’s see if just that little bit works, in debugging mode… Oh, I also need an AIC (Activity Information Configuration) to be created – looking at the documentation, I try this and that, getting nowhere and frustrated…

So then I find Carol Wapshere’s VB.NET logging example: http://www.wapshere.com/missmiis/the-fim-2010-custom-logging-activity-in-vb-net. Great, let’s use this as a basis. But first, make sure I can get it working on its own… Failure again… Why?? The AIC?

By now I have really had enough – it’s not like me to give up, but for now I’m through with this black magic! I’ll come back to dig into this again later. I need a workaround for now, and I know that the PowerShell Workflow will provide it – everything is possible with PowerShell!

Over the weekend I have been mulling over what I need to do using the PowerShell Workflow (http://fimpowershellwf.codeplex.com/). Looking at the logging example script, I see that this should be easy. So, I start playing… OK, so I have my target requestor – how do I get the attribute values from it? The documentation is very good in terms of getting going, but does not offer much in the way of examples – e.g. getting the data out of the PSObject. Using Get-Member, I can see that all of the attributes I want are present, but when I refer to them in the script as $Target.AccountName, the returned value appears to be $null when run in the PS workflow. However, I can see that this is not the case in a standalone PS window.

Eventually, I find a way of getting the data out – e.g. $Target.Get($_).AccountName. I’m pondering why I need to jump through these hoops, but am happy that I got there in the end. I published my initial “solution” to my original question on TechNet, and Brian came back suggesting that I shouldn’t need this convoluted way of getting the data – $Target.AccountName should just work!

After a few exchanges, it seems that by not including the -OnlyBaseResources switch on the Export-FimConfig cmdlet, I am getting referentially linked objects as well as the object that I have requested. By adding this switch, I can now get the data cleanly via $Target.AccountName.
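In other words, the export inside the workflow script should look something like this sketch – the URI, the $targetId variable and the Convert-FimExportToPSObject helper (from the FIM ScriptBox) are assumptions:

    # Sketch: export just the target person, without referentially linked objects
    $exported = Export-FIMConfig -Uri 'http://localhost:5725/ResourceManagementService' -OnlyBaseResources -CustomConfig "/Person[ObjectID = '$targetId']"

    $Target = $exported | Convert-FimExportToPSObject   # helper function, assumed available
    $Target.AccountName                                 # now returns the value directly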

To make use of this process, I now have to modify the transition-out process: delete the transition-out Email Template, then change the workflow so that it uses the PowerShell workflow – with the path to the PS script for each type of OU move. The HTML files used previously for the move operation also need to be modified so they can make use of the data/attributes being assigned to variables in the script.

So, here is the script that I put together. I need a copy for each move type, plus an amended HTML file for each move – no big deal, just some simple search and replace.

Script to send email notification only if the AccountName exists:
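A minimal sketch of that idea – the SMTP details, file paths and attribute tokens are placeholders, not the production values:

    # Sketch: send the notification only when AccountName is present
    $exported = Export-FIMConfig -Uri 'http://localhost:5725/ResourceManagementService' -OnlyBaseResources -CustomConfig "/Person[ObjectID = '$targetId']"
    $Target = $exported | Convert-FimExportToPSObject

    if ($Target.AccountName) {
        # The body file carries tokens such as [AccountName], swapped here for real values
        $body = Get-Content 'C:\FIM\Scripts\MoveOut.html' -Raw
        $body = $body -replace '\[AccountName\]', $Target.AccountName -replace '\[DisplayName\]', $Target.DisplayName

        Send-MailMessage -To 'notify@example.com' -From 'fim@example.com' -Subject "Departure: $($Target.DisplayName)" -Body $body -BodyAsHtml -SmtpServer 'smtp.example.com'
    }
    # If AccountName is empty (the object has already been deleted), do nothing - no more blank emails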

The Email body file is also amended as shown:

File server migration script with case statement

Here is the script used to migrate clients from the old servers to the new domain-based DFS namespace. Note that from_srv and to_srv need to be amended, and the case statement needs to be constructed based on the content of the Drive Mapping script output.
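In outline, the re-mapping logic looks like this sketch – the server names and the switch contents are placeholders, to be built from your own drive-mapping data (in PowerShell the case statement is a switch):

    # Sketch of the client re-mapping logic - amend $from_srv and $to_srv
    $from_srv = '\\oldserver'            # server being retired
    $to_srv   = '\\example.local\dfs'    # new domain-based DFS namespace

    # Walk the user's persistent drive mappings held under HKCU:\Network
    Get-ChildItem HKCU:\Network | ForEach-Object {
        $drive  = $_.PSChildName
        $remote = (Get-ItemProperty $_.PSPath).RemotePath

        if ($remote -like "$from_srv\*") {
            $share = ($remote -split '\\')[3]   # the share name part of \\server\share

            # Built from the Drive Mapping script output: old share name -> new DFS folder
            $newPath = switch ($share) {
                'users$'   { "$to_srv\Users" }
                'projects' { "$to_srv\Projects" }
                default    { $null }            # unknown share: leave it alone
            }

            if ($newPath) {
                net use "$($drive):" /delete
                net use "$($drive):" $newPath /persistent:yes
            }
        }
    }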

File server migration – Drive Mapping Script

A long time before planning a file server migration, I thought that it would be interesting to see which drives were mapped by our users.

We do not use a login script or GPO Preferences to do this and there is a long history of “Stuff” on the old servers.

Initially, it was there to help the helpdesk staff re-map drives on new machines where the user didn’t know what the target was – it was just their J drive…

Coming back to the file server migration, the historical data in this file was invaluable.

By comparing the drive mapping data with the shares defined on the target server, a list of share names and their long paths was produced. This list was then used to construct the case statement within the server migration/re-mapping script.

I’ll publish the drive-mapping script here now, as it was easy to sanitise. I’ll provide the re-mapping and rollback scripts separately.

Note that this script must be run as a Logon script in the user part of a GPO. Additionally, the target folder is a very open hidden share.
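In outline, the gathering amounts to something like this – the hidden share path is a placeholder, and the real script records a little more detail:

    # Sketch: log each user's mapped drives to a central CSV on a hidden share
    $logFile = '\\fileserver\drivemaps$\mappings.csv'   # the very open hidden share

    Get-WmiObject -Class Win32_MappedLogicalDisk | ForEach-Object {
        # DeviceID is the drive letter (e.g. J:), ProviderName the UNC target
        $line = '{0},{1},{2},{3},{4}' -f (Get-Date -Format 'yyyy-MM-dd'),
            $env:USERNAME, $env:COMPUTERNAME, $_.DeviceID, $_.ProviderName
        Add-Content -Path $logFile -Value $line
    }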

Exchange Mailbox Delegation

This is another of my old but nice scripts. At the time, there was some discussion about how Exchange team members working at different geographical sites were dealing with mailbox delegation. There was no consistency.

I read at the time about using EWS for Exchange management – notably this: http://gsexdev.blogspot.com/2009/04/add-delegates-to-mailbox-with.html. I wrapped that method in a bit more PowerShell to provide a consistent, simple to use interface for the Exchange team members.

Note that this script retries repeatedly (up to 60 times, over 30 minutes) until it sees that the calendar permissions are set correctly. This was introduced to allow permissions to be set when our Exchange infrastructure had been failed over to our other main site – a rare occurrence. The replication interval between those sites is set to 15 minutes; hence, it should have completed successfully within 30 minutes, or something more fundamental has gone wrong!
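The delegation itself follows the EWS method from the linked post; the retry wrapper is conceptually this sketch, with mailbox names and the expected access right as placeholders:

    # Sketch: poll until the calendar permission is visible - 60 tries x 30 seconds = 30 minutes
    $mailbox  = 'manager@example.com'      # placeholder
    $delegate = 'assistant@example.com'    # placeholder
    $attempt  = 0

    do {
        $perm = Get-MailboxFolderPermission -Identity "$($mailbox):\Calendar" -User $delegate -ErrorAction SilentlyContinue
        if ($perm -and $perm.AccessRights -contains 'Editor') {
            Write-Output "Calendar permission confirmed after $attempt checks."
            break
        }
        Start-Sleep -Seconds 30
        $attempt++
    } while ($attempt -lt 60)

    if ($attempt -ge 60) {
        Write-Error 'Permission still not visible after 30 minutes - something more fundamental has gone wrong.'
    }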

Creating a professional looking “Legal Text”

So a few years ago, our Security group asked us to implement a Legal warning on all Windows clients.

The simple way of doing this is to just add the Title and Text to the following GPO settings:

Computer Configuration/Windows Settings/Security Settings/Local Policies/Security Options –

  • Interactive logon: Message title for users attempting to log on
  • Interactive logon: Message text for users attempting to log on

However, these settings provide no layout options whatsoever. To try to force some semblance of a layout, you need to introduce a character (we used *) at the beginning of each line and then add spaces to get an indent. Adding carriage returns in the GPO editor has no effect! You could just plop the text in as one long line… Either way, it looks crap!

We also have some display screens scattered about that auto-logon. These machines have to be excluded from getting the legal notice, otherwise they would not auto-logon. The initial solution was to just create another OU for these machines to exclude them from receiving the policy settings. This is simple but messy: forget to put them (pre-staged) in the right place before they are added to the domain, and you then need to trawl through the registry removing the settings…

I was looking for an alternative and found this article on the scripting guys website:

http://blogs.technet.com/heyscriptingguy/archive/2005/01/17/how-can-i-change-the-legal-warning-message-using-a-script.aspx

I used this as the basis for the script below. My version is applied at the root of the domain as a computer start-up script in a “Base” GPO. This way every client gets the legal text (except DCs – the script is applied to them separately).

Those auto-logon machines are handled automatically. A security group in the domain contains the computer accounts for those machines that you don’t want to get the legal text. If the computer account is in that group, the script ensures that the legal text is not set; if the machine already has the legal text and is then added to the group, the legal text is removed automagically!

Here is the script:
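A PowerShell sketch of that logic – the group name, caption and text are placeholders, and the real script handles a few more edge cases:

    # Sketch: computer start-up script to set or clear the legal notice
    $regPath = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System'
    $caption = 'Authorised Use Only'   # placeholder title
    $text    = "Use of this system is monitored.`r`nDisconnect now if you are not authorised."

    # Is this computer in the exclusion group? (group name is a placeholder)
    $filter   = "(&(objectClass=computer)(sAMAccountName=$env:COMPUTERNAME`$))"
    $computer = ([ADSISearcher]$filter).FindOne()
    $excluded = $computer.Properties['memberof'] -match '^CN=LegalText-Excluded,'

    if ($excluded) {
        # Auto-logon machines: make sure no legal text is set, clearing any leftover values
        Set-ItemProperty -Path $regPath -Name LegalNoticeCaption -Value ''
        Set-ItemProperty -Path $regPath -Name LegalNoticeText   -Value ''
    } else {
        # Setting the values by script lets us embed real line breaks, unlike the GPO UI
        Set-ItemProperty -Path $regPath -Name LegalNoticeCaption -Value $caption
        Set-ItemProperty -Path $regPath -Name LegalNoticeText   -Value $text
    }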

An update on Using Ryan Newington’s PowerShell Module to create a Mail Notification process

OK, so although all the XML definitions worked fine and the objects were created, the differentiator between a New User and an Existing User moving into a Department was a little poor.

The new user email notification was configured to use an attribute that would make it clear to the reader of the mail if it was a new or existing person. However, by the time this attribute had reached the portal (for a new user), the mail had already been sent with a blank attribute referenced…. Not good!

So, a little re-think… I need another set of Sets to handle new versus existing people, for each of those groups of people who want to know!

The existing Set (OU=Blah) was used for both new and existing people: a Transition In was a new user or a user moving into that department, and a Transition Out was a person leaving that department. This Set will now be used for new accounts only.

A new Set was defined: OU=Blah AND Portal CreationDate prior to today – this will handle existing users moving into or out of a department. Urgh, but that means I need move-in and move-out MPRs, WFs and templates, plus HTML and XOML files. I am not provided an Employee Start Date from HR, so I cannot use that.

Using the PowerShell Module, this is a little easier, but what started as just a re-hack of my original XML file turned into a slight re-design of the XML: a tidy-up, fixing some capitalisation (to allow for easier search/replace) and a better layout for readability.

The existing Move MPR, WF and Template are re-purposed to become the ‘Active’ user move-in (New User), then another series is created for existing-user moves in and out (Movers). I also added another recipient for the initial period of testing the notifications – the real recipients will be added once I see that all is working as expected…

“—” is used as the search text and replaced with the relevant Department acronym – for me this joins everything up! Note that each template, once fixed, should sit within the Lithnet wrapper:

The template job list is as follows:

And the XOML is slightly modified to match the XML: the recipients reference now has two entries, and the Email Template reference is updated to match the ID for that template in the XML, e.g. (showing “—” where the Department acronym should be):