Since introducing O365 to my organisation, there has been a steady flow of people wanting to use services within it, the licences for which have been applied on an ad-hoc basis by several global admins.
Now that we are in a position to make these services more official/production, I needed to license those users who were not yet licensed, plus sort out the improperly licensed people.
It seems that either one of the global admins assigning licences, or perhaps self-registered users, ended up with the Information Worker licences – e.g. STANDARDWOFFPACK_IW_STUDENT. This was not desired, so my script had to sort those out as well!
Additionally, people whose status changed from being staff to anything else needed to have any staff licence revoked and replaced with a student licence – and vice versa.
I also wanted to control who should be given the individual licences, like Office Pro Plus and Dynamics. For the moment, those entitled to these licences are held in text files – I’m working on migrating to AD groups instead, but for now I needed something that just works!
The script logs added and removed licences to C:\Office365-Scripts\Licencing\LicenceManagement.txt.
Error emails are based on the content of the $error variable, using a function to generate the mail body.
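As a rough illustration of the swap logic (a sketch only – the tenant prefix, the target student SKU, the UPN, the credential variable and the mail settings are placeholder assumptions, not my real values):

```powershell
# Sketch only – tenant prefix, target SKU, UPN, credential and mail settings are placeholders
Import-Module MSOnline
Connect-MsolService -Credential $cred          # $cred comes from Credential Manager (see the later section)

$logFile = 'C:\Office365-Scripts\Licencing\LicenceManagement.txt'
$upn     = 'someone@mydomain.ac.uk'
$badSku  = 'mytenant:STANDARDWOFFPACK_IW_STUDENT'    # unwanted Information Worker licence
$goodSku = 'mytenant:STANDARDWOFFPACK_STUDENT'       # the licence that should be assigned

$user = Get-MsolUser -UserPrincipalName $upn

if ($user.Licenses.AccountSkuId -contains $badSku) {
    # Swap the IW licence for the proper one and log the change
    Set-MsolUserLicense -UserPrincipalName $upn -RemoveLicenses $badSku -AddLicenses $goodSku
    Add-Content -Path $logFile -Value "$(Get-Date) Replaced $badSku with $goodSku for $upn"
}
elseif ($user.Licenses.AccountSkuId -notcontains $goodSku) {
    # Not yet licensed at all – just add the correct licence
    Set-MsolUserLicense -UserPrincipalName $upn -AddLicenses $goodSku
    Add-Content -Path $logFile -Value "$(Get-Date) Added $goodSku for $upn"
}

if ($error.Count -gt 0) {
    # Error mail – body built from $error; addresses and SMTP server are placeholders
    Send-MailMessage -To 'o365-admins@mydomain.ac.uk' -From 'licence-script@mydomain.ac.uk' `
        -Subject 'Licence management errors' -Body ($error | Out-String) -SmtpServer 'smtp.mydomain.ac.uk'
}
```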
The initial service certificate used for my ADFS service was relatively simplistic – containing only one UPN suffix.
Recently, the security team stated that they wished to start using Intune, via SCCM, so I needed to enable the Workplace Join functionality on the ADFS server farm.
In order to do so, I obtained a new certificate with the additional SAN enterpriseregistration.<MyDomain>; I then needed to replace the service certificate on the ADFS servers in the farm. This was the process I followed:
1. Install the new certificate on all ADFS servers in the farm
2. Allow the ADFS service account read permissions on the private key
3. On the primary ADFS server, set the new service certificate
4. Obtain the thumbprint of the new certificate, e.g.: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
5. On all ADFS proxy servers – Set-WebApplicationProxySslCertificate -Thumbprint xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
6. On the primary ADFS server – Set-AdfsSslCertificate -Thumbprint xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
7. Restart the ADFS service on all servers
8. Test, by doing a federated login via portal.microsoftonline.com
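Pulling the PowerShell parts of those steps together gives something like the following sketch (the thumbprint is the placeholder from above, and using Set-AdfsCertificate for step 3 is my assumption if you script it rather than use the console):

```powershell
# Thumbprint placeholder – replace with the real value from the new certificate
$thumbprint = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

# Step 3 (primary ADFS server) – set the new service communications certificate
Set-AdfsCertificate -CertificateType Service-Communications -Thumbprint $thumbprint

# Step 6 (primary ADFS server) – update the SSL binding
Set-AdfsSslCertificate -Thumbprint $thumbprint

# Step 5 (each ADFS proxy / Web Application Proxy server)
Set-WebApplicationProxySslCertificate -Thumbprint $thumbprint

# Step 7 – restart the ADFS service on each server
Restart-Service adfssrv
```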
When I initially started delving into the management of my Office 365 tenancies, I was looking for a secure way of providing a username and password, with which to authenticate (connect) to MSOL.
I didn’t particularly like the idea of putting plain text usernames and passwords into my scripts. However, shortly after getting my ADFS servers up and running, I applied the Microsoft Office 365 Federation Metadata Update Automation Installation Tool. If you look at how this works, you see that it uses Credential Manager to store the username and password. It extracts these when needed to connect to MSOL. This seemed like a nice clean solution, so I “borrowed” the methodology for most of my O365 scripts.
Take note that the Windows Credential Manager is profile specific, so if you need a service account to run some of your scheduled tasks or the like, then you need to create the credential under that profile. If you ever want to change the password of the account stored in Credential Manager, you can change it via the GUI, but remember that you need to do it for every profile that might use it.
So, first we need to create the credential that we want to use in our scripts:
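A minimal sketch of creating it, assuming the CredentialManager module from the PowerShell Gallery rather than the CredMan code the federation metadata tool ships with (the username and password below are obviously placeholders):

```powershell
# Run this under the profile that will use the credential (e.g. the service account)
Import-Module CredentialManager

$TargetName = 'LicenceManagment'    # the label used later to look the credential up

New-StoredCredential -Target $TargetName `
                     -UserName 'svc-o365@mydomain.onmicrosoft.com' `
                     -Password 'NotMyRealPassword' `
                     -Type Generic `
                     -Persist LocalMachine
```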
Note the line $TargetName = “LicenceManagment” – this is the label of the “Credential”; it is what the subsequent scripts use to retrieve the credential.
OK, so now we have the credential stored, let’s test it – I often end up using this generic connection script when I want to have a quick look at something in MSOL:
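A minimal version looks something like this (again assuming the CredentialManager module for the Credential Manager lookup):

```powershell
# Generic MSOL connection – retrieves the stored credential by its label
Import-Module CredentialManager
Import-Module MSOnline

$cred = Get-StoredCredential -Target 'LicenceManagment'
Connect-MsolService -Credential $cred

# Quick sanity check that the connection works
Get-MsolCompanyInformation | Select-Object DisplayName
```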
Thus, this generic connection script is the basis for most of my management scripts; the rest is just bolted onto the bottom.
The HR data source that I currently receive person data from has historically had data quality issues. Things are much better than they were in the past, but a few issues still crop up.
When I attended FIM training at OCG, I raised the issue of data cleanliness and was told in simple terms – make sure the input data is clean! If only life were so simple…
Back to reality, I have had to add code to my Advanced Flows to deal with, clean up and validate the input data.
A nice example follows – importing Surname from HR – dealing with:
Just plain bad data (null as a string/value)
Validation (characters that should not be present – via regex replace)
Clean up (removing spaces from around hyphens in double-barrelled names) – plus a bit of trimming to remove any spaces before or after the string value
Surname missing!
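The real logic lives in the rules extension, but the gist of it, sketched here in PowerShell (the allowed character set and the error text are illustrative):

```powershell
function Clean-Surname {
    param(
        [string]$Surname,
        [string]$AccountName
    )

    # Plain bad data: missing values, or the literal string "null" from HR
    if ([string]::IsNullOrWhiteSpace($Surname) -or $Surname -eq 'null') {
        if ([string]::IsNullOrWhiteSpace($AccountName)) {
            throw 'Surname missing and no AccountName present on the record'
        }
        throw "Surname missing for $AccountName"
    }

    # Validation: regex replace to strip characters that should not be present
    $Surname = $Surname -replace "[^A-Za-z\-' ]", ''

    # Clean up: remove spaces from around hyphens in double-barrelled names
    $Surname = $Surname -replace '\s*-\s*', '-'

    # Trim any spaces before or after the value
    return $Surname.Trim()
}
```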
Things like this remind me of why “Codeless Provisioning” was something I fought to get working (for too long), but ultimately had to abandon in favour of using code for almost everything. Doing so has been a real panacea for all of the rules and other funnies that I have had to accommodate.
Note: I made a little edit – I was not checking for the presence of AccountName before raising errors – should that attribute have been missing (highly unlikely, but not unknown to occur), that would have raised an error in itself. The edited code is a little more robust!
A few days ago, I made a mistake while messing with my test set. I was getting bored of making myself drop in and out of a set (via the GUI) by changing my AccountName between my real account name and my AccountName plus an extra character – all just to test my PowerShell workflow.
I was looking to use the Lithnet PowerShell cmdlet to add and remove me as a manual member. While trying to make this work, I removed my AccountName from the criteria based membership – thus the mistake was made!
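The plan was something along these lines (a sketch only – the set and account names are placeholders):

```powershell
# Sketch of adding/removing a manual member with the Lithnet RMA – names are placeholders
Import-Module LithnetRMA

$set = Get-Resource -ObjectType Set -AttributeName DisplayName -AttributeValue 'My Test Set'
$me  = Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue 'myaccount'

# Add me as a manual (ExplicitMember) member...
$set.ExplicitMember.Add($me.ObjectID)
Save-Resource $set

# ...and later remove me again
$set.ExplicitMember.Remove($me.ObjectID)
Save-Resource $set
```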
I saw rather a lot of PowerShell workflows kicking off. Initially, I didn’t understand what was happening, until I looked at the requests and saw that each request was for a different person. By removing my AccountName from the criteria based membership, I had opened up the membership of my Test Set to “All Users”, thus all users fell into the set and kicked off a PS workflow. Bugger!
I found this: http://www.integrationtrench.com/2013/09/use-powershell-to-cancel-bunch-of-fim.html, which I’ll leave here for future reference. In the end, though, I decided to just let the workflows progress; it was late in the day anyway, and they would be done by the morning. A quick test of the commands in the referenced post indicated that it was going to take an age just to gather all of those requests that were in a Post Processing state.
Of course, by the time it had got them all, many would probably have been processed anyway.
Lesson learnt: change the scope of a set carefully!
Now that my notifications are ready to go and there are a few real recipients (testers), I was considering what would happen when the FIMMA jobs were run – lots of emails!
The FIMMA had not been run properly for a few days – this is still a system being built – so there were a lot of attribute changes to export. Also, I’d had to put an MA back in temporarily to fix up some bad end dates, and I needed to fix precedence so that it was now lower – thus quite a few end dates had changed in the intervening period. That MA will be removed again shortly…
So, I needed to quickly disable all of the MPRs that would trigger sending the emails. Again, the Lithnet RMA was a great and quick solution!
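A quick sketch of what that looks like (the XPath filter is a placeholder for however your notification MPRs are named):

```powershell
# Disable (or re-enable) the notification MPRs – the XPath filter is a placeholder
Import-Module LithnetRMA

$mprs = Search-Resources -XPath "/ManagementPolicyRule[starts-with(DisplayName,'Notify')]"

foreach ($mpr in $mprs) {
    $mpr.Disabled = $true
    #$mpr.Disabled = $false
    Save-Resource $mpr
}
```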
Comment out/uncomment the true/false lines as appropriate. I’ll need to re-enable all of these again once the FIMMA has run through.
After creating all of the PowerShell workflow scripts and their respective html files, I started looking at what I’d need to change in my Lithnet XML configuration file.
Wouldn’t it be nice if I could use that to update the configuration? What do I need to change? The MPRs can remain and the sets are still valid, so there are only two changes per process/collection – remove the email template, as it is now superfluous, and update the workflow definition.
What does a PowerShell XOML look like? I have no idea!
But using the Lithnet RMA, I can get it from my development workflow:
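For example (the workflow display name and output path are placeholders):

```powershell
# Grab the XOML from an existing PowerShell workflow definition – display name is a placeholder
Import-Module LithnetRMA

$wf = Get-Resource -ObjectType WorkflowDefinition -AttributeName DisplayName -AttributeValue 'My Dev PowerShell Workflow'
$wf.XOML | Out-File -FilePath 'C:\Temp\PowerShellWorkflow.xoml'
```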
Great, so now I only need to comment out (or delete) the “Out” email templates and modify the existing XOML files for the “Out” operations to reflect the PowerShell workflow – pointing at the correct script – easy. Then run the Import-RMConfig cmdlet again to import those changes. I’ll just delete those few mail templates manually.
After all of the previous work defining the pieces to get the mail notifications working, I noted that I was still getting mails that had blanks for the users’ details – where the attributes were being referenced like [//Target/AccountName] in the email template.
I understood the “why” – the accounts had been deleted, thus they were a transition out, but the object going through the workflow no longer had any attributes! Now to find a solution…
I thought that a custom workflow would be the most elegant solution and would give me a good opportunity to learn; although it might have been overkill for the issue, it would be a simple workflow. Oh, how wrong I was 🙂
So, after a fair few failed attempts with mysterious compilation errors, I think I finally have it: a workflow that takes the current request and looks for the AccountName attribute – it even builds successfully (wow)! So let’s see if just that little bit works, in debugging mode… Oh, and I also need an AIC (Activity Information Configuration) to be created – looking at the documentation, I try this and that, getting nowhere and frustrated…
By now I have really had enough – it’s not like me to give up, but for now I’m through with this black magic! I’ll come back to dig into this again later, just not now… I need a workaround for the issue, and I know that the PowerShell Workflow will answer it – everything is possible with PowerShell!
Over the weekend I have been mulling over what I need to do using the PowerShell Workflow (http://fimpowershellwf.codeplex.com/). Looking at the Logging example script, I see that this should be easy. So, I start playing… OK, so I have my target requestor – how do I get the attribute values from it? The documentation is very good in terms of getting going, but does not offer much in the way of examples – e.g. getting the data from the PSObject. Using Get-Member, I can see that all of the attributes I want are present, but when I refer to them in the script as $Target.AccountName, the returned value appears to be $NULL when run in the PS workflow. However, I can see that this is not the case in a standalone PS window.
Eventually, I find a way of getting the data out – e.g. $Target.Get($_).AccountName. I ponder why I need to jump through these hoops to get the data, but am happy that I got there in the end. I published my initial “solution” to my original question on TechNet, and Brian came back suggesting that I don’t need this convoluted way of getting the data – $Target.AccountName should just work!
After a few exchanges, it seems that by not including the -OnlyBaseResources switch on the Export-FimConfig cmdlet, I am getting referentially linked objects as well as the object that I have requested. By adding this switch, I can now get the data cleanly via $Target.AccountName.
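For reference, the lookup ends up shaped something like this (Convert-FimExportToPSObject is the commonly used flattening helper, and $TargetId stands in for however the workflow hands the target’s ObjectID to the script):

```powershell
Add-PSSnapin FIMAutomation -ErrorAction SilentlyContinue

# Commonly used helper to flatten an Export-FIMConfig result into a PSObject
function Convert-FimExportToPSObject {
    param([Parameter(ValueFromPipeline = $true)] $ExportObject)
    process {
        $psObject = New-Object PSObject
        foreach ($attr in $ExportObject.ResourceManagementObject.ResourceManagementAttributes) {
            $value = if ($attr.IsMultiValue) { $attr.Values } else { $attr.Value }
            $psObject | Add-Member -MemberType NoteProperty -Name $attr.AttributeName -Value $value
        }
        $psObject
    }
}

# $TargetId is assumed to be the ObjectID handed to the script by the PowerShell workflow
$filter = "/Person[ObjectID='{0}']" -f $TargetId

# -OnlyBaseResources stops referentially linked objects coming back with the target
$Target = Export-FIMConfig -CustomConfig $filter -OnlyBaseResources | Convert-FimExportToPSObject

$Target.AccountName   # now returns the value directly
```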
To make use of this process, I now have to modify the transition-out process: delete the transition-out Email Template, then change the workflow so that it uses the PowerShell workflow, with the path to each PS script for each type of OU move. The html files used previously for the move operation also need to be modified so that they can make use of the data/attributes being assigned to variables in the script.
So, here is the script that I put together; I need a copy for each move type, plus an amended html file for each move – no big deal, just some simple search and replace.
Script to send email notification only if the AccountName exists:
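In outline it does something like the following (the SMTP server, addresses, template path and the tokens replaced in the html are placeholders):

```powershell
# Outline only – SMTP server, addresses, html template path and tokens are placeholders
if (-not [string]::IsNullOrEmpty($Target.AccountName)) {

    # Pull in the html body and substitute the attribute values gathered above
    $body = Get-Content -Path 'C:\FIM-Scripts\Templates\MoveOut.html' -Raw
    $body = $body -replace '%%AccountName%%', $Target.AccountName
    $body = $body -replace '%%DisplayName%%', $Target.DisplayName

    Send-MailMessage -To 'servicedesk@mydomain.ac.uk' `
                     -From 'fim-notifications@mydomain.ac.uk' `
                     -Subject "Account moved: $($Target.AccountName)" `
                     -Body $body -BodyAsHtml `
                     -SmtpServer 'smtp.mydomain.ac.uk'
}
# If AccountName is absent the object has been deleted, so no notification is sent
```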
Here is the script used to migrate the clients from the old servers to the new domain-based DFS namespace. Note that from_srv and to_srv need to be amended, and the case statement needs to be constructed based on the content of the Drive Mapping script output.
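In rough outline it does something like this (server names, shares and drive letters are placeholders, and the switch cases must be built from the Drive Mapping output):

```powershell
# Rough outline – $from_srv/$to_srv and the switch cases are placeholders
$from_srv = 'oldfileserver'
$to_srv   = '\\mydomain.local\dfs'   # the new domain-based DFS namespace

# Enumerate the user's current drive mappings
Get-WmiObject -Class Win32_MappedLogicalDisk | ForEach-Object {

    if ($_.ProviderName -like "\\$from_srv\*") {

        $drive = $_.Name                                  # e.g. 'H:'
        $share = ($_.ProviderName -split '\\')[-1]        # last path element, e.g. 'home'

        # Case statement built from the Drive Mapping script output
        switch ($share) {
            'home'    { $newPath = "$to_srv\Home" }
            'shared'  { $newPath = "$to_srv\Shared" }
            default   { $newPath = $null }
        }

        if ($newPath) {
            net use $drive /delete
            net use $drive $newPath /persistent:yes
        }
    }
}
```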