Part of the process of getting the script in the previous post working was having a record of who is licensed *now*.
This was done using the following script. Note that the commented lines must be un-commented to log the data; I am currently using this script to keep the AD groups up to date, hence the Add-ADGroupMember lines are uncommented. AD groups will soon be replacing the text files.
Since introducing O365 to my organisation, there has been a steady flow of people wanting to use services within it, the licences for which have been applied on an ad-hoc basis by several global admins.
Now that we are in a position to make these services more official/production, I needed to licence those users who were not yet licensed, plus sort out the improperly licensed people.
It seems that people were ending up with the Information Worker licences – e.g. STANDARDWOFFPACK_IW_STUDENT – either assigned by one of those global admins or through self-registration. This was not desired, so my script had to sort those out as well!
Additionally, people whose status changed from being staff to anything else needed to have any staff licence revoked and replaced with a student licence – and vice versa.
I also wanted to control who should be given the individual licences, like Office Pro Plus and Dynamics. For the moment, those entitled to these licences are held in text files – I’m working on migrating to AD groups instead, but for now I needed something that just works!
The script logs added and removed licences to C:\Office365-Scripts\Licencing\LicenceManagement.txt.
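The script itself isn’t reproduced in this extract, but a minimal sketch of the pattern – swap the bad SKU, log the change, keep the AD group in sync – might look like the following. The tenant prefix, SKU names, group name and sAMAccountName derivation are all illustrative assumptions, not taken from the original:

```powershell
# Minimal sketch, assuming the MSOnline and ActiveDirectory modules.
# Tenant prefix, SKU names and group name are illustrative assumptions.
Import-Module MSOnline
Import-Module ActiveDirectory

$LogFile    = 'C:\Office365-Scripts\Licencing\LicenceManagement.txt'
$StudentSku = 'mytenant:STANDARDWOFFPACK_STUDENT'
$BadSku     = 'mytenant:STANDARDWOFFPACK_IW_STUDENT'

$badlyLicensed = Get-MsolUser -All | Where-Object { $_.Licenses.AccountSkuId -contains $BadSku }
foreach ($user in $badlyLicensed) {
    # Swap the unwanted Information Worker licence for the proper student licence
    Set-MsolUserLicense -UserPrincipalName $user.UserPrincipalName `
        -AddLicenses $StudentSku -RemoveLicenses $BadSku
    Add-Content $LogFile "$(Get-Date -Format s) Replaced $BadSku with $StudentSku for $($user.UserPrincipalName)"

    # Keep the corresponding AD group in sync (group name is an assumption)
    Add-ADGroupMember -Identity 'O365-Students' -Members ($user.UserPrincipalName -split '@')[0]
}
```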
Error emails are based on the content of the $error variable, using a function to generate the mail body.
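The mail-body function isn’t shown in this extract; a sketch of the idea, with hypothetical addresses and SMTP server, might look like this:

```powershell
# Sketch of the error-notification idea; addresses and SMTP server are assumptions.
function Get-ErrorMailBody {
    # Flatten the contents of the automatic $error variable into a readable body
    $lines = $error | ForEach-Object { $_.ToString() }
    return "The licence management script reported errors:`n`n" + ($lines -join "`n")
}

if ($error.Count -gt 0) {
    Send-MailMessage -To 'admins@mydomain.com' -From 'o365-scripts@mydomain.com' `
        -Subject 'Licence management errors' -Body (Get-ErrorMailBody) `
        -SmtpServer 'smtp.mydomain.com'
}
```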
The initial service certificate used for my ADFS service was relatively simple, containing only one UPN suffix.
Recently, the security team stated that they wished to start using Intune, via SCCM. Thus, I needed to enable the Workplace Join functionality on the ADFS server farm.
In order to do so, I obtained a new certificate with the additional SAN enterpriseregistration.<MyDomain>, and then needed to replace the service certificate on the ADFS servers in the farm. This is the process I followed:
1. Install the new certificate on all ADFS servers in the farm
2. Allow the ADFS service account read permissions on the private key
3. On the Primary ADFS server, set the new service certificate
4. Obtain the certificate thumbprint from the new certificate, e.g.: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
5. On all ADFS proxy servers – Set-WebApplicationProxySslCertificate -Thumbprint xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
6. On the Primary ADFS server – Set-AdfsSslCertificate -Thumbprint xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
7. Restart the ADFS service on all servers
8. Test, by doing a federated login via portal.microsoftonline.com
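For reference, the PowerShell side of steps 4–7 can be sketched as below; the thumbprint is a placeholder, and the subject filter is an assumption to be adjusted for your certificate:

```powershell
# Step 4: find the thumbprint of the newly installed certificate
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like '*adfs*' } |   # adjust the filter to your subject name
    Select-Object Subject, Thumbprint, NotAfter

# Step 5: on each ADFS proxy server
Set-WebApplicationProxySslCertificate -Thumbprint 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

# Step 6: on the Primary ADFS server
Set-AdfsSslCertificate -Thumbprint 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

# Step 7: on all servers in the farm
Restart-Service adfssrv
```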
When I initially started delving into the management of my Office 365 tenancies, I was looking for a secure way of providing a username and password, with which to authenticate (connect) to MSOL.
I didn’t particularly like the idea of putting plain text usernames and passwords into my scripts. However, shortly after getting my ADFS servers up and running, I applied the Microsoft Office 365 Federation Metadata Update Automation Installation Tool. If you look at how this works, you see that it uses Credential Manager to store the username and password. It extracts these when needed to connect to MSOL. This seemed like a nice clean solution, so I “borrowed” the methodology for most of my O365 scripts.
Take note that the Windows Credential Manager is profile specific, so if you need a service account to run some of your scheduled tasks or the like, then you need to create the credential under that profile. If you ever want to change the password of the account stored in Credential Manager, you can change it via the GUI, but remember that you need to do it for every profile that might use it.
So, first we need to create the credential that we want to use in our scripts:
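The original post embeds a script for this, built on a CredMan-style wrapper around the Credential Manager API. As a minimal sketch, assuming instead the CredentialManager module from the PowerShell Gallery (the account name and password are placeholders):

```powershell
# Sketch using the CredentialManager module (Install-Module CredentialManager);
# the original post used a CredMan-style API wrapper instead.
# The account name and password here are placeholders.
$TargetName = 'LicenceManagment'   # the label used by the later scripts

New-StoredCredential -Target $TargetName `
    -UserName 'svc-licencing@mytenant.onmicrosoft.com' `
    -Password 'P@ssw0rd!' `
    -Persist LocalMachine | Out-Null
```

Remember to run this under the profile of whichever account will execute the scheduled tasks, per the note above.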
Note the $TargetName = “LicenceManagment” line – this is the label of the “Credential”, and it is what is used in the subsequent scripts to get the credential.
OK, so now we have the credential stored, let’s test it – I often end up using this generic connection script when I want to have a quick look at something in MSOL:
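A sketch of such a generic connection script, again assuming the CredentialManager module in place of the original wrapper:

```powershell
# Retrieve the stored credential and connect to MSOL.
Import-Module MSOnline
Import-Module CredentialManager

$cred = Get-StoredCredential -Target 'LicenceManagment'
Connect-MsolService -Credential $cred

# Quick sanity check that the connection works
Get-MsolCompanyInformation | Select-Object DisplayName
```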
Thus, this generic connection script is the basis for most of my management scripts – the rest is just bolted onto the bottom.
After all of the previous work defining the pieces to get the mail notifications working, I noted that I was still getting mails that had blanks for the user’s details – where the attributes were being referenced like [//Target/AccountName] in the email template.
I understood the “why” – the accounts had been deleted, thus they were a transition out, but the object going through the workflow no longer had any attributes! Now to find a solution…
I thought that a custom workflow would be the most elegant solution and would give me a good opportunity to learn; although it might have been a little overkill for the issue, it would be a simple workflow. Oh, how wrong I was 🙂
So, after a fair few failed attempts, with mysterious compilation errors, I think I finally have it: a workflow that takes the current request and looks for the AccountName attribute – it even builds successfully (wow)! So let’s see if just that little bit works, in debugging mode… Oh, so I also need an AIC (Activity Information Configuration) to be created – looking at the documentation, I try this and that, getting nowhere and frustrated…
By now I have really had enough – it’s not really like me to give up, but for now I’m through with this black magic! I’ll come back to dig into this again later, just not now… I need a workaround for the time being, and I know that the PowerShell Workflow will answer it – everything is possible with PowerShell!
Over the weekend I have been mulling over what I need to do using the PowerShell Workflow (http://fimpowershellwf.codeplex.com/). Looking at the Logging example script, I see that this should be easy. So, I start playing… OK, so I have my target requestor – how do I get the attribute values from it? The documentation is very good in terms of getting going, but does not offer much in the way of examples – e.g. getting the data from the PSObject. Using Get-Member, I can see that all of the attributes I want are present, but when I refer to them in the script by $Target.AccountName, the returned value appears to be $NULL when run in the PS workflow. However, I can see that this is not the case in a standalone PS window.
Eventually, I find a way of getting the data out – e.g. $Target.Get($_).AccountName. I’m pondering why I need to jump through these hoops to get the data, but am happy that I got there in the end. I published my initial “solution” to my original question on TechNet, and Brian came back suggesting that I don’t need to use this convoluted way of getting the data – $Target.AccountName should just work!
After a few exchanges, it emerged that by not including the -OnlyBaseResources switch on the Export-FimConfig cmdlet, I was getting referentially linked objects as well as the object I had requested. By adding this switch, I can now get the data cleanly via $Target.AccountName.
To make use of this process, I now have to modify the transition-out process: delete the transition-out email template, then change the workflow so that it uses the PowerShell workflow – with the path to a PS script for each type of OU move. The HTML files used previously for the move operation also need to be modified so they can make use of the data/attributes being assigned to variables in the script.
So, here is the script that I put together. I need a copy for each move type, plus an amended HTML file for each move – no big deal, just some simple search-and-replace.
Script to send email notification only if the AccountName exists:
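The script itself is not preserved in this extract; the sketch below shows the shape of it. The service URI, template path, mail parameters, placeholder token and XPath filter are all assumptions, and the attribute extraction follows the Export-FIMConfig object model rather than the exact code from the original:

```powershell
# Sketch: send the transition-out notification only when the target still has
# an AccountName. URI, paths, addresses and XPath are illustrative assumptions.
Add-PSSnapin FIMAutomation -ErrorAction SilentlyContinue

# $TargetID is assumed to be handed to the script by the PowerShell Workflow activity
$export = Export-FIMConfig -Uri 'http://localhost:5725/ResourceManagementService' `
    -OnlyBaseResources `
    -CustomConfig ("/Person[ObjectID='{0}']" -f $TargetID)

# Pull the AccountName attribute out of the exported resource
$AccountName = ($export.ResourceManagementObject.ResourceManagementAttributes |
    Where-Object { $_.AttributeName -eq 'AccountName' }).Value

if ($AccountName) {
    # Assumes the HTML template has been amended to use an [AccountName]
    # placeholder, per the modification described above
    $body = (Get-Content 'C:\FIM\Templates\TransitionOut.html' -Raw) -replace '\[AccountName\]', $AccountName
    Send-MailMessage -To 'servicedesk@mydomain.com' -From 'fim@mydomain.com' `
        -Subject "Transition out: $AccountName" -Body $body -BodyAsHtml `
        -SmtpServer 'smtp.mydomain.com'
}
# If AccountName is empty, the account has already been deleted and no mail is sent.
```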
This is another of my old, but nice, scripts. There was some discussion at the time about how different members of the Exchange team, who work at different geographical sites, were dealing with mailbox delegation. There was no consistency.
Note that this script runs repeatedly (up to 60 times – 30 minutes) until it sees that the calendar permissions are set correctly. This was introduced to allow permissions to be set when our Exchange infrastructure had been failed over to our other main site – a rare occurrence. The replication interval between those sites is set to 15 minutes, hence the script should have completed successfully within 30 minutes, or something more fundamental has gone wrong!
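A sketch of the retry pattern (the identities and the display-name match are placeholders; 60 attempts at a 30-second interval gives the 30-minute window described above):

```powershell
# Sketch of the retry loop: apply the calendar permission, then keep checking
# until it is visible, up to 60 attempts. Identities are placeholders.
$Mailbox      = 'alice@mydomain.com'
$Delegate     = 'bob@mydomain.com'
$DelegateName = 'Bob Smith'   # display name as it appears in the permission entry (assumed)
$MaxTries     = 60

for ($i = 1; $i -le $MaxTries; $i++) {
    # Apply the permission; ignore the error if it already exists
    Add-MailboxFolderPermission -Identity "$($Mailbox):\Calendar" -User $Delegate `
        -AccessRights Editor -ErrorAction SilentlyContinue

    # Check whether the permission is now in place (i.e. has replicated)
    $perm = Get-MailboxFolderPermission -Identity "$($Mailbox):\Calendar" |
        Where-Object { $_.User.ToString() -eq $DelegateName }
    if ($perm) {
        Write-Output "Permission confirmed after $i attempt(s)."
        break
    }
    Start-Sleep -Seconds 30   # 60 x 30s = the 30-minute window
}
if (-not $perm) { Write-Error 'Calendar permission was not confirmed within 30 minutes.' }
```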
While this script did the job, it took a rather long time to process – it was an overnight job. If it had failed overnight, I had to wait another day before doing the comparison against production auto groups.
In terms of performance, the Lithnet PowerShell module is much faster. My standard script took ~7 hours to run, while the Lithnet script only took 3 hours!
I can see that I’m going to get a lot more use out of the module in terms of making changes as well as getting information from the portal. Thanks Ryan!
So, as per the comments, Ryan’s suggestion really improves the performance of the script! Here is the new script for reference:
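The rewritten script isn’t preserved in this extract. As a rough sketch of the Lithnet approach, querying the FIM/MIM service with Search-Resources in place of Export-FIMConfig (the service address, XPath filter and attribute names are assumptions):

```powershell
# Sketch of the Lithnet Resource Management module approach; the XPath and
# attribute names are illustrative assumptions.
Import-Module LithnetRMA

# Point the client at the FIM/MIM service (default is localhost:5725)
Set-ResourceManagementClient -BaseAddress 'http://localhost:5725'

# Retrieve only the attributes needed, which is a large part of the speed-up
$people = Search-Resources -XPath '/Person' -AttributesToGet AccountName, DisplayName

foreach ($person in $people) {
    # ...comparison against the production auto groups goes here...
    Write-Output "$($person.AccountName) - $($person.DisplayName)"
}
```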