So, we have now come full circle! There is a need to remove the proxy settings from one of the VPN connections described in the last post.
For the machines that are supported within my group, the settings should be consistent, but there is the possibility that extra settings have been added by the user. So, I did a little discovery first:
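A sketch of the sort of thing I mean – dumping the per-connection value names that live under the Internet Settings\Connections key for each user (the output share path is a placeholder, and the collection would run in the user's context, e.g. from a login script):

# dump the names of the per-connection proxy values for the current user
$key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections'

(Get-Item -Path $key).Property |
    ForEach-Object { [pscustomobject]@{ Computer = $env:COMPUTERNAME; User = $env:USERNAME; Connection = $_ } } |
    Export-Csv -Path "\\server\share\ProxyDiscovery\$($env:COMPUTERNAME)-$($env:USERNAME).csv" -NoTypeInformation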
The resultant file shows mostly what I’m expecting to see, with a few other things like Vodafone dongles, user-defined connections and so on.
Removal can be carried out easily by trimming the setting script (from the last post) so that it no longer publishes the connection in question, then using a GPP Registry item to remove the registry entry holding that VPN connection’s proxy settings.
For those other groups within the organisation that do not have such consistency, the results of the script can be analysed in Excel to see what is “out there”. Then another script (not yet written, because I don’t think I’ll need it) could be used to find the binary values that match the ones to be removed and then remove them.
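If that script were ever needed, a minimal sketch might look like the following – it assumes a reference blob has been captured from a connection known to carry the unwanted proxy settings (the connection name here is hypothetical):

$key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections'

# reference blob for the proxy settings that are to be removed (hypothetical connection name)
$reference = (Get-ItemProperty -Path $key).'Old VPN Connection' -join ','

foreach ($name in (Get-Item -Path $key).Property) {
    # compare the binary data of each value against the reference blob
    if (((Get-ItemProperty -Path $key).$name -join ',') -eq $reference) {
        Remove-ItemProperty -Path $key -Name $name -WhatIf   # drop -WhatIf to actually remove the value
    }
}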
A number of years ago, I rationalised the way that IE proxy settings were delivered to supported systems. Previously, this had been done via installation scripts, which were not enforced after application, so were prone to subsequent error.
To achieve consistency and ease of deployment, Group Policy was used. Back then there were Internet Explorer Maintenance GPO settings, which allowed the LAN Proxy setting to be deployed easily. However, I also wanted to make the delivery of VPN settings consistent. This was achieved by using Group Policy to deliver a VPN address book (rasphone.pbk) to the clients.
In order to get the correct proxy settings assigned to those VPN connections, I used a little scripting.
First, I found out on my test machine – where I had already set up the VPN connections and applied the correct proxy settings – what the text in the IE proxy GUI translated to in the registry:
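Something along these lines shows the binary blob that the GUI settings turn into – the connection name is a placeholder for one of my VPN connections:

# show, as hex, the binary value stored for a given connection
$key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections'
$bytes = (Get-ItemProperty -Path $key).'My VPN Connection'
($bytes | ForEach-Object { $_.ToString('X2') }) -join ' '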
Then use the results of that to create a new VBS script to be pushed out as a user login script:
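The deployed login script was VBS; for illustration only, the equivalent registry write sketched in PowerShell looks like this (the connection name and byte values are placeholders, not the real settings):

$key = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\Connections'

# truncated placeholder byte array - the real values come from the discovery step above
$bytes = [byte[]](0x46,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x03,0x00,0x00,0x00)

Set-ItemProperty -Path $key -Name 'My VPN Connection' -Value $bytes -Type Binary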
Part of the process of getting the script in the previous post working was having a record of who is licensed *now*.
This was done using the following script. Note that the commented lines must be un-commented to log the data. I am currently using this script to keep the AD groups up to date, hence the Add-ADGroupMember lines are uncommented. AD groups will soon be replacing the text files.
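A minimal sketch of the idea is below – the SKU names and AD group names are assumptions, and it expects an MSOL connection to already be open:

Import-Module MSOnline
Import-Module ActiveDirectory
# assumes Connect-MsolService has already been run

# assumed SKU and group names - replace with your own tenant prefix and groups
$staffSku   = 'tenant:STANDARDWOFFPACK_FACULTY'
$studentSku = 'tenant:STANDARDWOFFPACK_STUDENT'

foreach ($user in (Get-MsolUser -All | Where-Object { $_.IsLicensed })) {
    $skus = $user.Licenses | ForEach-Object { $_.AccountSkuId }

    # un-comment the next line to log who is licensed right now
    # $user.UserPrincipalName | Out-File 'C:\Office365-Scripts\Licencing\LicensedUsers.txt' -Append

    $adUser = Get-ADUser -Filter "UserPrincipalName -eq '$($user.UserPrincipalName)'"
    if ($adUser -and $skus -contains $staffSku)   { Add-ADGroupMember -Identity 'O365-Staff'    -Members $adUser }
    if ($adUser -and $skus -contains $studentSku) { Add-ADGroupMember -Identity 'O365-Students' -Members $adUser }
}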
Since introducing O365 to my organisation, there has been a steady flow of people wanting to use services within it. The licences for these have been applied on an ad hoc basis by several global admins.
Now that we are in a position to make these services more official/ production, I needed to license those users who were not yet licensed, plus sort out the improperly licensed people.
It seems that either one of the global admins assigning licences, or perhaps self-registered users, ended up with the Information Worker licences – e.g. STANDARDWOFFPACK_IW_STUDENT. This was not desired, so my script had to sort those out as well!
Additionally, people whose status changed from being staff to anything else needed to have any staff licence revoked and replaced with a student licence – and vice versa.
I also wanted to control who should be given the individual licences, like Office Pro Plus and Dynamics. For the moment, those entitled to these licences are held in text files – I’m working on migrating to AD groups instead, but for now I just needed something that works!
The script logs added and removed licences to C:\Office365-Scripts\Licencing\LicenceManagement.txt
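To give a flavour of the licence fix-up, a cut-down sketch for swapping the unwanted Information Worker licence is below – the tenant prefix is an assumption, and it expects an MSOL connection to be open and UsageLocation to be set already:

$log        = 'C:\Office365-Scripts\Licencing\LicenceManagement.txt'
$iwSku      = 'tenant:STANDARDWOFFPACK_IW_STUDENT'   # unwanted Information Worker SKU
$studentSku = 'tenant:STANDARDWOFFPACK_STUDENT'      # the SKU that should be assigned

foreach ($user in (Get-MsolUser -All | Where-Object { $_.Licenses.AccountSkuId -contains $iwSku })) {
    # swap the IW licence for the proper student licence in a single call
    Set-MsolUserLicense -UserPrincipalName $user.UserPrincipalName -AddLicenses $studentSku -RemoveLicenses $iwSku
    Add-Content -Path $log -Value "$(Get-Date -Format s) Swapped $iwSku for $studentSku on $($user.UserPrincipalName)"
}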
Error emails are based on the content of the $error variable, using a function to generate the mail body.
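Something like the following is the general shape – the function name, addresses and SMTP server here are placeholders, not the production values:

# build a mail body from whatever has accumulated in $error
function Get-ErrorMailBody {
    param($ErrorRecords)
    ($ErrorRecords | ForEach-Object { $_.ToString() }) -join "`r`n"
}

if ($error.Count -gt 0) {
    Send-MailMessage -To 'o365-admins@example.com' -From 'licence-script@example.com' `
                     -Subject 'Licence management script errors' `
                     -Body (Get-ErrorMailBody $error) `
                     -SmtpServer 'smtp.example.com'
}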
The initial service certificate used for my ADFS service was relatively simple – containing only one UPN suffix.
Recently, the security team stated that they wished to start using InTune, via SCCM. Thus, I needed to enable the Workplace Join functionality on the ADFS server farm.
In order to do so, I obtained a new certificate with the additional SAN enterpriseregistration.<MyDomain>; I then needed to replace the service certificate on the ADFS servers in the farm. This process was followed:
1. Install the new certificate on all ADFS servers in the farm
2. Allow the ADFS service account read permissions on the private key
3. On the Primary ADFS server, set the new service certificate
4. Obtain the certificate thumbprint from the new certificate, e.g.: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
5. On all ADFS proxy servers – Set-WebApplicationProxySslCertificate -Thumbprint xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
6. On the Primary ADFS server – Set-AdfsSslCertificate -Thumbprint xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
7. Restart the ADFS service on all servers
8. Test, by doing a federated login via portal.microsoftonline.com
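For reference, steps 3 and 5–7 can be scripted roughly as below (a sketch – the thumbprint placeholder mirrors the elided value above, and step 3 could equally be done through the ADFS console):

# thumbprint of the new certificate (value elided, as above)
$tp = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

# step 3 - on the Primary ADFS server, set the new service communications certificate
Set-AdfsCertificate -CertificateType Service-Communications -Thumbprint $tp

# step 5 - on all ADFS proxy servers
Set-WebApplicationProxySslCertificate -Thumbprint $tp

# step 6 - on the Primary ADFS server
Set-AdfsSslCertificate -Thumbprint $tp

# step 7 - restart the federation service on each server
Restart-Service adfssrv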
When I initially started delving into the management of my Office 365 tenancies, I was looking for a secure way of providing a username and password, with which to authenticate (connect) to MSOL.
I didn’t particularly like the idea of putting plain text usernames and passwords into my scripts. However, shortly after getting my ADFS servers up and running, I applied the Microsoft Office 365 Federation Metadata Update Automation Installation Tool. If you look at how this works, you see that it uses Credential Manager to store the username and password. It extracts these when needed to connect to MSOL. This seemed like a nice clean solution, so I “borrowed” the methodology for most of my O365 scripts.
Take note that the Windows Credential Manager is profile specific, so if you need a service account to run some of your scheduled tasks or the like, then you need to create the credential under that profile. If you ever want to change the password of the account stored in Credential Manager, you can change it via the GUI, but remember that you need to do it for every profile that might use it.
So, first we need to create the credential that we want to use in our scripts:
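A minimal sketch of creating the stored credential is below. It assumes the community CredentialManager module from the PowerShell Gallery – any Credential Manager wrapper would do just as well:

# assumes the community CredentialManager module has been installed
Import-Module CredentialManager

$TargetName = 'LicenceManagment'   # the label used to look the credential up later

# prompt once for the O365 admin account, then store it in Windows Credential Manager
$cred = Get-Credential -Message 'Office 365 admin account'
New-StoredCredential -Target $TargetName `
                     -UserName $cred.UserName `
                     -Password $cred.GetNetworkCredential().Password `
                     -Persist LocalMachine | Out-Null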
Note the $TargetName = “LicenceManagment” line – this is the label of the “Credential”; it is what the subsequent scripts use to retrieve the credential.
OK, so now we have the credential stored, let’s test it – I often end up using this generic connection script when I want to have a quick look at something in MSOL:
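A sketch of that generic connection script, again assuming the CredentialManager module used above:

Import-Module MSOnline
Import-Module CredentialManager

# pull the stored credential back out by its label and connect to MSOL
$cred = Get-StoredCredential -Target 'LicenceManagment'
Connect-MsolService -Credential $cred

# quick sanity check that the connection works
Get-MsolCompanyInformation | Select-Object -ExpandProperty DisplayName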
Thus, this generic connection script is the basis for most of my management scripts; the rest is just bolted onto the bottom.
The HR data source that I currently receive person data from has historically had data quality issues. Things are much better than they were in the past, but still cause a few problems.
When I attended FIM training at OCG, I raised the issue of data cleanliness and was told in simple terms – make sure the input data is clean! If only life were so simple…
Back to reality, I have had to add code to my Advanced Flows to deal with, clean up and validate the input data.
A nice example follows (see the sketch after this list) – importing Surname from HR – dealing with:
Just plain bad data (null as a string/ value)
Validation (characters that should not be present – via regex replace)
Clean-up (removing spaces from around hyphens in double-barrelled names) – there is also a bit of trimming to remove any spaces before or after the string value
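The real logic lives in the rules extension, but the clean-up steps can be illustrated with a quick PowerShell sketch (the allowed character set here is an assumption):

function Clean-Surname {
    param([string]$Surname)

    # plain bad data: nothing at all, or the literal string "null"
    if ([string]::IsNullOrWhiteSpace($Surname) -or $Surname -eq 'null') { return $null }

    # validation: strip characters that should not be present
    $Surname = $Surname -replace "[^a-zA-Z '\-]", ''

    # clean-up: remove spaces from around hyphens in double-barrelled names, then trim
    $Surname = $Surname -replace '\s*-\s*', '-'
    return $Surname.Trim()
}

Clean-Surname ' Smith - Jones '   # returns 'Smith-Jones'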
Things like this remind me of why “Codeless Provisioning” was something I fought to get working (for too long), but ultimately had to abandon in favour of using code for almost everything. Doing so has been a real panacea for all of the rules and other funnies that I have had to accommodate.
Note: I made a little edit – I was not checking for the presence of AccountName before raising errors – should that attribute have been missing (highly unlikely, but not unknown to occur), that would have raised an error in itself. The edited code is a little more robust!
A few days ago, I made a mistake while messing with my test set. I was getting bored of making myself drop in and out of a set (using the GUI) by changing my AccountName between my real account name and my AccountName plus another character. I was doing this to test my PowerShell workflow.
I was looking to use the Lithnet PowerShell cmdlet to add and remove me as a manual member. While trying to make this work, I removed my AccountName from the criteria-based membership – thus the mistake was made!
I saw rather a lot of PowerShell workflows kicking off. Initially, I didn’t understand what was happening, until I looked at the requests and saw that each request was for a different person. By removing my AccountName from the criteria-based membership, I had opened up the membership of my Test Set to “All Users”, so all users fell into the set and each kicked off a PS workflow. Bugger!
I found this: http://www.integrationtrench.com/2013/09/use-powershell-to-cancel-bunch-of-fim.html, which I’ll leave here for future reference. In the end, though, I decided to just let the workflows progress – it was late in the day anyway, and they would be done by the morning. A quick test of the commands in the referenced post indicated that it was going to take an age to gather all of those requests that were in a Post Processing state.
Of course, by the time it had got them all, many might have been processed anyway…
Lesson learnt: change the scope of a set carefully!
Now that my notifications are ready to go and there are a few real recipients (testers), I was considering what would happen when the FIMMA jobs were run – lots of emails!
The FIMMA had not been run properly for a few days – this is still a system being built – so there were a lot of attribute changes to export. Also, I’d had to put an MA back in temporarily to fix up some bad end dates – I needed to fix precedence so that this MA was now lower – thus quite a few end dates had changed in the intervening period. That MA will be removed again shortly…
So, I needed to quickly disable all of the MPRs that would trigger sending the emails. Again, the Lithnet RMA was a great and quick solution!
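The gist of it, using the Lithnet RMA module, is sketched below – the service address and the MPR naming convention are assumptions:

Import-Module LithnetRMA
Set-ResourceManagementClient -BaseAddress 'http://fimservice:5725'   # placeholder service address

# find the notification MPRs by a (hypothetical) naming convention
$mprs = Search-Resources -XPath "/ManagementPolicyRule[starts-with(DisplayName,'Notify')]" -AttributesToGet DisplayName,Disabled

foreach ($mpr in $mprs) {
    $mpr.Disabled = $true     # disable while the FIMMA catches up
    # $mpr.Disabled = $false  # re-enable once the FIMMA has run through
    Save-Resource $mpr
    Write-Host "$($mpr.DisplayName) Disabled = $($mpr.Disabled)"
}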
Comment out/ uncomment the $true/ $false lines as appropriate. I’ll need to re-enable all of these again once the FIMMA has run through.