Gist


Backup and Clear Domain Controller Security Event Logs

A post related to https://blog.oholics.net/logparser-loves-security-logs/, for Case 3.

If you don’t manage security logs by regularly backing them up and clearing them, you risk losing important historical information. Additionally, running a LogParser query against a large, unmanaged security event log takes a long time.

The script below is designed to be run daily, at the end of the day, to back up the security event log on a Domain Controller and then clear its contents. Additionally, the logs are archived off to two Windows shares to allow for long-term storage.

The script makes use of Jaap Brasser's DeleteOld script (https://gallery.technet.microsoft.com/scriptcenter/Delete-files-older-than-x-13b29c09) to carry out tidy-up operations on the local staging folder. In practice, I used the same script to manage the archive folders too, keeping 365 days' worth of logs.
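The original gist isn't reproduced in this extract, but the core backup-and-clear step is small. A minimal sketch, assuming a C:\EventLogBackup staging folder on the DC and an account with backup rights (the archive-copy and DeleteOld steps are omitted, and the parameter names are illustrative):

  param(
      [Parameter(Mandatory = $true)]
      [string]$DomainController,

      [bool]$Clear = $false
  )

  # Timestamped backup file; this path is local to the target DC
  $stamp = Get-Date -Format 'yyyyMMdd-HHmmss'
  $backupPath = "C:\EventLogBackup\Security-$stamp.evt"

  # Win32_NTEventLogFile exposes BackupEventLog/ClearEventLog over WMI;
  # -EnableAllPrivileges is needed to assert the backup privilege
  $log = Get-WmiObject -Class Win32_NTEventLogFile -ComputerName $DomainController `
      -Filter "LogFileName = 'Security'" -EnableAllPrivileges

  # Only clear the log if the backup reported success (ReturnValue 0)
  $result = $log.BackupEventLog($backupPath)
  if ($result.ReturnValue -eq 0 -and $Clear) {
      $log.ClearEventLog() | Out-Null
  }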

Usage: .\BACKUP_AND_CLEAR_EVENTLOGS.ps1 <DomainController> $clear

Make sure that the security event log maximum size is increased high enough that none of the day's events are overwritten before the backup runs. Judging that size will depend on the number of events per day; alternatively, just set the log to "do not overwrite events".
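For reference, the size and retention can also be set remotely with the built-in Limit-EventLog cmdlet; the size and DC name below are purely illustrative:

  # Either raise the maximum size, or switch to "do not overwrite events"
  Limit-EventLog -LogName Security -ComputerName DC01 -MaximumSize 1GB
  # Limit-EventLog -LogName Security -ComputerName DC01 -OverflowAction DoNotOverwrite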

Note: the event IDs are purely made up 😉

Enumerate Azure Role Assignments

The following script can be used to enumerate role assignments for a subscription and role assignments for Resource Groups within that subscription.

Use as-is to just grab everything – note that two subscriptions are used in the example – fix the subscription GUIDs on lines 6 & 7.

Optionally, uncomment the references to -SignInName "Jon@oholics.onmicrosoft.com" to obtain a report showing only those resources that refer to the named user.
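The original gist isn't shown in this extract; a rough sketch of the enumeration, with placeholder subscription GUIDs and an illustrative output path, might look like:

  $subscriptions = @(
      '00000000-0000-0000-0000-000000000000',   # placeholder GUIDs: fix these,
      '11111111-1111-1111-1111-111111111111'    # as per lines 6 & 7 of the original
  )
  $report = 'C:\TEMP\RoleAssignments.csv'

  foreach ($sub in $subscriptions) {
      Select-AzureRmSubscription -SubscriptionId $sub | Out-Null

      # Subscription-level assignments; add -SignInName 'Jon@oholics.onmicrosoft.com'
      # to report only on the named user
      Get-AzureRmRoleAssignment |
          Export-Csv -Path $report -NoTypeInformation -Append

      # Resource-group-level assignments
      foreach ($rg in Get-AzureRmResourceGroup) {
          Get-AzureRmRoleAssignment -ResourceGroupName $rg.ResourceGroupName |
              Export-Csv -Path $report -NoTypeInformation -Append
      }
  }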

The resulting report can be opened in Excel, to produce a nice table 😉

LogParser Loves Security Logs

Just digging something up that I used to use regularly to look for logon events related to a certain username (samAccountName). Thought I’d regurgitate them here for “the next time..”

Three different SQL queries for three different use cases:

Case 1. I know that the logon event I'm looking for occurred on DC01.oholics.net, so I'm going to interrogate the live DC log. The primary username I'm looking for is "jon"; a secondary name is shown as "dave". This could be replaced by a junk string if I'm only really looking for "jon", or just trim the query (up to you..).

Case 2. In my domain there are three domain controllers and I'm not sure where the logon events happened, so as in Case 1 I search the live DC logs, but this time across all three DCs' logs.

Case 3. I have three months of backed-up logs to search through (in C:\TEMP\Logs) for all logon events for samAccountName "jon" (and optionally "dave", as above); a sketch of this query's shape is shown below. I may splurge out the script that I used to back up and clear the event logs next; that could be useful again – I've got to clean it first.
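The original query gists aren't reproduced in this extract. As a hedged sketch, the Case 3 query might take the following shape – the logon event ID (4624) and the output columns are illustrative, not necessarily what the original gist used:

  SELECT TimeGenerated, EventID, ComputerName, SID, Strings
  INTO C:\TEMP\Output\output.csv
  FROM 'C:\TEMP\Logs\*.evt'
  WHERE EventID = 4624
  AND (Strings LIKE '%jon%' OR Strings LIKE '%dave%')

For Case 1, the FROM clause would instead point at the live log on one DC (FROM \\DC01.oholics.net\Security); for Case 2, list all three DCs' Security logs in the FROM clause, separated by commas.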

Usage: logparser -i:EVT file:<SQLFileName>.sql -o:CSV -resolveSIDs:ON 

Where:

  • The above SQL query is saved as <SQLFileName>.sql in the same location as the LogParser binary.
  • The collection of logs to be searched is in C:\TEMP\Logs\
  • The output file will be written to C:\TEMP\Output\output.csv

Redacting sensitive content from Windows event logs using LogParser

Consider the scenario: opening a ticket within Azure for an issue with an infrastructure component or a security event. IP addresses, domain names and machine names are classed as sensitive and should not be revealed to MS support staff.

You have a folder filled with event logs from the problem machine(s). You need to redact the above mentioned properties.

By using LogParser with the following SQL statement, a CSV file is exported which strips out the sensitive properties, replacing parts of them with X's:

  • 'OHOLICS' is replaced by 'XXXXXXX' where it is found in an event log, in the Strings, ComputerName, Message or Data fields
  • The first two octets of an IP address are stripped, where these are '192.168.' in the Strings, Message or Data fields
  • 'blog.oholics.net' is replaced by 'blog.XXXXXXX.net' where it is found in an event log, in the Strings, ComputerName, Message or Data fields
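The original SQL gist isn't reproduced here. A sketch matching the description above (with the field list trimmed for brevity) might look like the following; note the deliberate lack of column aliases, which is what produces the ugly header row described below:

  SELECT TimeGenerated, EventID, SourceName,
      REPLACE_STR(REPLACE_STR(ComputerName, 'OHOLICS', 'XXXXXXX'), 'blog.oholics.net', 'blog.XXXXXXX.net'),
      REPLACE_STR(REPLACE_STR(REPLACE_STR(Strings, 'OHOLICS', 'XXXXXXX'), '192.168.', 'XXX.XXX.'), 'blog.oholics.net', 'blog.XXXXXXX.net'),
      REPLACE_STR(REPLACE_STR(REPLACE_STR(Message, 'OHOLICS', 'XXXXXXX'), '192.168.', 'XXX.XXX.'), 'blog.oholics.net', 'blog.XXXXXXX.net'),
      REPLACE_STR(REPLACE_STR(REPLACE_STR(Data, 'OHOLICS', 'XXXXXXX'), '192.168.', 'XXX.XXX.'), 'blog.oholics.net', 'blog.XXXXXXX.net')
  INTO C:\TEMP\Output\OUTPUT.CSV
  FROM 'C:\TEMP\Logs\*.evt'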

Note that after the output file is created, the header row will need to be updated to remove the replace statements: where normally just the item name would be used as a header, the full replace expression appears as the header for those items.

Usage: logparser -i:EVT file:LogParserRedaction.sql -o:CSV -resolveSIDs:ON 

Where:

  • The above SQL query is saved as LogParserRedaction.sql in the same location as the LogParser binary.
  • The collection of logs to be redacted are in C:\TEMP\Logs\
  • The output file will be written to C:\TEMP\Output\OUTPUT.CSV

Azure Service Principal Authentication

I have recently been working with a client where all Azure/Office 365 users must perform MFA at logon.

Ages ago I posted about using credential manager to automate Office 365 scripts: https://blog.oholics.net/using-credential-manager-to-authenticate-office-365-scripts/. This method will clearly not suffice where MFA is enforced, as there is no mechanism to allow MFA challenge and response.

Recently I have been looking into using Azure Service Principal objects to bypass MFA and allow scripts that need to connect to Azure or other services to do so without input. I can then schedule scripted tasks to generate reports on Azure AD objects or AzureRM items.

Firstly, I need to create some certificates; these will be used to authenticate. See https://blog.oholics.net/creating-simple-ssl-certificates-for-server-authentication-using-openssl/ for details on certificate creation.

Next, once we have the PFX certificate file, we can create the Azure App Registration using PowerShell.
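The original gist isn't shown here; a minimal sketch using the AzureRM cmdlets, where the display name, identifier URI and PFX path are all illustrative, might be:

  $pfxPath = 'C:\Certs\ScriptAuth.pfx'
  $pfxPassword = Read-Host -Prompt 'PFX password' -AsSecureString

  # Extract the public key from the PFX to use as the application credential
  $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($pfxPath, $pfxPassword)
  $keyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())

  $app = New-AzureRmADApplication -DisplayName 'ScriptAuth' `
      -IdentifierUris 'https://oholics.net/ScriptAuth' `
      -CertValue $keyValue -EndDate $cert.NotAfter

  # The service principal is the identity that actually signs in
  New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId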

Then (optionally), if the script that you want to automate will be reading AzureRM objects, run the following script. Note that if the role assignment is to be constrained to a specific resource group, add the -ResourceGroupName switch to New-AzureRMRoleAssignment.

Additionally, the RoleDefinitionName can be altered to suit.
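That script isn't reproduced in this extract; at its core it is a role assignment along these lines, reusing the $app object from the sketch above (the role name is illustrative):

  # Grant the service principal read access over the subscription; add
  # -ResourceGroupName <name> to constrain the scope, and change
  # -RoleDefinitionName to suit
  New-AzureRmRoleAssignment -RoleDefinitionName 'Reader' `
      -ServicePrincipalName $app.ApplicationId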

Now that we have the Service Principal in place, we can connect! But in the case of the Azure AD connection, I need to first allow the application to read AAD:

Go to the App Registrations blade in Azure AD, pick the application created earlier, then select Settings. Select Required Permissions, add Azure AD and grant the directory read permissions.

Now let's connect using the certificate thumbprint.
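The connection gist isn't shown in this extract; a sketch, with placeholder GUIDs and thumbprint, might be:

  $tenantId = '00000000-0000-0000-0000-000000000000'   # AAD tenant GUID (placeholder)
  $appId    = '11111111-1111-1111-1111-111111111111'   # application GUID (placeholder)
  $thumb    = 'CERTIFICATE-THUMBPRINT'                  # thumbprint of the installed cert

  # Azure AD
  Connect-AzureAD -TenantId $tenantId -ApplicationId $appId -CertificateThumbprint $thumb

  # AzureRM
  Login-AzureRmAccount -ServicePrincipal -TenantId $tenantId `
      -ApplicationId $appId -CertificateThumbprint $thumb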

By installing the certificate in the CurrentUser store, only that user can use the certificate thumbprint for authentication using this method. Lovely.. 🙂

Why is this method secure?

  • You can only access the application to sign in if you have installed the certificate on the machine that you want to run the script from.
  • To install the certificate, you must know the password that was set on the private key during PFX creation.
  • No AAD user object is created
  • No plain text passwords need to be stored

To sign in using this method, you must know:

  1. The AAD Tenant GUID
  2. The Application GUID of the configured application
  3. The specific thumbprint of the certificate used to make the connection

Additionally, the certificate and private key must be installed on the machine from which the connection attempt is being made.

MIM PAM Automated Installation Script

I have been doing a fair bit of work with MIM PAM recently, finding a few issues. This has meant that I have re-installed the application (post-SharePoint Foundation), in my lab, a few times.

I was getting a little bored of clicking through the options, ticking boxes and refilling the URLs, etc. Then I spotted, on the CD/DVD, a batch file in the Service and Portal folder called "Service and Portal_Reference_For_PAM_Install.bat".

A quick look showed that this would automate MIM PAM installation. However, there was no documentation to go with it – notably to clarify which accounts were referred to by "ADMIN_USER = Administrator" and "SYNC_ADMIN = FIMSyncAdministrator". A quick Google revealed no relevant results… so, take a snapshot and start trying accounts. Based on the MSI command run at the end of the script, ADMIN_USER relates to the SERVICE_ACCOUNT_NAME; in my case, that is the MIMService account.

Thus, my complete working script is as below. Note: my PAM domain is a sub-domain of oholics.net called "priv", and my MIM PAM server is called "mimpam".

Note that the following lines will need to be amended – 7, 10, 17, 19, 20, 21, 22, 35, 54, 55, 62, 63, 70, 71.

Also, note the script assumes that you have a folder C:\Temp to write the log to – if you don’t you’ll get EXIT CODE: 1622

Nice bit of automation – run the file, then make coffee or whatever – certainly something fulfilling 🙂

A little update – the script in its current form is not perfect. Note that the MSI switches include MAIL_SERVER="%MACHINENAME%" and SQLSERVER_SERVER="%MACHINENAME%", meaning that both properties will be set to the local machine name. Define some additional variables and change the MSI arguments to suit.

Additionally, I have been testing the RESTful interface over the last few days and have seen some oddities; whether these are related to using this script to install is under investigation…

Delegating Group Management – Using the Lithnet FIM PowerShell Module

Within my AD structure, group management is delegated within certain OUs; I now need to replicate that functionality in the FIM portal.

There is no real way of identifying which groups should be managed by whom, except by the OU within which the group currently resides.

So, to start off with I need to get the parent OU of the group into the portal:

Import the OU into the MV:

Set up an export flow for adOU into the portal.

Then, by using the Lithnet PowerShell Module, we can create all the sets and MPRs required; below is a sample for creating one delegated "collection". In production, my XML file is much bigger, delegating group management to around ten different groups.

Note that you first need to create references to all users who might be given the rights to manage groups. This includes the FimServiceAdmin and FimServiceAccount, referenced by their ObjectID; the others are referenced by their AccountName. All members referenced in this section are added to the __Set:GroupValidationBypassSet. This set is defined in the non-administrators set – not in this set – which bypasses the group validation workflow:

AllNonAdministratorsSet

Create a set of groups to be managed – the filter being the OU that the groups belong to, plus MembershipLocked = False.

Create a set of administrators for this delegation – adding the explicit members

Then create the two MPRs to allow the members of the administrative set to manage those groups – the first MPR allows modification (Read, Add and Remove) of the ExplicitMember attribute, while the second allows creation and deletion.
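The XML sample itself isn't reproduced in this extract. As an alternative sketch, the same objects can be created directly with the Lithnet RMA cmdlets; the set names, OU value, account name and service address below are all illustrative, and the create/delete MPR would follow the same pattern with ActionType set to Create and Delete:

  Import-Module LithnetRMA
  Set-ResourceManagementClient -BaseAddress 'http://mimpam.priv.oholics.net:5725'

  # Set of groups to be managed, filtered on adOU and MembershipLocked
  $dialect = 'http://schemas.microsoft.com/2006/11/XPathFilterDialect'
  $groupSet = New-Resource -ObjectType Set
  $groupSet.DisplayName = '_Set: OU1 Managed Groups'
  $groupSet.Filter = "<Filter xmlns:xsi=""http://www.w3.org/2001/XMLSchema-instance"" xmlns:xsd=""http://www.w3.org/2001/XMLSchema"" Dialect=""$dialect"">/Group[adOU = 'OU=OU1,DC=oholics,DC=net' and MembershipLocked = false]</Filter>"
  Save-Resource $groupSet

  # Set of delegated administrators, with explicit members
  $adminSet = New-Resource -ObjectType Set
  $adminSet.DisplayName = '_Set: OU1 Group Administrators'
  $adminSet.ExplicitMember = (Get-Resource -ObjectType Person -AttributeName AccountName -AttributeValue 'jon').ObjectID
  Save-Resource $adminSet

  # Request MPR granting the admin set Read/Add/Remove on ExplicitMember
  $mpr = New-Resource -ObjectType ManagementPolicyRule
  $mpr.DisplayName = 'MPR: OU1 Group Admins Can Modify Membership'
  $mpr.ManagementPolicyRuleType = 'Request'
  $mpr.PrincipalSet = $adminSet.ObjectID
  $mpr.ResourceCurrentSet = $groupSet.ObjectID
  $mpr.ResourceFinalSet = $groupSet.ObjectID
  $mpr.ActionType = @('Read', 'Add', 'Remove')
  $mpr.ActionParameter = 'ExplicitMember'
  $mpr.GrantRight = $true
  $mpr.Disabled = $false
  Save-Resource $mpr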

Use Import-RMConfig -File <PathToXML> -Preview -Verbose to validate your XML and see what it would do. Drop "-Preview" to make the changes.

An Alternative To Using The Generic Array From File Function

While looking to improve on my method of getting exceptions or a long list of mail suffixes into an array, to be checked during code execution, I came across this: https://msdn.microsoft.com/en-us/library/windows/desktop/ms696048(v=vs.85).aspx

This seemed to me to be a really nice solution: define all exceptions and suffixes within one file, read it in on code execution, then check for existence or whatever in the code.

So, given the following XML file:
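The file itself isn't shown in this extract; the idea is simply one document holding both lists. A hypothetical shape, with illustrative element names and values:

  <Configuration>
    <Exceptions>
      <Exception>svc-mim</Exception>
      <Exception>svc-sql</Exception>
    </Exceptions>
    <MailSuffixes>
      <MailSuffix>@oholics.net</MailSuffix>
      <MailSuffix>@blog.oholics.net</MailSuffix>
    </MailSuffixes>
  </Configuration>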

Add the System.Xml import and declare the variables so that they are global:

Add the code to read the XML file into the Initialize Sub:

Then, when you wish to look for those values within those variables – just like in the last post: