Saturday, 27 August 2022

The 3 R's of Microsoft 365 Defender's Live Response

Introduction

Recently a client had quite an issue with DirectAccess.  I won't go into the specifics, but in addition to a number of remediation scripts that we needed to run using Intune, we also needed to contact a significant number of users directly.  In each of these cases the user's device could not contact any of the on premise domain controllers, and because of this some web based applications could not be accessed.

We needed to initiate a remote viewing session with each user, using either QuickAssist or Teams.  With QuickAssist we would normally be able to execute commands or scripts with elevated privileges, but in these cases this was not possible because no domain controller could be contacted to complete the Windows authentication.  Teams does have a screen sharing feature, with the ability to take control of the user's desktop when required - but Teams is a collaboration application, not a remote troubleshooting tool, and there is, as far as I know, no way of taking control with elevated privileges.

So I thought the Live Response feature in the cloud based Microsoft 365 Defender solution might be the ticket.  And so it was.  You could almost say it was tailor made for our particular problem.

We had to achieve four things when we contacted each user:

1) Check that the child registry keys had been cleared under HKLM:\Software\Policies\Microsoft\Windows NT\DNSClient\DnsPolicyConfig

2) Check that the hosts file in c:\windows\system32\drivers\etc contained a specific IP address and URL

3) Run a PowerShell script if either of the above checks failed

4) Connect via a backup VPN solution and run gpupdate.

Using Teams or QuickAssist we could perform items 1 and 2, but not item 3.  In any case the sessions were often slow and cumbersome.  We could possibly have guided the user through step 3 had they held local admin rights - but as a rule this was not the case.

And so the three R's of Live Response came to our rescue - Registry, Retrieve and Run.  Live Response is a remote shell connection, giving us administrators a set of powerful commands for troubleshooting devices with elevated privileges.  Such troubleshooting makes no demands of the user - they can carry on with whatever work the issue still allows while the security administrator uses Live Response to troubleshoot and resolve the problem.  It sounds cool and it is cool, as we shall see.

Note: This article will not cover step 4 - connect via a backup VPN solution and run gpupdate

Allowing Live Response

We permit the use of Live Response in the Microsoft 365 Defender portal by navigating to Settings\Endpoints\Advanced features and turning on the Live Response feature, which allows users with appropriate RBAC permissions to investigate devices that they are authorized to access, using a remote shell connection.


In addition, if you need to run unsigned PowerShell scripts, turn on Live Response unsigned script execution.


Using Live Response

A Live Response remote shell is started by navigating, in the Defender portal, to Devices.  In the Device inventory blade, type the name of the troubled device in the Search box.


Click on the returned device and then click on the ellipsis (the three dots) on the right hand side.  Select Initiate Live Response Session from the drop down list.


Your command console session is then established.



Registry

The first R of our list of requirements, as stated above, is to check a particular registry key in the HKLM hive.  We do this by running the registry command with the path to the key.

registry "HKEY_LOCAL_MACHINE\Software\Policies\Microsoft\Windows NT\DNSClient\DNSPolicyConfig"





After a few seconds the results are returned.




There are clearly many child keys under the DnsPolicyConfig key, and so this particular device does not pass the first of our two checks.

Retrieve

The second check, as detailed above, was to examine the hosts file.  For this we have the getfile command:

getfile c:\Windows\system32\drivers\etc\hosts



We press Enter and after a few seconds the hosts file is copied to our Downloads directory for us to examine.



A quick examination confirmed that this file contained the required IP address and URL.
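If you prefer to script the check rather than eyeball the file, something like the following can be run against the downloaded copy - a minimal sketch, assuming the file has landed in your Downloads folder (extract it first if your portal delivers it inside an archive) and using a placeholder IP address and URL:

# Check the downloaded copy of the hosts file for the expected entry
# The IP address and URL below are placeholders - substitute your own values
$hostsCopy = "$env:USERPROFILE\Downloads\hosts"
Select-String -Path $hostsCopy -Pattern '10\.0\.0\.1\s+intranet\.contoso\.com' -Quiet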

Run

In this particular example, one of our two checks failed - the hosts file checked out OK, but the DnsPolicyConfig key contained child keys that we needed to remove in order to resolve our DirectAccess issue.

To do this I need to do the following:

1) Upload my script, called cleanDNS.ps1 to the Live Response library.

2) Run the following command in the command shell: run cleanDNS.ps1

Uploading the Script

This is done by clicking on Upload file to library, which is located on the right hand side above the command shell.


The next steps are fairly intuitive.  Click on Choose File, then browse to and select the PowerShell script to run.  I chose my cleanDNS.ps1 file.


The next step is to click on Confirm.

Note:  The cleanDNS.ps1 script is very simple and contains the following command - Remove-Item -path "HKLM:\Software\Policies\Microsoft\Windows NT\DNSClient\DnsPolicyConfig\*" -recurse
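If you want the script to be a little more defensive, a slightly expanded version could check for child keys before removing them - a minimal sketch, assuming the same key path as above:

# cleanDNS.ps1 - remove any child keys left under the DNS policy configuration key
$key = "HKLM:\Software\Policies\Microsoft\Windows NT\DNSClient\DnsPolicyConfig"
if (Test-Path $key) {
    $children = Get-ChildItem -Path $key
    if ($children) {
        Write-Output "Removing $($children.Count) child key(s) from $key"
        Remove-Item -Path "$key\*" -Recurse
    }
    else {
        Write-Output "No child keys found under $key - nothing to do"
    }
}
else {
    Write-Output "$key does not exist on this device"
}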

Run cleanDNS.ps1

Now we can run our script using the run <name of script> command.


So we type run cleanDNS.ps1 and press Enter. The script executes successfully.


Note:  The name of the file is case sensitive.

We have now remediated the registry issue.  For this real world example, the next step was to connect the device using the backup VPN solution and run a gpupdate, which would repopulate the DnsPolicyConfig key with the required child keys, properties and values.  After a reboot the DirectAccess feature was once more working and the user could access their required internal web based applications.
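As a final confirmation once the device had been rebooted, the NRPT rules that back the DnsPolicyConfig key can be checked locally - a quick sketch of the sort of commands involved:

# Refresh Group Policy so the DNS client (NRPT) policy settings are re-applied
gpupdate /force
# List the Name Resolution Policy Table rules that populate the DnsPolicyConfig key
Get-DnsClientNrptPolicy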

I hope you enjoyed reading this article and I wish you as much success with your testing of the Live Response feature in Microsoft 365 Defender.

Colin






Wednesday, 3 August 2022

Lenovo WarrantyLookup BatchQuery and MECM Data

Introduction

MECM is a very powerful management system and collects a lot of data about devices, which means I often get asked for information about them.  One manager might want to know which devices have Firefox installed, and another may need to know which devices are low on disk space.  There is so much information in the database that, in most cases, I'm able to whip up a query and deliver on the data request within the hour, workload permitting.

A few weeks ago I was asked about warranty information for laptops: which laptops are out of warranty, and which will soon be out of warranty?  I was able to produce some reports based on the date the devices were registered in MECM, but this was only an estimate, for several reasons:

1) A machine may have been provisioned months after it was purchased - and the warranty period starts from the date of purchase or delivery, not the provisioning date.

2) A machine may have been returned for a rebuild - and in this situation the estimation could be hugely inaccurate.

3) A machine entry in the MECM database may have been deleted, for whatever reason, and the client reinstalled and re-registered.

And then I discovered the UK Lenovo Warranty Lookup page at:

https://pcsupport.lenovo.com/uk/en/warrantylookup#/

and, from there, the Run batch query page at:

https://pcsupport.lenovo.com/uk/en/warrantylookup/batchquery

I knew then I had the means to supply this manager with exactly the information he required for all the Lenovo devices.

In this article I will describe how I completed this request. I will cover the following steps:

1) Download the Lenovo Template.

2) Populate the Lenovo Template.

3) Submit the data and retrieve the warranty information.

4) Retrieve additional data from a MECM query.

5) Create a database and import the required information - this is to add machine name and other information to the warranty data returned from Lenovo.

6) Create a SQL query to connect the Lenovo data with the MECM data.

Download the Lenovo Template

If you are in the UK navigate to:

https://pcsupport.lenovo.com/uk/en/warrantylookup/batchquery and click on Download the Latest Template


An Excel file called Warranty_Batch_Lookup_Template.xlsx will be downloaded.  This file contains the columns that need to be populated with information from our MECM database.  At the time of writing we can see that we need the devices' serial numbers and model numbers.



Save this file to a convenient location and give it a relevant name such as LenovoUpload.xlsx.

Populate the Lenovo Template

Next, we need to create a query in the MECM console.  This will allow us to populate the Excel template file downloaded in the previous step, so the query needs to retrieve the model number and the serial number.  Open the MECM console and navigate to Monitoring\Queries.  Right click and select Create Query.  Enter a name such as Lenovo Batch Info.


Click on Edit Query Statement.




Click on Show Query Language and copy the following query statement into the Query Statement area.

select SMS_G_System_COMPUTER_SYSTEM.Model, SMS_G_System_PC_BIOS.SerialNumber from  SMS_R_System inner join SMS_G_System_PC_BIOS on SMS_G_System_PC_BIOS.ResourceID = SMS_R_System.ResourceId inner join SMS_G_System_COMPUTER_SYSTEM on SMS_G_System_COMPUTER_SYSTEM.ResourceID = SMS_R_System.ResourceId inner join SMS_G_System_OPERATING_SYSTEM on SMS_G_System_OPERATING_SYSTEM.ResourceId = SMS_R_System.ResourceId where SMS_G_System_PC_BIOS.Manufacturer = "LENOVO" and SMS_G_System_OPERATING_SYSTEM.Caption like "Microsoft Windows 10%"



Click on OK and Next to create the query.




After creating this query, run it to retrieve the results.  Press CTRL A and then CTRL C to select and copy the results.

Open the previously created template spreadsheet and paste (CTRL V) the results into the spreadsheet.  Save the spreadsheet.
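As an alternative to the copy and paste, the same model and serial number data can be pulled straight from the SMS Provider with PowerShell and exported to a CSV ready for pasting into the template - a minimal sketch, assuming a site code of PS1, the SMS Provider on the local site server and an output folder of C:\Temp (adjust all three to suit, and note that the Windows 10 filter from the WQL query above is left out for brevity):

# Query the SMS Provider WMI namespace for Lenovo hardware inventory (assumed site code PS1)
$ns = "root\SMS\site_PS1"
$bios = Get-CimInstance -Namespace $ns -ClassName SMS_G_System_PC_BIOS -Filter "Manufacturer = 'LENOVO'"
$systems = Get-CimInstance -Namespace $ns -ClassName SMS_G_System_COMPUTER_SYSTEM

# Index the computer system records by ResourceID so we can match model to serial number
$modelByResource = @{}
foreach ($cs in $systems) { $modelByResource[$cs.ResourceID] = $cs.Model }

# Build the rows the Lenovo template expects and export them
$bios | ForEach-Object {
    [pscustomobject]@{
        Model        = $modelByResource[$_.ResourceID]
        SerialNumber = $_.SerialNumber
    }
} | Export-Csv -Path C:\Temp\LenovoUpload.csv -NoTypeInformation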




Submit the data and retrieve the warranty information

Having retrieved the serial and model numbers from MECM and copied them into the template spreadsheet, we can now upload them to the Lenovo portal and retrieve the desired warranty information.

Navigate to the following URL:

https://pcsupport.lenovo.com/gb/en/warrantylookup/batchquery

Click on Browse and select the previously populated and saved spreadsheet.


Click on Submit to download the required warranty information




After you have had a good look at all this useful warranty information in Excel, save it as a .csv file.  In my case I saved it as lenovo.csv.  We will need this file later when we import the data into our custom database.

Retrieve additional data from a MECM query

This information is what we require, but of course it does not contain the NetBIOS machine names - those are assigned by the engineer after the device is purchased.  It does, however, contain the serial numbers.  In MECM we have both the serial number and the machine name, so we need some way to connect the machine name to the warranty information.  We could do this manually, but that is not realistic given there may be hundreds or thousands of machines.

Note:  The template upload file also has a comments column.  Another way of achieving this is to populate the comments column with the NetBIOS name or any other information you require in the report.  This would be an easier approach if you are confident working with MECM queries and you know exactly what information you want attached to the warranty data before uploading the template spreadsheet.

We need to create a query in MECM that returns at least the NetBIOS names and the serial numbers.  So let us create such a query, in the same way that we created the previous MECM query for the upload.  I call this query Basic Information and it uses the following WQL statement.

select SMS_R_System.NetbiosName, SMS_G_System_PC_BIOS.SerialNumber from  SMS_R_System inner join SMS_G_System_PC_BIOS on SMS_G_System_PC_BIOS.ResourceId = SMS_R_System.ResourceId


After running this query, press CTRL A and then CTRL C to copy the results, then paste them into an Excel file.  The column headings I added to my spreadsheet were name and serial.



Save this spreadsheet as a .csv file called mecm.csv.  We now have two files called lenovo.csv and mecm.csv.  These will be used in the next section.  The data in these two files will be imported into our custom database.
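If you would rather not involve a database at all, the two CSV files can also be joined directly in PowerShell - a rough sketch, assuming the mecm.csv headings are name and serial as above, that both files sit in C:\Temp, and that the Lenovo export has a serial number column (check the actual heading in your lenovo.csv and adjust the property name accordingly):

# Import both exports
$mecm   = Import-Csv -Path C:\Temp\mecm.csv
$lenovo = Import-Csv -Path C:\Temp\lenovo.csv

# Index the MECM rows by serial number for a quick lookup
$nameBySerial = @{}
foreach ($row in $mecm) { $nameBySerial[$row.serial] = $row.name }

# Attach the NetBIOS name to each Lenovo warranty row and export the combined report
# 'Serial' is an assumed column name - match it to the heading in your lenovo.csv
$lenovo | ForEach-Object {
    $_ | Add-Member -NotePropertyName MachineName -NotePropertyValue $nameBySerial[$_.Serial] -PassThru
} | Export-Csv -Path C:\Temp\WarrantyReport.csv -NoTypeInformation

That said, importing the data into a database keeps it around for follow-up requests, so the next sections cover the approach I actually used.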

Create a database and import the information 

We now have all the data required for our warranty report, split across two files.  You could hand them over to your database administrator and ask kindly for him or her to work their magic and combine the NetBIOS names in the mecm.csv file with the warranty data in the lenovo.csv file - but where is the fun in that?

We can do it ourselves.  You might prefer to use an Access database, but here I use the SQL Server instance on my test server.

I open Microsoft SQL Server Management Studio, right click on Databases and select New Database.  In the Database name field I enter WarrantyLookup and click on OK.


The database is created - it really is that simple.  Now we need to import the lenovo.csv and mecm.csv files into the database; each file will be imported into its own table.

Navigate to the WarrantyLookup database in the SQL Server Management Studio and right click and select Tasks and then Import Flat File.



Click Next at the Introduction window of the wizard.  On the Specify Input File window browse to the mecm.csv file.  You can leave the default table name or change it to mecm to match our example.



Click Next on the Preview Data page.  On the Modify Columns page confirm you are happy with the column names.


Click on Next and then Finish and Close.  We can see that the mecm table is created.


Repeat all of the above steps in this section to import the lenovo.csv file.  Your new database will now contain two tables with the information we need to produce our final report.




Create a SQL query to connect the Lenovo data with the MECM data

Now we are ready to combine the data from the two tables we created in the previous section.  In SQL Server Management Studio navigate to the WarrantyLookup database and click on New Query in the upper ribbon.  In the query window enter the following SQL query.

select * from
mecm me
inner join lenovo le
on le.serial = me.serial

After pressing Execute our required data will be returned.




This data can be copied and pasted into an Excel spreadsheet (CTRL A and CTRL C and CTRL V) and filtered according to your requirements. 

I hope you enjoyed reading this article, and I hope your manager appreciates your hard work when you present him or her with the warranty data relevant to your environment.

Colin
















Friday, 8 July 2022

DP Content Migration - CM 2207 Preview

Introduction

Do you need to replace a Distribution Point server with another server? Are you concerned that your Cloud Distribution Point (CDP), hosted on Azure classic services, needs its content transferred to another distribution point before it is deprecated in 2024?  Or do you need your package content, hosted on your CMG v1, migrated to another distribution point because of the coming deprecation of CMGs on *.cloudapp.net?

The Start-CMDistributionPointMigration PowerShell cmdlet, released in the CM 2207 preview edition, will help you with any of the above migration needs.  The new distribution point server - that is, the distribution point receiving the package and application content - will need to be up and running before you run this migration cmdlet.

In this post I will test a simple scenario. The existing DP, whose netbios name is server3, has one legacy package and one application distributed to it.  I have created a new DP on my DC, called server1, and have not deployed any packages or applications to it.  I will run the Start-CMDistributionPointMigration cmdlet to start the content migration. I will run the Get-CMDistributionPointMigrationStatus cmdlet to determine when the migration has completed.

The Application and the Package

I have created a PowerShell App Deployment Toolkit application to deploy the Microsoft Account Lockout Status tool.  As can be seen, it is only distributed to one DP.


I have deployed the User State Migration Tool (USMT) legacy package to the same DP.



The PowerShell DP Migration Commands

In the SCCM console I click on the top left hand down arrow and select Connect via Windows PowerShell.




In the PowerShell console I enter the following command:

start-cmdistributionpointmigration -sourcedistributionpointname server3.domain1.lab.tst -destinationdistributionpointname server1.domain1.lab.tst



After pressing Enter the migration begins.

Monitoring the DP Content Migration

In the SCCM console I can see that the new DP has been added to the USMT legacy package and that the distribution is in progress.



I can see also that the additional DP has been added to my Account Lockout Status application and that it has distributed.



Twice I run the following PowerShell monitoring command:

get-cmdistributionpointmigrationstatus -sourcedistributionpointname server3.domain1.lab.tst -destinationdistributionpointname server1.domain1.lab.tst

The first time I run the command I can see that the migration is in progress.  A few minutes later I run it again and confirm that the migration has completed.
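Rather than re-running the status command by hand, a simple polling loop can keep an eye on it for you - a quick sketch, assuming the console-connected PowerShell session from earlier (stop it with CTRL C once the output reports the migration as complete):

# Poll the migration status every 60 seconds until you see it complete
while ($true) {
    Write-Output ("Checked at {0}" -f (Get-Date))
    Get-CMDistributionPointMigrationStatus `
        -SourceDistributionPointName server3.domain1.lab.tst `
        -DestinationDistributionPointName server1.domain1.lab.tst | Format-List
    Start-Sleep -Seconds 60
}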


Conclusion

In the past I would have achieved a similar operation by writing a VBScript with dozens of lines and some complex logic.  But now, as we have seen, a one line PowerShell command achieves the package and application content migration from one DP to another.  I hope you enjoyed this article and I wish you similar testing success in your SCCM Technical Preview 2207 environment.













Thursday, 26 May 2022

Autopilot and the User Driven Profile with VPN: Hyper-V lab

Introduction

Autopilot really is one of Intune's great features.  A device can be shipped directly from the OEM to a user and, providing the user has an internet connection, all he or she needs to do is power up the device and sign in.  Autopilot takes over and completes the required build, enrolling the device in Azure Active Directory and installing the required applications and configuration profiles.  For this scenario there is no requirement for any on premise infrastructure.

And if the user's organisation requires the device to also be joined to an on-premise Active Directory, this too can be handled by Autopilot.  At some point, however, the device needs to communicate with an on-premise domain controller, so we have a requirement for the device to undergo at least part of the Autopilot operation while connected to the on-premise infrastructure.  This involves additional costs for an organisation, such as having to maintain a remote LAN at a provisioning partner's premises.  That particular cost can be avoided if the Intune Connector for Active Directory is installed on an on-premise server, since this enables the domain join without any connection to the corporate network - but the device's designated user will still need to travel to a location where they can sign into the device and be authenticated by an on-premise domain controller.  Microsoft refers to this scenario as the Hybrid Azure AD joined profile.

The User Driven with VPN profile, in preview at the time of writing, gets around these requirements by using, as the name suggests, a VPN connection so that the user can sign into the device and cache his or her credentials without having to visit an office location.

In this article I demonstrate how to test this feature in a Hyper-V lab environment using only the Windows native VPN and PPTP tunnelling.  It is not a start to finish set of steps - it begins from the assumption that the engineer already has the Hybrid Azure AD joined Autopilot scenario working.

I will cover:

1) A Brief Overview of my environment.

2) The Hyper-V Virtual Switch requirements.

3) The Routing and Remote Access configuration.

4) Creating a native Windows VPN connection.

5) Creating an Extensible Authentication Protocol (EAP) xml file.

6) Creating the Autopilot VPN profile.

7) The User Driven Deployment Mode profile configuration.

8) The Domain Join profile configuration.

A Brief Overview of my Lab Environment

For this exercise the lab environment consists of only two virtual devices.

One is the domain controller, which has the following installed on it: DHCP with a 10.11.12.0/24 scope, the Intune Connector for Active Directory, and the Routing and Remote Access Server role.  There are two network adapters attached to this device - one for internet connectivity and one for the internal AD environment.  It runs the Windows Server 2019 operating system and its NetBIOS name is Server1.

The second device is the workstation that will receive the Autopilot deployment.  It has only one network adapter, and this is connected to the internet.  There is no way for this device to reach the internal Active Directory domain other than via a VPN connection.

The workstation OS will be Windows 10 with the 21H2 feature update.  It has two virtual processors and 5GB of RAM.  Its NetBIOS name is determined by the Intune Domain Join profile and is prefixed with MDMCS followed by a series of random characters - to be detailed later in this article.

Important: The Autopilot recipient needs to have more than 4GB of RAM and needs two virtual processors assigned.  This is to address an issue outlined in the following article:

https://docs.microsoft.com/en-us/mem/autopilot/known-issues

Namely: Virtual machine failing at "Preparing your device for mobile management".

The Hyper-V Virtual Switches

I have two virtual switches configured in my lab for this exercise.  There is an internet facing switch that connects to the ethernet adapter in my laptop (the host), which in turn connects to my broadband router.  The name of this switch is Lab1ExternalEthernet and its network ID is 192.168.1.0/24.


Each of the two virtual devices has a network adapter attached to this switch.

The other virtual switch is a private network and is named Lab1PrivateNW.  Its network ID is 10.11.12.0/24 and it represents an on premise network.  Only the domain controller (server1) has a network adapter connected to this switch.


The Routing and Remote Access configuration

Determining your corporation's security and VPN requirements will be a collaborative task, and there are multitudes of options to consider.  Here I present the configuration for a PPTP VPN approach that worked in my Hyper-V lab.  You can copy this by installing the RRAS feature, using your preferred method, and then opening the RRAS console.  Right click on your server and select Configure and Enable Routing and Remote Access.  You are then presented with various options; in my case I selected the Remote access (dial-up or VPN) option.  Here is an overview of my configuration.
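For the installation itself, one option is PowerShell - a minimal sketch, assuming Windows Server 2019 and that you want the VPN and routing role services along with the management tools:

# Install the Remote Access role with the VPN and routing role services
Install-WindowsFeature -Name "RemoteAccess", "DirectAccess-VPN", "Routing" -IncludeManagementTools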

If I select the properties of my RRAS server you can see I have enabled an IPv4 Router for LAN and demand-dial routing.


On the IPv4 tab I have enabled IPv4 forwarding and selected the option to allow the DHCP service to deliver our internal IP addresses.





On the Security tab and clicking on Authentication methods I have selected EAP, MS-CHAP v2 and CHAP.



Launching the Network Policy Server console by right clicking on the Remote Access Logging and Policies node, I have modified the existing default policy to Grant Access.


Note:  In my case I chose to use the PPTP protocol for my client VPN connections, really to keep things as simple as possible and to remove a dependency on certificates.


Creating a native Windows VPN connection

Having configured the server side of the VPN connection, naturally we want to test that it works.  And in fact, as we shall see later, we really need to do this in order to produce an EAP.xml file.

If you are using Hyper-V and are using a similar configuration to mine, then you can test it by creating a VPN on your host laptop - following these steps if you are using Windows 10.

1) From Settings navigate to Network & Internet and then to VPN

 
Click on Add a VPN connection.  For the VPN provider select Windows (built-in).  Enter a Connection name - here I enter domain1.  Enter the server IP address - in my case it is 192.168.1.66.  For the VPN type field select Point to Point Tunnelling Protocol (PPTP).  For the Type of sign-in info field I select Username and password.  I leave the Username and Password fields empty and click on Save.
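If you prefer, the same test connection can be created from PowerShell with the built-in VPN client cmdlets - a minimal sketch using the values from my lab (swap in your own server address).  Note that I created mine through the Settings app as described above; if you script it, check that the resulting connection still carries the EAP configuration we will export in a later section.

# Create the test VPN connection (lab values - adjust the name and server address to suit)
Add-VpnConnection -Name "domain1" -ServerAddress "192.168.1.66" -TunnelType Pptp -AuthenticationMethod MSChapv2 -RememberCredential -PassThru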



The VPN domain1 connection now appears.




When I click on the domain1 connection I have the option to click on Connect - which is what I need to do.  I am then requested to enter my credentials after being informed that the blank credentials have failed.


After entering the credentials I am connected - a good sign and reassurance that all is working well.



Creating an Extensible Authentication Protocol (EAP) xml file

An Extensible Authentication Protocol (EAP) configuration xml file is required when we create the Intune VPN connection profile.  Thus, having now created the VPN connection, we can use PowerShell to produce the xml file.  Here is the process:

1) Open a command prompt and start PowerShell - powershell.exe -executionpolicy unrestricted
2) Type $a=get-vpnconnection -name <name of your VPN connection> and press Enter.
3) Type $a.eapconfigxmlstream.innerxml >> c:\<folder name>\<name of xml file> and press Enter.
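Put together, it is just a couple of lines - a quick sketch assuming the connection name domain1 from earlier and an output folder of c:\temp (adjust the path to suit):

# Export the EAP configuration of the test VPN connection to an xml file
$a = Get-VpnConnection -Name "domain1"
$a.EapConfigXmlStream.InnerXml | Out-File -FilePath "c:\temp\EAP.xml" -Encoding utf8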




The EAP.xml content is now ready to load into the Intune VPN profile properties.


<EapHostConfig xmlns="http://www.microsoft.com/provisioning/EapHostConfig"><EapMethod><Type xmlns="http://www.microsoft.com/provisioning/EapCommon">26</Type><VendorId xmlns="http://www.microsoft.com/provisioning/EapCommon">0</VendorId><VendorType xmlns="http://www.microsoft.com/provisioning/EapCommon">0</VendorType><AuthorId xmlns="http://www.microsoft.com/provisioning/EapCommon">0</AuthorId></EapMethod><Config xmlns="http://www.microsoft.com/provisioning/EapHostConfig"><Eap xmlns="http://www.microsoft.com/provisioning/BaseEapConnectionPropertiesV1"><Type>26</Type><EapType xmlns="http://www.microsoft.com/provisioning/MsChapV2ConnectionPropertiesV1"><UseWinLogonCredentials>false</UseWinLogonCredentials></EapType></Eap></Config></EapHostConfig>

Creating the Autopilot VPN profile

Now that we have the EAP.xml content, we are able to create the VPN profile in Intune.  This is done by navigating to Devices\Windows\Configuration profiles in the Microsoft Endpoint Manager admin center (Intune portal).  The platform will need to be Windows 10 and later and the Profile type will be Templates.  The template name is VPN.


My configuration is as follows:

Use this VPN profile with a user/device scope: Device

Connection Type: PPTP (Native type)



I then configure the Base VPN as follows:

Connection name: Domain1

Servers: IP address on external Hyper-V switch, which is 192.168.1.66

Register IP addresses with internal DNS: Enable

Always On: Enable

Remember credentials at each logon: Enable

Authentication method: Derived credential



Open the EAP xml file, as detailed in the previous section, with notepad, and copy the contents into the EAP XML box.


Be sure to save this profile and assign it to the group to which your Autopilot deployment profile is also assigned.

The User Driven Deployment Mode Profile

Your User Driven profile with the Hybrid Azure AD joined setting will need to be modified.  The User Driven with VPN Autopilot deployment is designed to handle a situation without direct connectivity to an on site domain controller.  Therefore we have to set the Skip AD connectivity check (preview) setting to Yes - otherwise the deployment will fail without a corporate connection in place.


Note:  If you have an Enrolment Status Page profile configured, you may need to create a custom OMA-URI policy if the ESP freezes on Joining your organization's network (Working on it...).

See the following article:

https://community.spiceworks.com/topic/2196102-windows-autopilot-hybrid-domain-join-not-fully-working.

The Domain Join profile configuration

In this configuration, the option to Apply device name template is not available in the Windows Autopilot deployment profile.  Therefore, if we want to use such a template, we can do so in the Domain Join profile.  In my profile I specify a prefix of MDMCS.  Intune then completes the NetBIOS name by adding random characters so that the complete name is 15 characters long.



The Autopilot User Driven with VPN Experience

Now that we have all our requirements in place, let's look at what this type of Autopilot deployment looks like in practice.  We start with the Welcome to <Name of Tenant> sign in page.


After entering in the account name we are prompted for a password.


We may then be prompted to approve with our Microsoft Authenticator app on our phone.






We are asked to Please Wait..



Note:  If the build sticks on this step, check that the Intune ODJConnector service is running on your server.
 
The machine joins the on premise Active Directory.



The machine then reboots



Autopilot reboots the device and we have the sign in screen.


Clicking on the VPN Network sign-in icon we can enter our credentials.









In the RRAS console we verify the VPN connection is established.



Windows completes the installation.



Verifying the VPN Connection

When the Windows installation completes we can verify that the VPN has been configured and is connected in Settings\Network & Internet\VPN.
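The same check can be made from a PowerShell prompt on the device - a small sketch, assuming the Intune profile created the connection in the device (all users) scope with the connection name Domain1 configured earlier:

# Confirm the Intune-delivered, device-scope VPN connection is present and connected
Get-VpnConnection -AllUserConnection -Name "Domain1" | Select-Object Name, ServerAddress, TunnelType, ConnectionStatus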




