Andy Dawson's Blog

The blog of Andy Dawson

Azure AD Connect–Upgrade to 1.1.533.0 and Change of Source Anchor to mS-DS-ConsistencyGuid

As I blogged yesterday, I upgraded our instance of Azure AD Connect to what was, at the time, the latest version, 1.1.524.0. Subsequently, Microsoft Security Advisory 4033453 was published indicating that an upgrade to version 1.1.533.0 was very strongly recommended.

As before, the upgrade went smoothly; however, there were a couple of additional points of note during the upgrade:

  1. Running the Azure AD Connect msi gave the following warning (note that I appended the version number to the file name in this example):
    Azure AD Connect 1.1.533 SmartScreen Warning
    I’m assuming that this will be fixed shortly.
  2. Once the upgrade was complete, the following warning was shown:
    Source Anchor Using objectGUID
    ‘Azure Active Directory is configured to use AD attribute objectGUID as the source anchor attribute. It is strongly recommended that you let Azure manage the source anchor for you. Please run the wizard again and select Configure Source Anchor.’
    Re-running the wizard and selecting the ‘Configure Source Anchor’ task allowed Azure AD Connect to pick ‘mS-DS-ConsistencyGuid’ as the source anchor, with all configuration occurring automatically. At the end of the process, however, another warning is shown indicating that if ADFS is managed externally to Azure AD Connect, claim rule changes are required to align the new Source Anchor with the value returned, and users may not be able to log in unless these changes are made.
    In our case, this means that changes need to be made to the ADFS rules for the Office 365 relying party trust. To make these changes, the following steps were taken:
    1. On the ADFS Server, expand ADFS, then Trust Relationships, then click on Relying Party Trusts. Right-click the ‘Microsoft Office 365 Identity Platform’ and select ‘Edit Claim Rules…’:
      O365 Relying Party Trust
    2. Select rule 1 and click the ‘Edit Rule…’ button.
    3. The original rule was:
      c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname"]
        => issue(store = "Active Directory", types = ("http://schemas.xmlsoap.org/claims/UPN", "http://schemas.microsoft.com/LiveID/Federation/2008/05/ImmutableID"), query = "samAccountName={0};userPrincipalName,objectGUID;{1}", param = regexreplace(c.Value, "(?<domain>[^\\]+)\\(?<user>.+)", "${user}"), param = c.Value);
      The only change that was required was to change objectGUID to mS-DS-ConsistencyGuid, i.e.:
      c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname"]
        => issue(store = "Active Directory", types = ("http://schemas.xmlsoap.org/claims/UPN", "http://schemas.microsoft.com/LiveID/Federation/2008/05/ImmutableID"), query = "samAccountName={0};userPrincipalName,mS-DS-ConsistencyGuid;{1}", param = regexreplace(c.Value, "(?<domain>[^\\]+)\\(?<user>.+)", "${user}"), param = c.Value);
    4. Save the rule and double-check that you can authenticate to Office 365.
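For completeness, the same edit can also be scripted rather than made through the GUI. The sketch below assumes the relying party trust is named ‘Microsoft Office 365 Identity Platform’ as shown above and that the ADFS PowerShell module is available on the federation server; it simply swaps objectGUID for mS-DS-ConsistencyGuid in the issuance transform rules, so review the resulting rule set (and test sign-in) before relying on it:

# Sketch only: update the Office 365 relying party trust issuance rules in place.
# The backup path is just an example; keep a copy so the change can be reverted.
$rptName = 'Microsoft Office 365 Identity Platform'
$rpt = Get-AdfsRelyingPartyTrust -Name $rptName
$rpt.IssuanceTransformRules | Out-File 'C:\Temp\O365-IssuanceRules-backup.txt'
$newRules = $rpt.IssuanceTransformRules -replace 'objectGUID', 'mS-DS-ConsistencyGuid'
Set-AdfsRelyingPartyTrust -TargetName $rptName -IssuanceTransformRules $newRules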

Some background to the issue of ImmutableID and the value to select for Source Anchor for Office 365 can be found at https://blog.msresource.net/2015/05/20/revisiting-the-microsoft-online-immutable-id-design-decision/

Azure AD Self-Service Password Reset Issues

We recently saw an issue with Azure AD self-service password reset (SSPR). It’s been working fine for us for ages, ever since we first configured it using DirSync, but recently users started seeing the following message:

Please Contact Your Admin

Get back into your account

Please contact your admin

We’ve detected that your user account password is not managed by Microsoft. As a result, we are unable to automatically reset your password.

You will need to contact your admin or helpdesk for any further assistance.

As we’d made no changes, we were obviously concerned!

Initially I took the following steps to try and resolve the issue:

  1. Ensured that the OS patch levels of the servers (Azure AD Connect, ADFS, WAP) were up-to-date, which they were.
  2. Upgraded Azure AD Connect to the most recent version. The version we were running was a little behind, but not significantly so. During the upgrade process, the wizard takes you through what you’d normally see if you reconfigure Azure AD Connect and select the ‘customize synchronization options’ task. The optional features selected were still the same as we’d picked the previous time we’d upgraded, and included ‘password writeback’.

Unfortunately none of the steps taken above made any difference.

Looking in the configuration page for Azure AD in the old portal, I noticed that the ‘Password write back service status’ was still set to ‘Not configured’:

Password Write Back Service Status Not Configured 

Bearing in mind that I’d just upgraded Azure AD Connect, been through the configuration wizard and seen that this option was ticked, this should not, as far as I was concerned, have been the case.

To correct the issue therefore, I took the following steps:

  1. Launched the configuration of Azure AD Connect and selected the ‘customize synchronization options’ task.
  2. When presented with the optional features configuration page of the wizard, unticked the ‘password writeback’ option and then completed the configuration.
  3. Repeated the above steps, but this time ensured that the ‘password writeback’ option was ticked:
    Azure AD Connect Password Writeback Config Option

Checking the configuration page in the old Azure Portal again, the status of the ‘Password write back service’ is now ‘Configured’ and the correct SSPR prompts are again being displayed to users.

Book Review: Windows Virus and Malware Troubleshooting by Andrew Bettany and Mike Halsey

Summary: A very useful volume that discusses what malware is, how to defend against it and how to remove it. Clear and simple instructions are given on ways to improve security on your PC, as well as how to deal with malware that may end up on your PC. Recommended.

Presented in a very easy to read writing style, this book immediately appeals due to the clear, concise and no-nonsense approach taken when discussing malware, what it is, how it can attack and affect your PC, how to defend against it and what to do if the worst should happen and your PC gets infected.

The first chapter provides a nice potted history of viruses and malware on PCs, discussing the various types and how both the proliferation and seriousness of infections have risen, from the very first, typically benign examples to modern-day infections such as the ransomware that has been in the news so much recently.

Chapter 2 deals with prevention and defence, and introduces the many security features that are built into modern versions of Microsoft Windows to help stop the initial infection. There’s a clear progression in security features as newer versions of Windows have been introduced, and it’s interesting to compare the versions of Windows that were most susceptible to the recent ‘WannaCry’ ransomware attack. Looking at the features discussed (and having been to a few presentations on the subject), this provides an excellent set of reasons for an upgrade to Windows 10 if you’ve not already done so!

Chapter 3 discusses defence in depth and includes information on firewalls, including the Windows firewall, as well as organisational firewalls (i.e. hardware firewalls and appliances) and how to generate a multi-layer defence. While at first glance this section appears to be more targeted at the organisational user, it’s actually also targeted at the home user with a hardware router/firewall combination, and some clarification that this is the case would, I feel, have been useful here. This chapter also, somewhat bizarrely, includes a section on keylogging software, which I feel would have been more useful in the first chapter.

This chapter also provides some information on blacklists and whitelists (i.e. internet filtering) and the Internet of Things (IoT). For both of these sections I feel that there’s perhaps been a bit of a lost opportunity: a brief discussion of the filtering options available might have been helpful for home users (e.g. my Netgear router at home comes complete with an OpenDNS-based filtering option that can be enabled and configured quickly and easily, and seems to provide reasonable protection), and further information on IoT security recommendations, particularly changing the default username and password on devices, would also have been beneficial here.

Chapter 4 deals with identifying attacks starting with how malware infects a PC and providing pointers on how to identify both internal and external attacks. I was very pleased in this section to see information on social engineering and the role that this plays in malware infections.

Chapter 5 provides a very useful list of external resources that can be utilised to help protect your PC and clean a malware infection, including the Microsoft Malware Protection Center, a great location for finding updates, additional security recommendations and products etc. This chapter also provides some limited information on third-party tools that are available. Again, I would have liked to see a more expansive list here, and it’s worth mentioning that many anti-virus vendors provide a free option of their products.

Chapter 6 deals with manually removing malware, and for me this was probably the most useful part of the book. What do you do when malware has ended up on your PC despite your best efforts and you’re now having issues running the automated tools to get rid of it? This chapter helps in this scenario, and provides some steps to take to identify what’s running on the PC, suspend and/or kill the process and remove the infection. In particular I’m pleased to see the Microsoft Sysinternals tools discussed (albeit briefly) as they are my ‘go to’ toolset when dealing with an infection on a PC. If you’re interested in these and how they can be used, it’s worth looking at some of Mark Russinovich’s ‘Case of the Unexplained’ videos, in which Mark goes through the use of these tools in more detail.

There are one or two downsides; the book is only a slim volume, which has both pluses and minuses: being slim, more people are likely to read it end-to-end and therefore benefit the most from it, but in one or two areas a few more details might be appreciated. For such a slim volume, it’s also more expensive than I would hope for at an RRP of £14.99, which may limit its take-up.

All in all however this is a very easily accessible book that provides great guidance on how to secure your PC, what to watch out for and how to deal with a malware infection. I’ll be encouraging a few people I know to buy a copy and read it!

Title: Windows Virus and Malware Troubleshooting
Author(s): Andrew Bettany, MVP and Mike Halsey, MVP
Publisher: Apress
ISBN-13: 978-1-4842-2606-3

Test-SPContentDatabase False Positive

I was recently performing a SharePoint 2013 to 2016 farm upgrade and noticed an interesting issue when performing tests on content databases to be migrated to the new system.

As part of the migration of a content database, it’s usual to perform a ‘Test-SPContentDatabase’ operation against each database before attaching it to the web application. On the farm that I was migrating, I got mixed responses to the operation, with some databases passing the check successfully and others giving the following error:

PS C:\> Test-SPContentDatabase SharePoint_Content_Share_Site1

Category        : Configuration
Error           : False
UpgradeBlocking : False
Message         : The [Share WebSite] web application is configured with
                  claims authentication mode however the content database you
                  are trying to attach is intended to be used against a
                  windows classic authentication mode.
Remedy          : There is an inconsistency between the authentication mode of
                  target web application and the source web application.
                  Ensure that the authentication mode setting in upgraded web
                  application is the same as what you had in previous
                  SharePoint 2010 web application. Refer to the link
                  "http://go.microsoft.com/fwlink/?LinkId=236865" for more
                  information.
Locations       :

This was interesting as all of the databases were attached to the same content web application, and had been created on the current system (i.e. not migrated to it from an earlier version of SharePoint) and therefore should all have been in claims authentication mode. Of note also is the reference to SharePoint 2010 in the error message; I guess the cmdlet hasn’t been updated in a while…

After a bit of digging, it turned out that the databases that threw the error when tested had all been created and had some initial configuration applied, but nothing more. Looking into the configuration, there were no users granted permissions to the site (except for the default admin user accounts that had been added as the primary and secondary site collection administrators when the site collection had been created); however, an Active Directory group had been given site collection administrator permissions.

A quick peek at the UserInfo table for the database concerned revealed the following (the screen shot below is from a test system used to replicate the issue):

UserInfo Table

The tp_Login entry highlighted corresponds to the Active Directory group that had been added as a site collection administrator.

Trevor Seward’s blog post ‘Test-SPContentDatabase Classic to Claims Conversion’ showed what was happening. When the Test-SPContentDatabase cmdlet runs, it’s looking for the first entry in the UserInfo table that matches the following rule:

  • tp_IsActive = 1 AND
  • tp_SiteAdmin = 1 AND
  • tp_Deleted = 0 AND
  • tp_Login not LIKE ‘I:%’

In our case, having an Active Directory Group assigned as a site collection administrator matched this set of rules exactly, therefore the query returned a result and hence the message was being displayed, even though the database was indeed configured for claims authentication rather than classic mode authentication.
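As a rough illustration, the check can be approximated with a query like the sketch below, run read-only against a test or restored copy of the content database (the server instance name below is a placeholder; the database name matches the earlier example). If a row comes back, Test-SPContentDatabase will report the warning shown above:

# Sketch only: approximate the UserInfo check that Test-SPContentDatabase performs.
# Run against a test/restored copy of the content database, never production.
$query = @"
SELECT TOP 1 tp_Login
FROM dbo.UserInfo
WHERE tp_IsActive = 1
  AND tp_SiteAdmin = 1
  AND tp_Deleted = 0
  AND tp_Login NOT LIKE 'i:%'
"@
Invoke-Sqlcmd -ServerInstance 'SQLSERVER\SHAREPOINT' -Database 'SharePoint_Content_Share_Site1' -Query $query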

For the organisation concerned, having an Active Directory group configured as the site collection administrator for some of their site collections makes sense, so they’ll likely experience the same message next time they upgrade. Obviously in this case it was a false positive and could safely be ignored, and indeed attaching the databases that threw the error to a 2016 web application didn’t generate any issues.

Steps to reproduce:

  1. Create a new content database (to keep everything we’re going to test out of the way).
  2. Create a new site collection in the new database adding site collection administrators as normal.
  3. Add a domain group to the list of site collection administrators.
  4. Run the Test-SPContentDatabase cmdlet against the new database.
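For reference, the reproduction steps above can also be scripted from the SharePoint Management Shell. The sketch below uses placeholder names for the database, web application, site URL, administrators and domain group:

# Sketch only: reproduce the false positive in a test farm.
New-SPContentDatabase -Name 'WSS_Content_TestSPCDB' -WebApplication 'http://sharepoint'
New-SPSite -Url 'http://sharepoint/sites/testspcdb' -ContentDatabase 'WSS_Content_TestSPCDB' `
    -OwnerAlias 'DOMAIN\admin1' -SecondaryOwnerAlias 'DOMAIN\admin2' -Template 'STS#0'
# Add a domain group as an additional site collection administrator.
New-SPUser -UserAlias 'DOMAIN\SharePoint Admins' -Web 'http://sharepoint/sites/testspcdb' -SiteCollectionAdmin
# Test the new content database; expect the claims/classic warning described above.
Test-SPContentDatabase -Name 'WSS_Content_TestSPCDB' -WebApplication 'http://sharepoint'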

Enumerating BizTalk 2016 Features for a Command-Line Installation

As with previous versions of BizTalk Server, you can perform the installation using the GUI or the command line. To use the command-line installation, you’ll need the list of features that can be installed to add to the /AddLocal command-line parameter. The available documentation for a silent installation of BizTalk Server at https://msdn.microsoft.com/en-us/library/jj248690.aspx relates to BizTalk Server 2013 and 2013 R2 (see https://msdn.microsoft.com/en-us/library/mt743078.aspx for ‘BizTalk Server 2016: What’s new, and installation’, then follow the ‘Appendix A: Silent installation’ link near the bottom of the navigation menu at the left to get to the above page); there’s nothing that I’ve found so far that provides the setup.exe command-line switches, or a list of features for use in a silent installation, specifically for BizTalk Server 2016. Note that blindly following the previous guidance and using certain specific /AddLocal features results in an installation failure!

Getting hold of the command-line parameters for setup.exe is, of course, simple. Just run setup.exe with the ‘/?’ switch from a command prompt to get the following:

Command Description
/help or /? or /h Help and quick reference option.
/s <Configuration XML file> Silent Installation of features found in Configuration file.
/passive Passive Installation. Only progress bar will be displayed.
/norestart Suppress restart.
/forcerestart Always restart after installation.
/promptrestart Prompts before restarting. This option cannot be used with the /quiet option.
/x or /uninstall Uninstalls the product.
/L <Logfile> Writes logging information into a logfile at the specified path. Always uses verbose MSI logging and appends to existing file.
/IGNOREDEPENDENCIES Bypass checks for downloadable prerequisites.
/INSTALLDIR <Install path> Specify the full path to product install location.
/COMPANYNAME <companyname> Sets the company name.
/USERNAME <User name> Sets the user name.
/ADDLOCAL ALL Install all features.
/REMOVE ALL Remove all features.
/REPAIR ALL Repair installation.
/CABPATH <cabfile> Specify a local path to a redistributable CAB file.
/CEIP Opt in to BizTalk Server Customer Experience Improvement Program.

These commands correspond to those listed on the silent installation page for BizTalk 2013 mentioned above, with the exception that the final two commands listed on the web page appear to be missing from the above list generated by BizTalk Server 2016.

The /AddLocal command-line parameter details the features that will be installed. On the silent installation web page, there is a link to follow to the list of features (at http://go.microsoft.com/fwlink/p/?LinkID=189319), however if you browse to that page, you’ll notice that it is marked as the features for BizTalk Server 2010. There are issues using some of the parameters for the installation of BizTalk Server 2016, so it seemed worthwhile attempting to enumerate the parameters that are available to a BizTalk Server 2016 installation.

The installation MSI for BizTalk Server 2016 can be opened using Orca (Orca.exe is a database table editor for creating and editing Windows Installer packages and merge modules – see https://msdn.microsoft.com/en-us/library/windows/desktop/aa370557(v=vs.85).aspx for acquisition and installation instructions) to get the list of features. The screen shot below shows a partial view of the ‘Feature’ table from the ‘Microsoft BizTalk Server64.msi’ file:

Orca Features Table for BizTalk Server 2016 MSI

The information in the ‘Feature’ table, along with information gleaned by running the installer, ticking specific single components and then examining the setup log file, can be reorganised to give the following:

Feature AddLocal Command
Portal Components BizTalk, WMI, InfoWorkerApps
      Business Activity Monitoring BAMPortal
Developer Tools and SDK BizTalk, WMI, AdapterImportWizard, BizTalkExplorer, BizTalkExtensions, DeploymentWizard, Designer, Development, Migration, MsEDIMigration, MsEDISchemaExtension, MsEDISDK, OrchestrationDesigner, PipelineDesigner, SDK, TrackingProfileEditor, VSTools, WCFDevTools, XMLTools
Documentation Documentation
Server Runtime BizTalk, WMI, Engine, MOT, MSMQ, Runtime
      BizTalk EDI/AS2 Runtime MsEDIAS2, MsEDIAS2StatusReporting
      Windows Communication Foundation Adapter WCFAdapter
Administration Tools and Monitoring BizTalk, WMI, AdminAndMonitoring, AdminTools, BAMTools, BizTalkAdminSnapIn, HealthActivityClient, MonitoringAndTracking, PAM
      Windows Communication Foundation Administration Tools WcfAdapterAdminTools
Additional Software BizTalk, WMI, AdditionalApps
      Enterprise Single Sign-On Administration Module SSOAdmin
      Enterprise Single Sign-On Master Secret Server SSOServer
      Business Rules Components RulesEngine
      MQSeries Agent MQSeriesAgent
      BAM Alert Provider OLAPNS
      BAM Client FBAMCLIENT
      BAM-Eventing BAMEVENTAPI
      Project Build Component ProjectBuildComponent

Notes:

  • The ‘BizTalk’ and ‘WMI’ features are specified in numerous places in the table above. You only need to specify each of these items once.
  • The parameters are case sensitive. Specifying a parameter incorrectly, e.g. OlapNs rather than OLAPNS will result in a silent installation failure.
  • When adding the parameters to the command line, it is important that there is no space between the items. Including a space (e.g. ‘BizTalk, WMI, AdditionalApps’ rather than ‘BizTalk,WMI,AdditionalApps’) will result in a silent installation failure.
  • One of the features, ‘SDKScenarios’, is never mentioned in the setup log file. It is assumed that this feature is automatically installed if required by the parent feature (SDK), however including it within the AddLocal command line parameter list doesn’t seem to cause any issues.
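Putting the switches and features together, a silent installation command line built from the tables above might look something like the following sketch. The feature selection, installation path, log path and setup.exe location are examples only; remember that the feature list is case sensitive and must not contain spaces:

# Sketch only: install the server runtime, administration tools and Enterprise SSO
# components using features from the table above. Adjust the feature list and paths to suit.
$features = 'BizTalk,WMI,Engine,MOT,MSMQ,Runtime,AdminAndMonitoring,AdminTools,BAMTools,BizTalkAdminSnapIn,HealthActivityClient,MonitoringAndTracking,PAM,AdditionalApps,SSOAdmin,SSOServer,RulesEngine'
$argumentList = "/passive /norestart /ADDLOCAL $features /INSTALLDIR ""C:\Program Files (x86)\Microsoft BizTalk Server 2016"" /L C:\Logs\BizTalk2016Setup.log"
Start-Process -FilePath 'D:\BizTalk Server\setup.exe' -ArgumentList $argumentList -Wait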

Deploying Visual Studio 2017 Using Configuration Manager

Previous versions of Visual Studio were typically delivered via ISO files that we could import into Configuration Manager for deployment to workstations. Visual Studio 2017 arrives as a web installer only (although you can create installation media using the --layout option from the command line if you still want to go down that route).

The command-line parameters of the Visual Studio 2017 installer are also different from those of previous versions, requiring a different approach. See https://docs.microsoft.com/en-us/visualstudio/install/use-command-line-parameters-to-install-visual-studio for information on the available command-line parameters.

In the past I’ve tried using an AdminDeployment.xml file to control which components of Visual Studio are installed. With Visual Studio 2013 this worked fine for me. With Visual Studio 2015 I could not make this approach work at all, and ended up specifying the components to be installed by using the ‘/InstallSelectableItems’ command-line parameter, which worked a treat.

Visual Studio 2017 uses this latter approach to selecting the components that will be installed with the product, but the system has been extended to provide more control over the component installation, with an ‘IncludeRecommended’ and ‘IncludeOptional’ flag available for each component, or globally, as required. A list of the Visual Studio 2017 workload and component IDs can be found at https://docs.microsoft.com/en-us/visualstudio/install/workload-and-component-ids (click through to the product you’re installing; for us this was Visual Studio Enterprise 2017, the workload and component IDs for which are found at https://docs.microsoft.com/en-us/visualstudio/install/workload-component-id-vs-enterprise).

For example, to add the Azure development workload, with all optional and recommended components, you’d add the following to the command-line that you issue to the installer:

--add Microsoft.VisualStudio.Workload.Azure;includeOptional;includeRecommended

As you can see, this means that the command-line has the potential to get long very quickly!

For the workloads and components I was asked to deploy with Visual Studio Enterprise 2017, our command-line became:

mu_visual_studio_enterprise_2017_x86_x64_10049783.exe --add Microsoft.VisualStudio.Workload.Azure;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.Data;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.ManagedDesktop;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.ManagedGame;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.NativeCrossPlat --add Microsoft.VisualStudio.Workload.NativeDesktop --add Microsoft.VisualStudio.Workload.NativeGame --add Microsoft.VisualStudio.Workload.NativeMobile --add Microsoft.VisualStudio.Workload.NetCoreTools;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.NetCrossPlat;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.NetWeb;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.Node;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.Office;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.Universal;includeOptional;includeRecommended --add Microsoft.VisualStudio.Workload.VisualStudioExtension --add Microsoft.VisualStudio.Workload.WebCrossPlat;includeOptional;includeRecommended --add Component.GitHub.VisualStudio --add Microsoft.Component.Blend.SDK.WP --add Microsoft.Component.HelpViewer --add Microsoft.Net.Component.3.5.DeveloperTools --add Microsoft.VisualStudio.Component.LinqToSql --add Microsoft.VisualStudio.Component.TestTools.CodedUITest --add Microsoft.VisualStudio.Component.TestTools.Core --add Microsoft.VisualStudio.Component.TestTools.FeedbackClient --add Microsoft.VisualStudio.Component.TestTools.MicrosoftTestManager --add Microsoft.VisualStudio.Component.TestTools.WebLoadTest --add Microsoft.VisualStudio.Component.TypeScript.2.0 --quiet --norestart --wait

Which, frankly, is huge!

The length of the command-line poses an immediate issue as it’s longer than that allowed in the text box for the installation program for an application. Here’s the approach I took:

  1. Once you’ve determined the workloads and components that are to be installed, create a batch file containing the command line. Prefix the command line with %~dp0 (no backslash or anything; this is to run the command-line from the current directory).
  2. (Optional) Create a batch file to uninstall Visual Studio 2017. My batch file contains the following command:
    %~dp0mu_visual_studio_enterprise_2017_x86_x64_10049783.exe uninstall --quiet --wait
  3. Copy the two batch files created, along with the web installer, to a suitable location on the Configuration Manager server, then configure the application as follows:
    1. Create a new application and select ‘manually specify the application information’.
    2. Specify the name for the application, publisher, version and any other information required by your organisation:
      General Application Settings
    3. Specify the appearance of the application in the Application Catalog. Specify the icon by browsing to the web installer and selecting this. One icon is available:
      Application Icon
    4. On the ‘Deployment Type’ page of the wizard, click ‘Add’ and again specify ‘manually specify the deployment type information’.
    5. Provide a name for the deployment type, e.g. ‘Visual Studio Enterprise 2017’ and any required comments.
    6. Specify the content location. This should be the network path where the web installer and two batch files are located, e.g. ‘\\SCCM\Applications\VisualStudioEnterprise2017’.
    7. For the installation program, specify the name (and extension) of the installation batch file you created earlier.
    8. For the uninstall program, specify the uninstallation batch file you created earlier, or the following command-line if you chose not to create a batch file:
      mu_visual_studio_enterprise_2017_x86_x64_10049783.exe -uninstall --quiet --wait
      Content Location and Programs
    9. Specify the detection method that you want to use. I opted for a simple ‘devenv.exe’ version greater than or equal to ‘15.0.26228.4’, which was the version of the file deployed during testing of the installer (see the sketch after this list):
      App Deployment Detection
    10. Specify the user experience settings. Our installation takes approximately 60 minutes. I chose also to allow the maximum run time to be longer than the default 2 hours.
    11. Specify any requirements for the installation. I didn’t have anything to add here.
    12. Specify any dependencies for the installation. Again I didn’t have anything to add here.
    13. Complete the creation of the application by clicking ‘Next’ at the subsequent screens.
  4. Distribute the content by right-clicking the application and selecting ‘Distribute Content’.
  5. Deploy the application and select appropriate collections to deploy it to.
  6. Test!
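For the detection method mentioned in step 9 above, the rule simply checks the file version of devenv.exe. A rough PowerShell equivalent is sketched below; the installation path and version number are those from my deployment and will likely differ for other editions or later builds:

# Sketch only: produce output (i.e. report 'installed') if devenv.exe is present and is
# at least the version observed during testing. Path and version are example values.
$devenv = 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\devenv.exe'
if (Test-Path $devenv) {
    $versionText = (Get-Item $devenv).VersionInfo.FileVersion
    $fileVersion = [version](($versionText -split '\s+')[0])
    if ($fileVersion -ge [version]'15.0.26228.4') { Write-Output 'Visual Studio 2017 detected' }
}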

Note: Installation takes approximately an hour on our workstations, and fails if any other Visual Studio product is running on the workstation during the installation process.

Offline Domain Join with Direct Access

I was recently in the position that I needed to rebuild a workstation at a remote location, but wanted to end up with it joined to the domain, and able to install software via the SCCM Software Center. Enter Offline Domain Join (djoin.exe)!

Offline Domain Join allows the creation of a machine account and the establishment of a trust relationship between a computer running Windows and a Domain. As part of the process, group policy information can also be transferred to the machine that will be joined to the domain.

Assuming Direct Access is available, the appropriate group policy information for Direct Access can be transferred as part of the process, and this should then allow the remote machine to establish a connection to the domain and from there all remaining group policy information can be transferred, the Configuration Manager client installed etc.

Information on ‘djoin.exe’ including examples for use can be found at https://technet.microsoft.com/en-us/library/offline-domain-join-djoin-step-by-step

My scenario was:

  • The machine account already existed in the correct OU and was a member of the appropriate groups for Direct Access (the machine name had already been used; this was a rebuild) and therefore I needed to use the ‘/reuse’ parameter.
  • The only group policy information I wanted to transfer to the remote machine was for Direct Access. I anticipated that all other group policy information would be transferred automatically once a Direct Access connection had been established.

In my case, the command I used on the provisioning server was:

djoin /provision /domain domain.com /machine MyWorkstation /savefile MyWorkstation-blob.txt /reuse /policynames "Direct Access Client"

The resultant blob should be transferred securely – take note of what the TechNet page says on the matter:

The base64-encoded metadata blob that is created by the provisioning command contains very sensitive data. It should be treated just as securely as a plaintext password. The blob contains the machine account password and other information about the domain, including the domain name, the name of a domain controller, the security ID (SID) of the domain, and so on. If the blob is being transported physically or over the network, care must be taken to transport it securely.

On the remote workstation, the command I used was:

djoin /requestODJ /loadfile MyWorkstation-blob.txt /windowspath %SystemRoot% /localos

At this point you’re prompted to reboot the workstation. Once the reboot was complete, I left the machine for a few minutes to allow it to establish a connection, then signed in. Everything worked as anticipated and I could log in as a domain user and a Direct Access connection was established. Following a group policy update, the Configuration Manager client was transferred and installed, and a short time later the Software Center became available and I could add software made available from SCCM.

DPM Protection for Windows 10 Anniversary Edition

Attempting to add protection to a Windows 10 Anniversary Edition workstation recently failed with the DPM server showing the workstation as ‘unavailable’ when looking at the ‘Production Servers’ list in the console.

It appears that the upgrade to Anniversary Edition removes a file that the DPM agent relies on, ‘sisbkup.dll’, and that as a consequence the services cannot start on the protected workstation.

The resolution is to copy the ‘sisbkup.dll’ file from c:\Windows\System32 on an older version of Windows 10 into C:\Windows\System32 on the Anniversary Update machine and then retry the connection from DPM.
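A minimal sketch of that fix is below; the source machine name is a placeholder for any machine still running a pre-Anniversary build of Windows 10, and the DPM agent service name used (DPMRA) should be confirmed on your own workstation before restarting it:

# Sketch only: restore sisbkup.dll from an older Windows 10 machine, then restart the
# DPM agent service so the workstation can be protected again. Names are assumptions.
Copy-Item -Path '\\OLDER-WIN10-PC\C$\Windows\System32\sisbkup.dll' -Destination 'C:\Windows\System32\sisbkup.dll'
Restart-Service -Name 'DPMRA'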

Web Application Proxy Failure Following Outage

Following a ‘hiccup’ involving a Web Application Proxy (WAP) server, internal services were no longer being published to the outside world.

After some investigation, both the ADFS and WAP services showed as stopped on the server. Attempting to start the ADFS service from the services console produced the following error:

Windows could not start the Active Directory Federation Service service on Local Computer.
Error 1064: An exception occurred in the service when handling the control request.

Under the System section of the Windows Event Log, the following error was shown:

Event ID: 7023
The Active Directory Federation Services service terminated with the following error:
An exception occurred in the service when handling the control request.

Followed a few moments later by the following error:

Event ID: 7023
The Web Application Proxy Service terminated with the following error:
A certificate is required to complete client authentication

Looking in the ‘AD FS’ section of the Event Log (under ‘Applications and Services Logs’), the following errors were shown (note that the first error was generally shown multiple times, followed by a single instance of the second error):

Event ID: 383
The Web request failed because the web.config is malformed.
User Action:
Fix the malformed data in the web.config file.
Exception details:
Root element is missing (C:\Windows\ADFS\Config\microsoft.identityServer.proxyservice.exe.config)
Root element is missing.

Followed by:

Event ID: 199
The federation server proxy could not be started.
Reason: Error retrieving proxy configuration from the Federation Service.
Additional Data
Exception details:
An error occurred when attempting to load the proxy configuration.

Checking the file at C:\Windows\ADFS\Config\microsoft.identityServer.proxyservice.exe.config showed that while the file size was still indicated as 2k, the file was blank.

I’ve seen a number of reports online indicating that WAP seems happy to chew up the contents of this configuration file following an outage, although I can find no information on why this might happen. If you have a backup of the file in question, it should be a simple matter to restore it and restart the ADFS and WAP services to restore service. If you don’t, and have no other example server from which you can pull a similar copy of the file, then the following steps must be taken:

  1. Remove the Web Application Proxy role from the server. Once this is complete, a reboot will be required.
  2. Re-add the Web Application Proxy role to the server.
  3. Once this is complete, initiate the configuration wizard.
  4. Use the same configuration parameters as you used when configuring the service initially, namely federation service name (e.g. federation.domain.com), local admin details for the federation server and the federation certificate (unless you’ve replaced the certificate used, in which case obviously you should use the new certificate details); you noted those down during initial configuration, right?
  5. Once configuration is complete, the Remote Access Management Console should open automatically. All of your publishing rules should still be in place, and your published services should be available immediately.
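Much of the above can also be driven from PowerShell. The sketch below is illustrative only, using the example federation service name and certificate thumbprint from the sample configuration file that follows, and assumes the federation certificate is already installed on the WAP server:

# Sketch only: remove and re-add the Web Application Proxy role, then reconfigure it.
Uninstall-WindowsFeature -Name Web-Application-Proxy -Restart

# After the reboot:
Install-WindowsFeature -Name Web-Application-Proxy -IncludeManagementTools
$cred = Get-Credential   # a local administrator on the federation (ADFS) server
Install-WebApplicationProxy -FederationServiceName 'federation.domain.com' `
    -FederationServiceTrustCredential $cred `
    -CertificateThumbprint '1234567890ABCDEF1234567890ABCDEF12345678'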

For reference, here’s a sample config file, from which you should be able to reconstruct an appropriate file for your service:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <configSections>
    <section name="microsoft.identityServer.proxyservice" type="Microsoft.IdentityServer.Management.Proxy.Configuration.ProxyConfiguration, Microsoft.IdentityServer.Management.Proxy, Version=6.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL" />
  </configSections>

  <microsoft.identityServer.proxyservice>
    <congestionControl latencyThresholdInMSec="8000" minCongestionWindowSize="64"
      enabled="true" connectionTimeoutInSec="60" />
    <connectionPool connectionPoolSize="200" scavengeInterval="5" />
    <diagnostics eventLogLevel="15" />
    <host tlsClientPort="49443" httpPort="80" httpsPort="443" name="federation.domain.com" />
    <proxy address="" />
    <trust thumbprint="1234567890ABCDEF1234567890ABCDEF12345678"
      proxyTrustRenewPeriod="21600" />
  </microsoft.identityServer.proxyservice>
  <!-- <system.serviceModel>
    <diagnostics>
      <messageLogging logEntireMessage="true"
              logMessagesAtServiceLevel="true"
              logMessagesAtTransportLevel="true">
      </messageLogging>
    </diagnostics>
  </system.serviceModel> -->
</configuration>

 
