When software attacks!

Thoughts and musings on anything that comes to mind

Getting ready for Global Windows Azure Bootcamp 2

It’s a busy week. I’m speaking at the Black Marble-hosted GWAB2 event this Saturday, along with Steve Spencer and Andy Westgarth. Richard and Robert will also be on hand which means between us we should be able to cover questions on much of the newly re-monikered Microsoft Azure.

I’ll be running through IaaS and Azure AD and looking at hybrid cloud solutions from an IT perspective, while Steve and Andy talk through the other platform services from a developer point of view.

Migrating to SCVMM 2012 R2 in a TFS Lab Scenario

Last week I moved our SCVMM from 2012 with service pack 1 to 2012 R2. Whilst the actual process was much simpler than I expected, we had a pretty big constraint imposed upon us by Lab Manager that largely dictated our approach.

Our SCVMM 2012 deployment was running on an ageing Dell server. It had a pair of large hard drives that were software mirrored by the OS, and we were using NIC teaming in Server 2012 to improve network throughput. It wasn’t performing that well, however. Transfers from the VMM library hosted on the server to our VM hosts were limited by the speed of the ageing SATA connectors, and incoming transfers were further slowed by the software mirroring. We also had issues where Lab Manager would time out jobs whilst SCVMM was still diligently working on them.

Our grand plan involves migrating our VM hosts to Server 2012 R2. That will give us faster network transfers of VMs and allow generation 2 VMs on our production servers (also managed by SCVMM). To get there we needed to upgrade SCVMM, and to do that we had to upgrade our Team Foundation Server. Richard did the latter a little while ago, which kicked off the SCVMM upgrade.

Our big problem was that Lab Manager is tied extremely tightly to SCVMM. We discovered just how tightly when we originally moved to SCVMM 2012. If we changed the name of the SCVMM server we would have to disconnect Lab Manager from SCVMM. That would mean throwing away all our environments and imported machines, and I’m not going through the pain of rebuilding all that lot ever again.

I desperately wanted to move SCVMM onto better tin – more RAM, more cores and, importantly, faster disks and hardware mirroring. That led to a migration process that involved the following steps:

  1. Install Server 2012 R2 on our new server. Configure storage to give an OS drive and a data drive for the SCVMM library.
  2. Install the SCVMM pre-requisites on the new server.
  3. Using robocopy, transfer the contents of the SCVMM library to the new server. This needed breaking into blocks as we use data deduplication, and our library share contents are about three times the size of the drive! The robocopy script could be re-run at any point and would transfer only new or changed files (see the sketch after this list).
  4. Uninstall SCVMM 2012 from the old server, making sure to keep the database as we do so.
  5. Change the name of the old server, and its IP address.
  6. Change the name of the new server to that of the old one, and change the IP address.
  7. Install SCVMM 2012 R2 onto the new server.
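For reference, the library transfer boiled down to a robocopy script that could be re-run until everything was across. The sketch below shows the general shape – the server names, paths and switches are illustrative rather than our exact script:

# Copy the SCVMM library across in per-folder blocks; re-running transfers only new or changed files
$source = "\\OLD-SCVMM\MSCVMMLibrary"
$destination = "D:\MSCVMMLibrary"

foreach ($folder in Get-ChildItem -Path $source -Directory)
{
    # /E copies subfolders, /COPY:DAT keeps data, attributes and timestamps, /R and /W keep retries short
    robocopy (Join-Path $source $folder.Name) (Join-Path $destination $folder.Name) /E /COPY:DAT /R:2 /W:5
}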

Almost all of that worked perfectly. When installing SCVMM onto the new server I wanted to use an existing share for the library, sat on drive d: and called MSCVMMLibrary. Setup refused, saying that the server I was installing to already had a share of that name, but on drive c:. Very true – for various reasons the share was indeed on the c: drive, albeit with storage on a separate partition attached with a mount point.

What to do – I couldn’t remove the existing share as I didn’t have SCVMM installed. I didn’t want to roll back either, as the steps were painful enough to deter me. So I looked in the SCVMM database for the share.

Sure enough, there is a table in there that lists the paths for the library shares for each server (tbl_IL_LibraryShare). There was a row with the name of my SCVMM server and a c:\mscvmmlibrary path for the share. I changed the ‘c’ to a ‘d’ and reran setup. It worked like a charm.
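For the record, the change amounted to a one-row update. If you wanted to script it rather than edit the row by hand, something along these lines would do it – although note that the path column name here is an assumption for illustration (tbl_IL_LibraryShare is the real table; check the schema on your own install) and VirtualManagerDB is the default SCVMM database name. Back the database up before touching anything:

# Illustrative sketch only – inspect tbl_IL_LibraryShare first; the SharePath column name is assumed
$query = @"
UPDATE dbo.tbl_IL_LibraryShare
SET SharePath = REPLACE(SharePath, 'c:\MSCVMMLibrary', 'd:\MSCVMMLibrary')
WHERE SharePath LIKE 'c:\MSCVMMLibrary%'
"@

Invoke-Sqlcmd -ServerInstance "MYSQLSERVER" -Database "VirtualManagerDB" -Query $query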

Now, I would not recommend doing what I did, but in the Lab Manager scenario, removing and re-adding that share causes all kinds of trouble as the resources in the library are connected to lab environments. I haven’t had any problems post-upgrade, so it looks like I got away with it. Sadly, this is just another in a long list of issues with the way Lab Manager interacts with SCVMM.

Creating Azure Virtual Networks using Powershell and XML Part 4: Local networks and site-to-site connectivity

This is part 4 of a series of posts building PowerShell functions to create and modify Azure Virtual Networks. Previous posts have covered functions to create virtual networks and then delete them. In this part, I’m going to show you functions that will define local networks and configure site-to-site VPN connectivity between a local network and a virtual network.

Next on my list is to create functions to delete the local networks and remove the site-to-site connections. Then I really must look at functions to edit the configuration.

Adding the functionality for local networks also meant that I had to modify the get-azureNetworkXml function to create the LocalNetworkSites xml node if it does not already exist, ready to hold our local network definitions.

The Functions

get-azureNetworkXml

This is an update to the function shown in part 2.

function get-azureNetworkXml
{
    $currentVNetConfig = get-AzureVNetConfig
    if ($currentVNetConfig -ne $null)
    {
        [xml]$workingVnetConfig = $currentVNetConfig.XMLConfiguration
    } else {
        $workingVnetConfig = new-object xml
    }

    $networkConfiguration = $workingVnetConfig.GetElementsByTagName("NetworkConfiguration")
    if ($networkConfiguration.count -eq 0)
    {
        $newNetworkConfiguration = create-newXmlNode -nodeName "NetworkConfiguration"
        $newNetworkConfiguration.SetAttribute("xmlns:xsd","http://www.w3.org/2001/XMLSchema")
        $newNetworkConfiguration.SetAttribute("xmlns:xsi","http://www.w3.org/2001/XMLSchema-instance")
        $networkConfiguration = $workingVnetConfig.AppendChild($newNetworkConfiguration)
    }

    $virtualNetworkConfiguration = $networkConfiguration.GetElementsByTagName("VirtualNetworkConfiguration")
    if ($virtualNetworkConfiguration.count -eq 0)
    {
        $newVirtualNetworkConfiguration = create-newXmlNode -nodeName "VirtualNetworkConfiguration"
        $virtualNetworkConfiguration = $networkConfiguration.AppendChild($newVirtualNetworkConfiguration)
    }

    $dns = $virtualNetworkConfiguration.GetElementsByTagName("Dns")
    if ($dns.count -eq 0)
    {
        $newDns = create-newXmlNode -nodeName "Dns"
        $dns = $virtualNetworkConfiguration.AppendChild($newDns)
    }

    $localNetworks = $virtualNetworkConfiguration.GetElementsByTagName("LocalNetworkSites")
    if ($localNetworks.count -eq 0)
    {
        $newLocalNetworks = create-newXmlNode -nodeName "LocalNetworkSites"
        $localNetworks = $virtualNetworkConfiguration.AppendChild($newLocalNetworks)
    }

    $virtualNetworkSites = $virtualNetworkConfiguration.GetElementsByTagName("VirtualNetworkSites")
    if ($virtualNetworkSites.count -eq 0)
    {
        $newVirtualNetworkSites = create-newXmlNode -nodeName "VirtualNetworkSites"
        $virtualNetworkSites = $virtualNetworkConfiguration.AppendChild($newVirtualNetworkSites)
    }

    return $workingVnetConfig
}

add-azureVnetLocalNetworkSite

Add-azureVnetLocalNetworkSite takes three parameters: networkName is the name for the new local network, addressPrefix is the address prefix for the local network, and vpnGatewayAddress is the IP address of the local VPN gateway that will establish the VPN tunnel. The function checks that the local network does not already exist and then creates the appropriate XML.

function add-azureVnetLocalNetworkSite
{
    param
    (
        [string]$networkName,
        [string]$addressPrefix,
        [string]$vpnGatewayAddress
    )

    #check if the network already exists
    $siteExists = $workingVnetConfig.GetElementsByTagName("LocalNetworkSite") | where {$_.name -eq $networkName}
    if ($siteExists.Count -ne 0)
    {
        write-Output "Local Network Site $networkName already exists"
        $newNetwork = $null
        return $newNetwork
    }

    #get the parent node
    $workingNode = $workingVnetConfig.GetElementsByTagName("LocalNetworkSites")

    #add the new network node
    $newNetwork = create-newXmlNode -nodeName "LocalNetworkSite"
    $newNetwork.SetAttribute("name",$networkName)
    $network = $workingNode.appendchild($newNetwork)

    #add new address space node
    $newAddressSpace = create-newXmlNode -nodeName "AddressSpace"
    $AddressSpace = $network.appendchild($newAddressSpace)
    $newAddressPrefix = create-newXmlNode -nodeName "AddressPrefix"
    $newAddressPrefix.InnerText = $addressPrefix
    $AddressSpace.appendchild($newAddressPrefix)

    #add the new vpn gateway address
    $newVpnGateway = create-newXmlNode -nodeName "VPNGatewayAddress"
    $newVpnGateway.InnerText = $vpnGatewayAddress
    $network.AppendChild($newVpnGateway)

    #return our new network
    $newNetwork = $network
    return $newNetwork
}

add-azureVnetSiteConnectivity

add-azureVnetSiteConnectivity takes two parameters: networkName is the name of the virtual network and localNetworkName is the name of the local network. It checks to make sure both are defined before creating the appropriate XML to define the connection. In order for the site-to-site VPN configuration to be applied, the virtual network must have a subnet named GatewaySubnet, so the function checks for that too. I already have a function to create subnets (add-azureVnetSubnet from part 2), so I can use that to create the GatewaySubnet. The function also specifies a connection type of IPsec, as no other options are currently available for site-to-site VPN connections.

function add-azureVnetSiteConnectivity
{
    param
    (
        [string]$networkName,
        [string]$localNetworkName
    )

    #get our target network
    $workingNode = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($workingNode.Count -eq 0)
    {
        write-Output "Network $networkName does not exist"
        $newVnetSiteConnectivity = $null
        return $newVnetSiteConnectivity
    }

    #check that the network has a GatewaySubnet
    $subNetExists = $workingNode.GetElementsByTagName("Subnet") | where {$_.name -eq "GatewaySubnet"}
    if ($subNetExists.count -eq 0)
    {
        write-Output "Virtual network $networkName has no Gateway subnet"
        $newVnetSiteConnectivity = $null
        return $newVnetSiteConnectivity
    }

    #check that the local network site exists
    $localNetworkSite = $workingVnetConfig.GetElementsByTagName("LocalNetworkSite") | where {$_.name -eq $localNetworkName}
    if ($localNetworkSite.count -eq 0)
    {
        write-Output "Local Network Site $localNetworkName does not exist"
        $newVnetSiteConnectivity = $null
        return $newVnetSiteConnectivity
    }

    #check if the gateway node exists and if not, create
    $gateway = $workingNode.GetElementsByTagName("Gateway")
    if ($gateway.count -eq 0)
    {
        $newGateway = create-newXmlNode -nodeName "Gateway"
        $gateway = $workingNode.appendchild($newGateway)
    }

    #check if the ConnectionsToLocalNetwork node exists and if not, create
    $connections = $workingNode.GetElementsByTagName("ConnectionsToLocalNetwork")
    if ($connections.count -eq 0)
    {
        $newConnections = create-newXmlNode -nodeName "ConnectionsToLocalNetwork"
        $connections = $gateway.appendchild($newConnections)
    }

    #check to make sure our local site reference doesn't already exist
    $localSiteRefExists = $workingNode.GetElementsByTagName("LocalNetworkSiteRef") | where {$_.name -eq $localNetworkName}
    if ($localSiteRefExists.count -ne 0)
    {
        write-Output "Local Site Ref $localNetworkName already exists"
        $newVnetSiteConnectivity = $null
        return $newVnetSiteConnectivity
    }

    #add the local site ref and its IPsec connection
    $newVnetSiteConnectivity = create-newXmlNode -nodeName "LocalNetworkSiteRef"
    $newVnetSiteConnectivity.SetAttribute("name",$localNetworkName)
    $vNetSiteConnectivity = $connections.appendchild($newVnetSiteConnectivity)
    $newConnection = create-newXmlNode -nodeName "Connection"
    $newConnection.SetAttribute("type","IPsec")
    $vNetSiteConnectivity.appendchild($newConnection)

    #return our new local network site ref
    $newVnetSiteConnectivity = $vNetSiteConnectivity
    return $newVnetSiteConnectivity
}

Using the functions

These functions modify an XML configuration that needs to be held in an object named $workingVnetConfig. Part 2 of this series showed how the functions can be loaded from a PowerShell file and called. Get-azureNetworkXml is required to get the XML configuration object. The functions here can then be used to add local networks and site-to-site connections to that configuration, and save-azureNetworkXml will push the modified configuration back into Azure.
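To make that concrete, a typical sequence looks something like the following. The network, site and address values are placeholders, and it assumes the virtual network was already created as shown in part 2:

. .\Create-AzureNetwork.ps1

$workingVnetConfig = get-azureNetworkXml

# the virtual network must have a GatewaySubnet before site-to-site connectivity can be added
add-azureVnetSubnet -networkName "Mynetwork" -subnetName "GatewaySubnet" -addressPrefix "10.32.0.0/29"

# define the on-premises network and its VPN gateway, then connect it to the virtual network
add-azureVnetLocalNetworkSite -networkName "MyOffice" -addressPrefix "192.168.0.0/24" -vpnGatewayAddress "203.0.113.10"
add-azureVnetSiteConnectivity -networkName "Mynetwork" -localNetworkName "MyOffice"

save-azureNetworkXml($workingVnetConfig)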

Gary Lapointe to the rescue: Using his Office 365 powershell tools to recover from a corrupted masterpage

I also need to give credit to the Office 365 support team over this. They were very quick in their response to my support incident, but I was quicker!

Whilst working on an Office 365 site for a customer today I had a moment of blind panic. The site is using custom branding and I was uploading a new version of the master page to the site when things went badly wrong. The upload appeared to finish OK, but the dialog that was shown post upload was not the usual content type/fill in the fields form, just a plain white box. I left it for a few minutes but nothing changed. Unperturbed, I returned to the master page gallery… Except I couldn’t. All I got was a white page. No errors, nothing. No pages worked at all – no settings pages, no content pages, nothing at all.

After some screaming, I tried SharePoint designer. Unfortunately, this was disabled (it is by default) and I couldn’t reach the settings page to enable it. I logged a support call and then suddenly remembered a recent post from Gary Lapointe about a release of some powershell tools for Office 365.

Those tools saved my life. I connected to the Office 365 system with:

Connect-SPOSite -Credential "<my O365 username>" -url "<my sharepoint online url>"

Success!

First of all I used Set-SPOWeb to set the MasterUrl and CustomMasterUrl properties of the failed site. That allowed me back into the system (phew!):

Set-SPOWeb -Identity "/" -CustomMasterUrl "/_catalogs/masterpage/seattle.master"

Once in, I thought all was well, but I could only access content pages. Every time I tried to access the master page library or one of the site settings pages I got an error, even using Seattle.master.

Fortunately, Gary also has a command that will upload a file to a library, so I attempted to overwrite my corrupted masterpage:

New-SPOFile -List "https://<my sharepoint online>.sharepoint.com/_catalogs/masterpage" -Web "/" -File "<local path to my master page file>" -Overwrite

Once I’d done that, everything snapped back into life.

The moral of the story? Keep calm and always have PowerShell ISE open!

You can download Gary’s tools here and instructions on their use are here.

Big thanks, Gary!

Creating Azure Virtual Networks using Powershell and XML Part 3: Powershell functions for deletion

This is part three of a series of posts about using PowerShell to script the creation, deletion and (hopefully) modification of Azure Virtual Networks. In part 1 I went through the key steps with some rough code. Part 2 showed the much tidier functions I’ve now written to create virtual network elements. In this part I will present functions to remove elements. Hopefully I will manage to get the modification functions working too, which will be a fourth installment!

I’m not going to go through how to use the new functions in this part – I covered that before. I’m simply going to present the new functions that perform the following actions:

  1. Remove an entire virtual network definition.
  2. Remove a DNS definition.
  3. Remove a single subnet from a virtual network.
  4. Remove a DNS registration from a virtual network.

The big thing I learned when writing this code is that if I used the RemoveAll method on an xml node in my configuration xml object, it didn’t actually remove the node itself but only the attributes and child nodes. This left empty elements (such as <VirtualNetworkSite />) that confused Azure. The solution was to call the RemoveChild method on the parent node of the one I wanted rid of, specifying my target node.
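In other words, ask the parent to remove the element rather than asking the element to empty itself. A quick illustration of the difference, assuming $network holds the VirtualNetworkSite element you want rid of:

# RemoveAll strips the attributes and child nodes but leaves an empty <VirtualNetworkSite /> behind
$network.RemoveAll()

# RemoveChild on the parent takes the element out of the document completely, which is what Azure expects
$network.ParentNode.RemoveChild($network) | Out-Null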

The Functions

Delete-azureVnetNetwork

Delete-azureVnetNetwork takes one parameter: networkName. It makes sure the network exists, then removes the appropriate VirtualNetworkSite node and all its children.

function delete-azureVnetNetwork
{
    param
    (
        [string]$networkName
    )

    #check that the network already exists
    $network = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($network.Count -eq 0)
    {
        write-Output "Network $networkName does not exist"
        $removeNetwork = $null
        return $removeNetwork
    }

    #remove the node and children
    $network.ParentNode.RemoveChild($network)

    #return true as we deleted the node
    $removeNetwork = $true
    return $removeNetwork
}

Delete-azureVnetSubnet

Delete-azureVnetSubnet takes two parameters: networkName and subnetName. It checks to make sure both exist, then removes the appropriate Subnet element from the specified network.

function delete-azureVnetSubnet
{
    param
    (
        [string]$networkName,
        [string]$subnetName
    )

    #check that the network exists
    $network = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($network.Count -eq 0)
    {
        write-Output "Network $networkName does not exist"
        $removeSubnet = $null
        return $removeSubnet
    }

    #check to make sure our subnet name exists
    $subNet = $network.GetElementsByTagName("Subnet") | where {$_.name -eq $subnetName}
    if ($subNet.count -eq 0)
    {
        write-Output "Subnet $subnetName does not exist in network"
        $removeSubnet = $null
        return $removeSubnet
    }

    #remove the node and children
    $subNet.ParentNode.RemoveChild($subNet)

    #return true as we deleted the node
    $removeSubnet = $true
    return $removeSubnet
}

Delete-azureVnetDnsRef

Delete-azureVnetDnsRef takes two parameters: networkName and dnsName. It checks to make sure both the network and the DNS reference within it exist, then removes the appropriate DnsServerRef element from the specified network.

function delete-azureVnetDnsRef
{
    param
    (
        [string]$networkName,
        [string]$dnsName
    )

    #check that the network exists
    $network = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($network.Count -eq 0)
    {
        write-Output "Network $networkName does not exist"
        $removeDnsRef = $null
        return $removeDnsRef
    }

    #check that the dns reference is there
    $dnsRef = $network.GetElementsByTagName("DnsServerRef") | where {$_.name -eq $dnsName}
    if ($dnsRef.count -eq 0)
    {
        write-Output "DNS reference $dnsName does not exist"
        $removeDnsRef = $null
        return $removeDnsRef
    }

    #remove the node and children
    $dnsRef.ParentNode.RemoveChild($dnsRef)

    #return true as we deleted the node
    $removeDnsRef = $true
    return $removeDnsRef
}

Delete-azureVnetDns

Delete-azureVnetDns takes one parameter: dnsName. It checks to make sure that the DNS is not referenced by any virtual networks and that the DNS exists, then removes the appropriate DnsServer element.

function delete-azureVnetDns
{
    param
    (
        [string]$dnsName
    )

    #check that the dns isn't referenced in any networks
    $dnsRef = $workingVnetConfig.GetElementsByTagName("DnsServerRef") | where {$_.name -eq $dnsName}
    if ($dnsRef.count -ne 0)
    {
        write-Output "DNS $dnsName is referenced in networks"
        $removeDns = $null
        return $removeDns
    }

    #check that the DNS exists
    $dns = $workingVnetConfig.GetElementsByTagName("DnsServer") | where {$_.name -eq $dnsName}
    if ($dns.Count -eq 0)
    {
        write-Output "DNS Server $dnsName does not exist"
        $removeDns = $null
        return $removeDns
    }

    #remove the node and children
    $dns.ParentNode.RemoveChild($dns)

    #return true as we deleted the node
    $removeDns = $true
    return $removeDns
}

Using the functions

These functions modify an XML configuration that needs to be held in an object called $workingVnetConfig. My previous post showed how they can be loaded from a PowerShell file and called. Get-azureNetworkXml is required to get the XML configuration object. The functions here can then be used to remove items from that configuration, then save-azureNetworkXml will push the modified configuration back into Azure.
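As a quick illustration, tearing down the example network from part 2 might look like the following. The names are placeholders, and note that the DNS reference has to be removed before the DNS server definition can be deleted:

. .\Create-AzureNetwork.ps1

$workingVnetConfig = get-azurenetworkxml

delete-azureVnetSubnet -networkName "Mynetwork" -subnetName "subnet-1"
delete-azureVnetDnsRef -networkName "Mynetwork" -dnsName "test1"
delete-azureVnetDns -dnsName "test1"
delete-azureVnetNetwork -networkName "Mynetwork"

save-azurenetworkxml($workingVnetConfig)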

Creating Azure Virtual Networks using Powershell and XML Part 2: Powershell functions

In my previous post I talked about what was involved in creating an Azure network configuration using Powershell. In this post I’ll cover where I’ve got so far, which is a series of functions that do the following:

  1. Contact Azure and get the current network configuration. Convert that to sensible XML and if it’s empty, create the basic structure.
  2. Create a new virtual network, checking to see if one with the same name already exists.
  3. Add a subnet to a virtual network, checking to see that one with the same name or address prefix doesn’t already exist.
  4. Add a DNS reference to a virtual network, making sure the DNS is defined first.
  5. Create a DNS.
  6. Put the configuration back into Azure to be applied.

Still on my to-do list are removing networks and other elements, and modifying existing networks.

The Function Code

The end result so far is a powershell script that can be loaded to give a number of new functions:

Get-azureNetworkXml

get-azureNetworkXml runs the get-AzureVNetConfig command. It takes the XMLConfiguration from that command and puts it into a new XML object. If there is no configuration, it creates a new xml object. It then checks to see if the main XML elements are present and, if not, creates them.

Whilst this function returns an object, I need to make sure (right now) that the variable name I use for that is $workingVnetConfig, as other functions reference it. I’m not currently passing the XML object into each function. I probably should, but that tidying comes later.

function get-azureNetworkXml
{
    $currentVNetConfig = get-AzureVNetConfig
    if ($currentVNetConfig -ne $null)
    {
        [xml]$workingVnetConfig = $currentVNetConfig.XMLConfiguration
    } else {
        $workingVnetConfig = new-object xml
    }

    $networkConfiguration = $workingVnetConfig.GetElementsByTagName("NetworkConfiguration")
    if ($networkConfiguration.count -eq 0)
    {
        $newNetworkConfiguration = create-newXmlNode -nodeName "NetworkConfiguration"
        $newNetworkConfiguration.SetAttribute("xmlns:xsd","http://www.w3.org/2001/XMLSchema")
        $newNetworkConfiguration.SetAttribute("xmlns:xsi","http://www.w3.org/2001/XMLSchema-instance")
        $networkConfiguration = $workingVnetConfig.AppendChild($newNetworkConfiguration)
    }

    $virtualNetworkConfiguration = $networkConfiguration.GetElementsByTagName("VirtualNetworkConfiguration")
    if ($virtualNetworkConfiguration.count -eq 0)
    {
        $newVirtualNetworkConfiguration = create-newXmlNode -nodeName "VirtualNetworkConfiguration"
        $virtualNetworkConfiguration = $networkConfiguration.AppendChild($newVirtualNetworkConfiguration)
    }

    $dns = $virtualNetworkConfiguration.GetElementsByTagName("Dns")
    if ($dns.count -eq 0)
    {
        $newDns = create-newXmlNode -nodeName "Dns"
        $dns = $virtualNetworkConfiguration.AppendChild($newDns)
    }

    $virtualNetworkSites = $virtualNetworkConfiguration.GetElementsByTagName("VirtualNetworkSites")
    if ($virtualNetworkSites.count -eq 0)
    {
        $newVirtualNetworkSites = create-newXmlNode -nodeName "VirtualNetworkSites"
        $virtualNetworkSites = $virtualNetworkConfiguration.AppendChild($newVirtualNetworkSites)
    }

    return $workingVnetConfig
}

Save-azureNetworkXml

Save-azureNetworkXml gets passed our XML object, writes it out to a file in the temp dir and then calls set-AzureVNetConfig to load the file and send it to Azure.

function save-azureNetworkXml($workingVnetConfig)
{
    #write the configuration out to a temporary file, since set-AzureVNetConfig needs a file on disk
    $tempFileName = $env:TEMP + "\azurevnetconfig.netcfg"
    $workingVnetConfig.save($tempFileName)

    #open the file in notepad for a quick visual check before it is applied
    notepad $tempFileName

    #push the configuration back into Azure
    set-AzureVNetConfig -configurationpath $tempFileName
}

Add-azureVnetNetwork

Add-azureVnetNetwork is called with three parameters: networkName, affinityGroup and addressPrefix. It will add a new VirtualNetworkSite element, with the name and affinity group as attributes. It checks to make sure the affinity group exists first. It then creates the address prefix within the network.

function add-azureVnetNetwork
{
    param
    (
        [string]$networkName,
        [string]$affinityGroup,
        [string]$addressPrefix
    )

    #check if the network already exists
    $networkExists = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($networkExists.Count -ne 0)
    {
        write-Output "Network $networkName already exists"
        $newNetwork = $null
        return $newNetwork
    }

    #check that the target affinity group exists
    $affinityGroupExists = get-AzureAffinityGroup | where {$_.name -eq $affinityGroup}
    if ($affinityGroupExists -eq $null)
    {
        write-Output "Affinity group $affinityGroup does not exist"
        $newNetwork = $null
        return $newNetwork
    }

    #get the parent node
    $workingNode = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSites")

    #add the new network node
    $newNetwork = create-newXmlNode -nodeName "VirtualNetworkSite"
    $newNetwork.SetAttribute("name",$networkName)
    $newNetwork.SetAttribute("AffinityGroup",$affinityGroup)
    $network = $workingNode.appendchild($newNetwork)

    #add new address space node
    $newAddressSpace = create-newXmlNode -nodeName "AddressSpace"
    $AddressSpace = $network.appendchild($newAddressSpace)
    $newAddressPrefix = create-newXmlNode -nodeName "AddressPrefix"
    $newAddressPrefix.InnerText = $addressPrefix
    $AddressSpace.appendchild($newAddressPrefix)

    #return our new network
    $newNetwork = $network
    return $newNetwork
}

Add-azureVnetSubnet

Add-azureVnetSubnet takes three parameters: networkName, subnetName and addressPrefix. It makes sure the network exists, that the subnet doesn’t, and that the address prefix is not already used in the same network. It then adds the subnet to the network.

function add-azureVnetSubnet
{
    param
    (
        [string]$networkName,
        [string]$subnetName,
        [string]$addressPrefix
    )

    #get our target network
    $workingNode = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($workingNode.Count -eq 0)
    {
        write-Output "Network $networkName does not exist"
        $newSubnet = $null
        return $newSubnet
    }

    #check if the subnets node exists and if not, create
    $subnets = $workingNode.GetElementsByTagName("Subnets")
    if ($subnets.count -eq 0)
    {
        $newSubnets = create-newXmlNode -nodeName "Subnets"
        $subnets = $workingNode.appendchild($newSubnets)
    }

    #check to make sure our subnet name doesn't exist and/or prefix isn't already there
    $subNetExists = $workingNode.GetElementsByTagName("Subnet") | where {$_.name -eq $subnetName}
    if ($subNetExists.count -ne 0)
    {
        write-Output "Subnet $subnetName already exists"
        $newSubnet = $null
        return $newSubnet
    }
    $subNetExists = $workingNode.GetElementsByTagName("Subnet") | where {$_.AddressPrefix -eq $addressPrefix}
    if ($subNetExists.count -ne 0)
    {
        write-Output "Address prefix $addressPrefix is already in use in network $networkName"
        $newSubnet = $null
        return $newSubnet
    }

    #add the subnet
    $newSubnet = create-newXmlNode -nodeName "Subnet"
    $newSubnet.SetAttribute("name",$subnetName)
    $subnet = $subnets.appendchild($newSubnet)
    $newAddressPrefix = create-newXmlNode -nodeName "AddressPrefix"
    $newAddressPrefix.InnerText = $addressPrefix
    $subnet.appendchild($newAddressPrefix)

    #return our new subnet
    $newSubnet = $subnet
    return $newSubnet
}

Add-azureVnetDns

Add-azureVnetDns takes two parameters: dnsName and dnsAddress. It checks that a DNS server with that name is not already defined, then creates a new DnsServer element for it.

function add-azureVnetDns
{
    param
    (
        [string]$dnsName,
        [string]$dnsAddress
    )

    #check that the DNS does not exist
    $dnsExists = $workingVnetConfig.GetElementsByTagName("DnsServer") | where {$_.name -eq $dnsName}
    if ($dnsExists.Count -ne 0)
    {
        write-Output "DNS Server $dnsName already exists"
        $newDns = $null
        return $newDns
    }

    #get our working node of Dns
    $workingNode = $workingVnetConfig.GetElementsByTagName("Dns")

    #check if the DnsServers node exists and if not, create
    $dnsServers = $workingNode.GetElementsByTagName("DnsServers")
    if ($dnsServers.count -eq 0)
    {
        $newDnsServers = create-newXmlNode -nodeName "DnsServers"
        $dnsServers = $workingNode.appendchild($newDnsServers)
    }

    #add the new dns server
    $newDnsServer = create-newXmlNode -nodeName "DnsServer"
    $newDnsServer.SetAttribute("name",$dnsName)
    $newDnsServer.SetAttribute("IPAddress",$dnsAddress)
    $newDns = $dnsServers.appendchild($newDnsServer)

    #return our new dns server
    return $newDns
}

Add-azureVnetDnsRef

Add-azureVnetDnsRef takes two parameters: networkName and dnsName. It makes sure the network exists and that the DNS exists before adding a DnsServerRef element for the DNS to the network.

function add-azureVnetDnsRef
{
    param
    (
        [string]$networkName,
        [string]$dnsName
    )

    #get our target network
    $workingNode = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSite") | where {$_.name -eq $networkName}
    if ($workingNode.count -eq 0)
    {
        write-Output "Network $networkName does not exist"
        $newDnsRef = $null
        return $newDnsRef
    }

    #check if the DnsServersRef node exists and if not, create
    $dnsServersRef = $workingNode.GetElementsByTagName("DnsServersRef")
    if ($dnsServersRef.count -eq 0)
    {
        $newDnsServersRef = create-newXmlNode -nodeName "DnsServersRef"
        $dnsServersRef = $workingNode.appendchild($newDnsServersRef)
    }

    #check that the DNS we want to reference is defined already
    $dnsExists = $workingVnetConfig.GetElementsByTagName("DnsServer") | where {$_.name -eq $dnsName}
    if ($dnsExists.Count -eq 0)
    {
        write-Output "DNS Server $dnsName does not exist so cannot be referenced"
        $newDnsRef = $null
        return $newDnsRef
    }

    #check that the dns reference isn't already there
    $dnsRefExists = $workingNode.GetElementsByTagName("DnsServerRef") | where {$_.name -eq $dnsName}
    if ($dnsRefExists.count -ne 0)
    {
        write-Output "DNS reference $dnsName already exists"
        $newDnsRef = $null
        return $newDnsRef
    }

    #add new dns reference
    $newDnsServerRef = create-newXmlNode -nodeName "DnsServerRef"
    $newDnsServerRef.SetAttribute("name",$dnsName)
    $newDnsRef = $dnsServersRef.appendchild($newDnsServerRef)

    #return our new dnsRef
    return $newDnsRef
}

Create-newXmlNode

Create-newXmlNode is called by all the other functions. It creates a new node in the XML object then hands it back to the calling function for modification and appending it to the relevant parent node.

function create-newXmlNode
{
    param
    (
        [string]$nodeName
    )

    #create the element in the Azure network configuration namespace to avoid a spurious xmlns attribute
    $newNode = $workingVnetConfig.CreateElement($nodeName,"http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration")
    return $newNode
}

Using the functions

Assuming all our functions are in a PowerShell file called Create-AzureNetwork.ps1, using the new functions is pretty straightforward. We load the functions from the file (note that there is a space between the first . and the .\ in the first line). We can then call the functions.

Note that I use a variable called $workingVnetConfig here – that’s important. We don’t pass the XML object into each function; instead, because the variable is defined here in the calling scope, PowerShell makes it available to the functions when they are called.

. .\Create-AzureNetwork.ps1

$workingVnetConfig = get-azurenetworkxml

add-azureVnetNetwork -networkName "Mynetwork" -affinityGroup "MyAzureAffinity" -addressPrefix "10.0.0.0/8" 
add-azureVnetSubnet -networkName "Mynetwork" -subnetName "subnet-1" -addressPrefix "10.0.0.0/11" 
add-azureVNetDns -dnsName "test1" -dnsAddress "10.0.0.1" 
add-azureVnetDnsRef -networkName "Mynetwork" -dnsName "test1"

save-azurenetworkxml($workingVnetConfig)

It’s not the most elegant of code, I’ll admit, but it does what it says on the tin. All I have to do now is add functions to remove items from our network configuration, then add functions to modify existing items.

Creating Azure Virtual Networks using Powershell and XML

I’ll be honest, I expected this task to be easier than it is. What I’m working on is some powershell that we might use as part of automated build processes that will create a new Virtual Network in an Azure subscription. What I’m after is to add a new network to the existing configuration.

There aren’t many powershell commands for Azure virtual networks. The two we need to use are get-azureVnetConfig and set-azureVnetConfig.

When run, Get-azureVnetConfig returns XML that details the configuration of all virtual networks within the current Azure subscription. Set-azureVnetConfig takes an XML configuration and modifies the entire virtual networking configuration to match that described in the file.

My original plan of simple powershell to add a new virtual network went quickly out of the window, then. My second thought was to grab the xml configuration, manipulate it using powershell, then stuff it back into Azure. That plan was hindered by the fact that the set-azureVnetConfig command insists on reading the configuration from a file on disk, so I can’t just hand it my XML object, created by manipulating the output of get-azureVnetConfig.

I’m still working on this – I now have a script with some tidy functions to do repetitive tasks. This post is simply going to outline the first bit of heavy lifting I’ve had to do in order to solve enough problems that I can get a config, add stuff to it and reload it into Azure.

The steps below don’t create all the configuration we will eventually want, but they create all the configuration we need to add a new network.

1. Get the current Azure Config

This bit is easy:

$currentVNetConfig = get-AzureVNetConfig

That gives us an object which contains the XML configuration. We need to get just the XML out, so:

[xml]$workingVnetConfig = $currentVNetConfig.XMLConfiguration

2. Find the VirtualNetworkSites element

The networks I want to create are all held in the VirtualNetworkSites element, each one in a VirtualNetworkSite element. I can create new VirtualNetworkSite elements, but I need to grab the element in which to create them first:

$virtNetCfg = $workingVnetConfig.GetElementsByTagName("VirtualNetworkSites")

3. Add a new Virtual Network

To add a new network we need to add a new VirtualNetworkSite element. I hit a snag with this, in that I kept getting a spurious xmlns attribute on the element that caused set-azureVnetConfig to spit out the file as invalid. It turns out that in order to avoid this, we have to specify the XML namespace URI when we create the new element. That’s the second parameter on the CreateElement method, below.

Creating the element itself is a two-stage process: First we create a new element inside our XML object, then we put that element in the right place by calling appendchild on the intended parent element. In addition, we need to add a couple of attributes to that element, specifying the name of the network and the affinity group it will sit in:

$newNetwork = $workingVnetConfig.CreateElement("VirtualNetworkSite","http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration") 
$newNetwork.SetAttribute("name","myVirtualNetwork") 
$newNetwork.SetAttribute("AffinityGroup","MyAffinityGroup") 
$Network = $virtNetCfg.appendchild($newNetwork)

4. Add an address space

This is a similar process. I need an AddressSpace element and within that sits an AddressPrefix element. That element needs text that tells Azure the IP address space to use, and that’s added by setting the innerText property.

$newAddressSpace = $workingVnetConfig.CreateElement("AddressSpace","http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration") 
$AddressSpace = $Network.appendchild($newAddressSpace) 
$newAddressPrefix = $workingVnetConfig.CreateElement("AddressPrefix","http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration") 
$newAddressPrefix.InnerText="10.0.0.0/8" 
$AddressSpace.appendchild($newAddressPrefix)

5. Add a subnet

Virtual networks need subnets. There is a Subnets element that contains multiple Subnet elements, each of which has an AddressPrefix element.

$newSubnets = $workingVnetConfig.CreateElement("Subnets","http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration") 
$Subnets = $Network.appendchild($newSubnets) 
$newSubnet = $workingVnetConfig.CreateElement("Subnet","http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration") 
$newSubnet.SetAttribute("name","Subnet-1") 
$Subnet = $Subnets.appendchild($newSubnet) 
$newAddressPrefix = $workingVnetConfig.CreateElement("AddressPrefix","http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration") 
$newAddressPrefix.InnerText="10.0.0.0/11" 
$Subnet.appendchild($newAddressPrefix)

6. Write out to a file and then use that file

$tempFileName = $env:TEMP + "\azurevnetconfig.netcfg" 
$workingVnetConfig.save($tempFileName) 

set-AzureVNetConfig -configurationpath $tempFileName

Next Steps

If you don’t have any Azure networks defined then get-azureVnetConfig will give you nothing. That means that more XML needs to be generated for a new network configuration. I’m working on a more expansive script right now and I’ll post that when I get something meaningful to show.
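If you do hit an empty configuration, bootstrapping it is just a case of building the skeleton elements yourself before applying the steps above. A bare-bones sketch, using the same namespace URI as earlier and adding the xsd/xsi attributes a full configuration carries:

$ns = "http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration"
$workingVnetConfig = New-Object xml

# build the minimal element skeleton a network configuration needs
$networkConfiguration = $workingVnetConfig.AppendChild($workingVnetConfig.CreateElement("NetworkConfiguration",$ns))
$networkConfiguration.SetAttribute("xmlns:xsd","http://www.w3.org/2001/XMLSchema")
$networkConfiguration.SetAttribute("xmlns:xsi","http://www.w3.org/2001/XMLSchema-instance")
$virtualNetworkConfiguration = $networkConfiguration.AppendChild($workingVnetConfig.CreateElement("VirtualNetworkConfiguration",$ns))
$virtualNetworkSites = $virtualNetworkConfiguration.AppendChild($workingVnetConfig.CreateElement("VirtualNetworkSites",$ns))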

Declaratively create Composed Looks in SharePoint 2013 with elements.xml

This is really a follow-up to my earlier post about tips with SharePoint publishing customisations. Composed looks have been a part of a couple of projects recently. In the first, an on-premises solution, we used code in a feature receiver to add a number of items to the Composed Looks list. In the second, for Office 365, a bit of research offered an alternative approach with no code.

What are Composed Looks

A composed look is a collection of master page, colour scheme file, font scheme file and background image. There is a site list called Composed Looks that holds them, and they are shown in the Change the Look page as the thumbnail options you can choose to apply branding in one hit.

In order to get your new composed look working there are a few gotchas you need to know:

  1. When you specify a master page in your composed look, there must be a valid .preview file with the same name. This file defines the thumbnail image – if you look at an existing file (such as seattle.preview or oslo.preview) you will find html and styling rules, along with some clever token replacement that references colours in the colour scheme file.
  2. A composed look must have a master page and colour scheme (.spcolor) file, but font scheme and background image are optional.
  3. When using sites and site collections, files are split between local and root gallery locations:
    1. The Composed Looks list is local to the site – it doesn’t inherit from the parent site.
    2. Master pages go in the site Master Page Gallery.
    3. Spcolor, spfont and image files go in the site collection theme gallery.

If any of the files you specify in your composed look don’t exist (or you get the url wrong), the thumbnail won’t display. If any of the files in your composed look are invalid, the thumbnail won’t display. If your master page exists but has no .preview file, the thumbnail won’t display. Diligence is important!

Adding Composed Looks using Elements.xml

In researching whether this was indeed possible, I came across an article by Tom Daly. All credit should go to him – I’ve simply tidied up a bit around his work. I already knew that it was possible to create lists as part of a feature using only the elements.xml, and to place items in that new list. I hadn’t realised that adding items to an existing list also works.

In Visual Studio 2013 the process is easy – simply add a new item to your project, and in the Add New Item dialog select Office/SharePoint in the left column and Empty Element in the right. Visual Studio will create the new element with an Elements.xml ready and waiting for you.

To create our composed looks we simply edit that elements.xml file.

First we need to reference our list. As per Tom’s post, we need to add a ListInstance element to our file:

<ListInstance FeatureId="{00000000-0000-0000-0000-000000000000}" TemplateType="124" Title="Composed Looks" Url="_catalogs/design" RootWebOnly="FALSE">
</ListInstance>

That xml points to our existing list, and the url is a relative path so will reference the list in the current site for our feature, which is what we want.

Now we need to add at least one item. To do that we need to add Data and Rows elements to the ListInstance, with one Row element per item:

<ListInstance FeatureId="{00000000-0000-0000-0000-000000000000}" TemplateType="124" Title="Composed Looks" Url="_catalogs/design" RootWebOnly="FALSE">
          <Data>
                    <Rows>
                    </Rows>
          </Data>
</ListInstance>

Then we add the following code for a single composed look:

<Row>
          <Field Name="ContentTypeId">0x0060A82B9F5D2F6A44A6A5723277B06731</Field>
          <Field Name="Title">My Composed Look</Field>
          <Field Name="_ModerationStatus">0</Field>
          <Field Name="FSObjType">0</Field>
          <Field Name="Name">My Composed Look</Field>
          <Field Name="MasterPageUrl">~site/_catalogs/masterpage/MyMasterPage.master, ~site/_catalogs/masterpage/MyMasterPage.master</Field>
          <Field Name="ThemeUrl">~sitecollection/_catalogs/theme/15/MyColorTheme.spcolor, ~sitecollection/_catalogs/theme/15/MyColorTheme.spcolor</Field>
          <Field Name="ImageUrl"></Field>
          <Field Name="FontSchemeUrl"></Field>
          <Field Name="DisplayOrder">1</Field>
</Row>

There are two parts to the url fields – before the comma is the path to the file and after the comma is the description shown in the list dialog. I set both to the same, but the description could be something more meaningful if you like.

Note that the master page url uses ~site in the path, whilst the theme url uses ~sitecollection. Both of these will be replaced by SharePoint with the correct paths for the current site or site collection.

Note also that I have only specified master page and colour theme. The other two are optional, and SharePoint will use the default font scheme and no background image, respectively. The colour theme would appear to be mandatory because it is used in generating the thumbnail image in conjunction with the .preview file.

The DisplayOrder field affects where in the list of thumbnails our composed look appears. The out-of-the-box SharePoint themes start at 10 and the current theme is always 0. If more than one item has the same DisplayOrder they are displayed in the same order as in the composed looks list. Since I want my customisations to appear first I usually stick a value of 1 in there.

I have removed a couple of fields from the list that Tom specified, most notably the ID field, which SharePoint will generate a value for and (I believe) should be unique, so better to let it deal with that than potentially muck things up ourselves.

Deploying the Composed Look

Once we’ve created our elements.xml, getting the items deployed to our list is easy – simply create a feature and add that module to it. There are a few things I want to mention here:

  1. Tom suggests that the declarative approach does not create items more than once if a feature is reactivated. I have not found this to be the case – deactivate and reactivate the feature and you will end up with duplicate items. Not terrible, but worth knowing.
  2. You need a site level feature to add items to the composed looks list. As some of the things that list item references are at a site collection level, I suggest the following feature and module structure:
    1. Site Collection Feature
      1. Module: Theme files, containing .spcolor, .spfont and background image files. Deploys to _catalogs/Theme/15 folder.
      2. Module: Stylesheets. Deploys to Style Library/Themable folder or a subfolder thereof.
      3. Module: CSS Images. Deploys to Style Library/Themable folder or a subfolder thereof. Separating images referenced by my CSS is a personal preference as I like tidy VS projects!
      4. If you have web parts or search display templates I would put those in the site collection feature as well.
    2. Site Feature
      1. Module: Master pages. Contains .master and associated .preview files. Deploys to _catalogs/masterpage folder.
      2. Module: Page layouts. Contains .aspx layout files. Deploys to _catalogs/masterpage folder.
      3. Module: Composed Looks: Contains the list items in our elements.xml file. Deploys to Composed Looks list.

Speaking at NEBytes on February 19th

I’m pleased to have been asked to speak at NEBytes again – a great user group that meets in Newcastle. I’ll be speaking about customising SharePoint 2013 using master pages, themes and search templates, along the same lines as my recent blog post.

It will be an unusual one for me, as I will spend most of the session inside Visual Studio showing how to create and deploy the customisations that can deliver really powerful solutions without needing to resort to writing code (other than for deployment).

The event on the 19th is in partnership with SUGUK and the other session of the night sounds really interesting too: Building social sharepoint apps using Yammer.

I’ve said before that I always enjoy visiting NEBytes. If you’re in the Newcastle area and are a developer or IT Pro I strongly recommend you find out more about them and consider attending.

See you there.

Using the Dell Venue 8 Pro Stylus

You will recall from my earlier post how much I like my Dell Venue 8 Pro and how disappointed I was that the stylus was on back-order until March.

Imagine my surprise, then, when a package arrived at the beginning of this week with a shiny new stylus in it!

[Photo: handwritten notes in OneNote on the Venue 8 Pro, taken with the new stylus]

As you can see from the picture, it works just great with OneNote (and its desktop big brother).

The only niggle I feel obliged to point out right at the start is that the stylus requires a battery, which is an extremely obscure AAAA type. I can pick them up on Amazon, certainly, but I’ve never seen them anywhere else! I shall be ordering a pack ASAP as I have no idea yet how long I can expect the battery to last.

The stylus itself is comfortable to hold, perhaps actually helped by the battery, which sits at the nib end. There is a button on the stylus that allows for left- and right-button clicks. Pressure sensitivity works well, although without the variation of the Wacom stylus that both my Surface Pro and X220T have. Palm rejection also works well enough for me to comfortably rest my hand on the tablet whilst writing.

I have been testing the tablet for taking handwritten notes in OneNote and then converting them to text in OneNote desktop, and it works better than I’d hoped. Ink-to-text is almost totally accurate, providing I remember to write in cursive rather than my usual block-capital scrawl. I really do believe that this is the purpose 8 inch Windows tablets fit best, and nothing else really comes close to giving me a seamless workflow from note to document, coupled with the light weight, small size and flexibility to run desktop apps if I need to.

In addition to OneNote I have played with a marvellous app called Drawboard which allows you to create and annotate PDF files. It’s a really great Windows Store app that does what it sets out to do really well. Between those two I can both create content and review other people’s content very easily.

I said in my earlier post that there is nothing currently available that offers the functionality of the Venue 8 Pro and using it with a stylus really underlines that for me. I would look at the Asus VivoTab Note 8 as a possible alternative, but for serious business users I would not consider any competition that did not offer a ‘proper’ active stylus rather than the soft and saggy capacitive ones.

My only conundrum now is whether or not to get the folio case…