The blogs of Black Marble staff

Declaratively create Composed Looks in SharePoint 2013 with elements.xml

This is really a follow-up to my earlier post about tips with SharePoint publishing customisations. Composed looks have been a part of a couple of projects recently. In the first, a solution for on-premise, we used code in a feature receiver to add a number of items to the Composed Looks list. In the second, for Office 365, a bit of research offered an alternative approach with no code.

What are Composed Looks

A composed look is a collection of master page, colour scheme file, font scheme file and background image. There is a site list called Composed Looks that holds them, and they are shown in the Change the Look page as the thumbnail options you can choose to apply branding in one hit.

In order to get your new composed look working there are a few gotchas you need to know:

  1. When you specify a master page in your composed look, there must be a valid .preview file with the same name. This file defines the thumbnail image – if you look at an existing file (such as seattle.preview or oslo.preview) you will find html and styling rules, along with some clever token replacement that references colours in the colour scheme file.
  2. A composed look must have a master page and colour scheme (.spcolor) file, but font scheme and background image are optional.
  3. When using sites and site collections, files are split between local and root gallery locations:
    1. The Composed Looks list is local to the site – it doesn’t inherit from the parent site.
    2. Master pages go in the site Master Page Gallery.
    3. .spcolor, .spfont and image files go in the site collection theme gallery (_catalogs/theme/15).

If any of the files you specify in your composed look don’t exist (or you get the url wrong), the thumbnail won’t display. If any of the files in your composed look are invalid, the thumbnail won’t display. If your master page exists but has no .preview file, the thumbnail won’t display. Diligence is important!

Adding Composed Looks using Elements.xml

In researching whether this was indeed possible, I came across an article by Tom Daly. All credit should go to him – I’ve simply tidied up a bit around his work. I already knew that it was possible to create lists as part of a feature using only the elements.xml, and to place items in that new list. I hadn’t realised that adding items to an existing list also works.

In Visual Studio 2013 the process is easy – simply add a new item to your project, and in the Add New Item dialog select Office/SharePoint in the left column and Empty Element in the right. Visual Studio will create the new element with an Elements.xml ready and waiting for you.

To create our composed looks we simply edit that elements.xml file.

First we need to reference our list. As per Tom’s post, we need to add a ListInstance element to our file:

<ListInstance FeatureId="{00000000-0000-0000-0000-000000000000}" TemplateType="124" Title="Composed Looks" Url="_catalogs/design" RootWebOnly="FALSE">

That xml points to our existing list, and the url is a relative path so it will reference the list in the current site for our feature, which is what we want.

Now we need to add at least one item. To do that we add Data and Rows elements, with one Row element for each item:

<ListInstance FeatureId="{00000000-0000-0000-0000-000000000000}" TemplateType="124" Title="Composed Looks" Url="_catalogs/design" RootWebOnly="FALSE">
  <Data>
    <Rows>
      <Row>
        ...
      </Row>
    </Rows>
  </Data>
</ListInstance>

Then, inside the Row element, we add the following fields for a single composed look:

          <Field Name="ContentTypeId">0x0060A82B9F5D2F6A44A6A5723277B06731</Field>
          <Field Name="Title">My Composed Look</Field>
          <Field Name="_ModerationStatus">0</Field>
          <Field Name="FSObjType">0</Field>
          <Field Name="Name">My Composed Look</Field>
          <Field Name="MasterPageUrl">~site/_catalogs/masterpage/MyMasterPage.master, ~site/_catalogs/masterpage/MyMasterPage.master</Field>
          <Field Name="ThemeUrl">~sitecollection/_catalogs/theme/15/MyColorTheme.spcolor, ~sitecollection/_catalogs/theme/15/MyColorTheme.spcolor</Field>
          <Field Name="ImageUrl"></Field>
          <Field Name="FontSchemeUrl"></Field>
          <Field Name="DisplayOrder">1</Field>

There are two parts to the url fields – before the comma is the path to the file and after the comma is the description shown in the list dialog. I set both to the same, but the description could be something more meaningful if you like.

Note that the master page url uses ~site in the path, whilst the theme url uses ~sitecollection. Both of these will be replaced by SharePoint with the correct paths for the current site or site collection.

Note also that I have only specified master page and colour theme. The other two are optional, and SharePoint will use the default font scheme and no background image, respectively. The colour theme would appear to be mandatory because it is used in generating the thumbnail image in conjunction with the .preview file.

The DisplayOrder field affects where in the list of thumbnails our composed look appears. The out-of-the-box SharePoint themes start at 10 and the current theme is always 0. If more than one item has the same DisplayOrder they are displayed in the same order as in the composed looks list. Since I want my customisations to appear first I usually stick a value of 1 in there.

I have removed a couple of fields from the list that Tom specified, most notably the ID field, which SharePoint will generate a value for and which (I believe) should be unique, so it is better to let SharePoint deal with that than potentially muck things up ourselves.

Deploying the Composed Look

Once we’ve created our elements.xml, getting the items deployed to our list is easy – simply create a feature and add that element to it. There are a few things I want to mention here:

  1. Tom suggests that the declarative approach does not create items more than once if a feature is reactivated. I have not found this to be the case – deactivate and reactivate the feature and you will end up with duplicate items. Not terrible, but worth knowing.
  2. You need a site level feature to add items to the composed looks list. As some of the things that list item references are at a site collection level, I suggest the following feature and module structure:
    1. Site Collection Feature
      1. Module: Theme files, containing .spcolor, .spfont and background image files. Deploys to _catalogs/Theme/15 folder.
      2. Module: Stylesheets. Deploys to Style Library/Themable folder or a subfolder thereof.
      3. Module: CSS Images. Deploys to Style Library/Themable folder or a subfolder thereof. Separating images referenced by my CSS is a personal preference as I like tidy VS projects!
      4. If you have web parts or search display templates I would put those in the site collection feature as well.
    2. Site Feature
      1. Module: Master pages. Contains .master and associated .preview files. Deploys to _catalogs/masterpage folder.
      2. Module: Page layouts. Contains .aspx layout files. Deploys to _catalogs/masterpage folder.
      3. Module: Composed Looks: Contains the list items in our elements.xml file. Deploys to Composed Looks list.
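Once the WSP containing these features is in the farm, the two features can be activated in order from PowerShell. A minimal sketch, assuming hypothetical feature names and site URLs (your identifiers will differ, and the SharePoint snap-in must be loaded):

```powershell
# Hypothetical feature names and URLs -- substitute your own.
# Requires the SharePoint management shell (Microsoft.SharePoint.PowerShell snap-in).

# Activate the site collection feature first, so the theme files,
# stylesheets and images exist before the composed look references them.
Enable-SPFeature -Identity "MyBranding_SiteCollection" -Url "http://intranet.contoso.com"

# Then activate the site feature that deploys the master pages,
# page layouts and the Composed Looks list items.
Enable-SPFeature -Identity "MyBranding_Site" -Url "http://intranet.contoso.com/sites/team"
```

Activating in this order avoids the broken-thumbnail problem described earlier, where the composed look references files that are not yet in the galleries.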

High CPU utilisation on the data tier after a TFS 2010 to 2013 upgrade

There have been significant changes in the DB schema between TFS 2010 and 2013. This means that as part of an in-place upgrade process a good deal of data needs to be moved around. Some of this is done as part of the actual upgrade process, but to get you up and running quicker, some is done post upgrade using SQL SPROCs. Depending how much data there is to move this can take a while, maybe many hours. This is the cause of the high SQL load.

A key factor in how long this takes is the size of your pre-upgrade tbl_attachmentContent table, which is where, amongst other things, test attachments are stored. So if you have a lot of test attachments it will take a while as these are moved to their new home in tbl_content.

If you want to minimise the time this takes it can be a good idea to remove any unwanted test attachments prior to doing the upgrade. This is done with the Test Attachment Cleaner from the appropriate version of TFS Power Tools for your TFS server. However, beware that if you don’t have a suitably patched SQL server there can be issues with ghost files (see Terje’s post).
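To gauge how long the migration might run, it helps to check how big the table actually is before upgrading. A hedged sketch, assuming a collection database named Tfs_DefaultCollection on the local default instance (names will vary), and that the SqlServer PowerShell module is available:

```powershell
# Hypothetical server/database names -- adjust for your collection.
# Requires the SqlServer (or older SQLPS) module for Invoke-Sqlcmd.
# sp_spaceused reports the reserved and data size of the table, which is
# a rough proxy for how much attachment content has to be moved.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "Tfs_DefaultCollection" `
    -Query "EXEC sp_spaceused 'dbo.tbl_attachmentContent';"
```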

If you cannot patch your SQL to a suitable version to avoid this problem then it is best to clean out old test attachments only after the whole TFS migration has completed i.e. wait until the high SQL CPU utilisation caused by the SPROC based migration has finished. You don’t want to be trying to clean out old test attachments at the same time TFS is trying to migrate them.

My Domain Controller Doesn’t Think It’s a Domain Controller

I’ve been helping our other Tester Tom Barnes on a project he’s been lead tester on for a couple of months, mostly running acceptance tests here and there when I’ve had a spare couple of minutes.

As mentioned in previous posts (and in Richard and Rik’s blogs) we use a lot of SCVMM virtual environments at Black Marble, presented through TFS Lab Management. This project was no different, our test environment consisting of a DC, SQL Server and several SharePoint servers.

So today I thought, while waiting for a big database operation to finish on another project, I’d test another user story for functional completeness. I remoted onto one of the client VMs (which point at the SharePoint web server via hosts file configuration) and resumed my session from the previous day.

None of the websites I was intending to test were working; 404s across the board. My immediate thought was to check the SharePoint server to see whether a deployment had gone amiss. I attempted to remote onto the SharePoint server using the SP Admin account, only to be told my password was incorrect. So I tried the domain admin account and ran into the same problem.

I thought to check the domain controller since I knew we’d been running PowerShell scripts which modify password behaviour in AD; I was hoping someone hadn’t accidentally turned on the password expiry policy.

I couldn’t log in to the DC with the Domain Admin account either. Lab Management then gave me a further worrying bit of information: “Lab cannot determine whether the machine you are trying to connect to is a DC or joined to a domain”.

To quote Scooby Doo “Ruh Oh!”

I logged in using the machine admin account and the problem became obvious straight away; the desktop was quite helpful in informing me that…

The DC was running in Safe-Mode.

For those unaware of what Safe-Mode does: it stops a lot of services and programs from starting up, and in the case of a DC one of these is Active Directory Domain Services (and probably a host more). No AD Domain Services means no authentication, and no authentication means lots of other services/applications which use dedicated service accounts fall flat on their face. Notable examples being:

  • CRM
  • SharePoint
  • SQL
  • TFS

So for all intents and purposes, my DC was not actually behaving like a Domain Controller.

So I restarted it… and it started in Safe Mode again, much to my annoyance. It did this without fail over successive restarts; no option was given on start-up to not boot into Safe Mode, and nothing in the event logs indicated the system had suffered a catastrophic problem on start-up that would make it boot into Safe Mode.

Some quick Google-Fu showed the problem, and more importantly how to fix it.


Something or someone had turned Safe Boot on in System Configuration. Funnily enough, turning this off meant the DC booted normally! You can find System Configuration in Server 2012 by pressing the Windows key and searching for it.
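The same flag can be inspected and cleared from an elevated PowerShell prompt, which is handy if the GUI isn’t to hand. A sketch, assuming the currently booting entry is the one affected (note the braces must be quoted in PowerShell):

```powershell
# Show the current boot entry -- a "safeboot" line means Safe Mode is being forced
bcdedit /enum '{current}'

# Remove the safeboot flag so the next boot is a normal one
bcdedit /deletevalue '{current}' safeboot
```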

Still haven’t found out what turned it on in the first place mind, but I’ll burn that bridge down if I have to cross it again.

Anyhow thanks for reading.


Speaking at NEBytes on February 19th

I’m pleased to have been asked to speak at NEBytes again – a great user group that meets in Newcastle. I’ll be speaking about customising SharePoint 2013 using master pages, themes and search templates, along the same lines as my recent blog post.

It will be an unusual one for me, as I will spend most of the session inside Visual Studio showing how to create and deploy the customisations that can deliver really powerful solutions without needing to resort to writing code (other than for deployment).

The event on the 19th is in partnership with SUGUK and the other session of the night sounds really interesting too: Building Social SharePoint Apps using Yammer.

I’ve said before that I always enjoy visiting NEBytes. If you’re in the Newcastle area and are a developer or IT Pro I strongly recommend you find out more about them and consider attending.

See you there.

A walkthrough of getting Kerberos working with a Webpart inside SharePoint accessing a WCF service

Update 2/4/2014 – Added notes about using service accounts as opposed to machine accounts for the AppPool running the web service

In the past I have posted on how to get Kerberos running for multi-tier applications. As usual, when I had to redeploy the application onto new hardware I found my notes were not as clear as I would have hoped. So here is what is meant to be a walkthrough for getting our application working in our TFS lab environment.

What we are building

Our lab is a four-box system, running in a test domain, proj.local:


  • ProjDC – the domain controller for the proj.local domain
  • ProjIIS75 – a web server hosting our WCF web service
  • ProjSQL2008R2 – the SQL box for the applications in the domain
  • ProjSP2010 –  a SharePoint server

The logical system we are trying to build is a SharePoint site with a webpart that calls a WCF service, which in turn makes calls to a SQL database. We need the identity the user logs into the SharePoint server with to be passed to the WCF service via impersonation.


Though not important to this story, all this was running on a TFS Lab Management infrastructure as a network isolated environment.

Application Deployment

We have to deploy a number of layers for our application



  1. Using an SSDT DACPAC deployment we created a new DB for our application on ProjSQL2008R2
  2. We granted the account (in this case a machine account, proj\ProjIIS75$) owner access to this DB (the WCF service will run as this account)


WCF Service

  1. Using MSDeploy we deploy a new copy of our WCF web site onto ProjIIS75.
  2. We bound this to port 8081
  3. We set the AppPool to run as Network Service (the proj\ProjIIS75$ account we just granted DB access to)
    Updated note: You can use a domain service account here, e.g. proj\appserviceaccount, but if you do, the Kerberos settings should be applied to the service account, not the machine account
  4. We made sure the web site authentication is set to enable anonymous authentication, ASP.NET impersonation and Windows authentication
  5. Set the DB connection string to point to the new DB on ProjSql2008R2, and other server specific AppSettings, in the web.config
  6. Made sure port 8081 was open on the firewall



SharePoint

  1. Add the WSP solution containing our front end to the SharePoint farm (you can use STSADM or PowerShell commands to do this)
  2. Using SharePoint Central Admin we deployed this solution to the web application
  3. Activated the feature on the site the solution has been deployed to.
  4. Create a new web page to host the webpart e.g. http://share2010.proj.local/sitepages/mypage.aspx (Note here the name we use to access this SharePoint site is share2010 not ProjSp2010. This host name is resolved via the DNS on ProjDC of our lab environment. This lab setup has a fully configured SharePoint 2010 with a number of web applications each with their own name and associated service accounts, this is important later on)
  5. We added our webpart to the page and set the webpart properties to
    • The Url for the WCF web service http://ProjIIS75.proj.local:8081/callservice.svc
    • The SPN for the WCF web service http/ProjIIS75.proj.local:8081

Note: we provide the URL and SPN as parameters because we build the WCF connection programmatically within the webpart. This is because it would be awkward to put this information in a web.config file on a multi server SharePoint farm and we don’t want to hard code it.

Our Code

The WCF service is configured via its web.config

      <bindings>
        <wsHttpBinding>
          <binding name="MyBinding">
            <security mode="Message">
              <message clientCredentialType="Windows" negotiateServiceCredential="false" establishSecurityContext="false" />
            </security>
          </binding>
        </wsHttpBinding>
      </bindings>
      <services>
        <service behaviorConfiguration="BlackMarble.Sabs.WcfService.CallsServiceBehavior" name="BlackMarble.Sabs.WcfService.CallsService">
          <endpoint address="" binding="wsHttpBinding" contract="BlackMarble.Sabs.WcfService.ICallsService" bindingConfiguration="MyBinding" />
          <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
        </service>
      </services>
      <behaviors>
        <serviceBehaviors>
          <behavior name="BlackMarble.Sabs.WcfService.CallsServiceBehavior">
            <serviceMetadata httpGetEnabled="true" />
            <serviceDebug includeExceptionDetailInFaults="true" />
            <serviceAuthorization impersonateCallerForAllOperations="true" />
          </behavior>
        </serviceBehaviors>
      </behaviors>

The webpart does the same programmatically

log.Trace(String.Format("Using URL: {0} SPN: {1}", this.callServiceUrl, this.callServiceSpn));

// Build a binding that matches the service's web.config settings
var callServiceBinding = new WSHttpBinding();
callServiceBinding.Security.Mode = SecurityMode.Message;
callServiceBinding.Security.Message.ClientCredentialType = MessageCredentialType.Windows;
callServiceBinding.Security.Message.NegotiateServiceCredential = false;
callServiceBinding.Security.Message.EstablishSecurityContext = false;

// The endpoint identity carries the SPN so the client can request a Kerberos ticket for the service
var ea = new EndpointAddress(new Uri(this.callServiceUrl), EndpointIdentity.CreateSpnIdentity(this.callServiceSpn));
callServiceBinding.MaxReceivedMessageSize = 2000000;
callServiceBinding.ReaderQuotas.MaxArrayLength = 2000000;

this.callServiceClient = new BlackMarble.Sabs.WcfWebParts.CallService.CallsServiceClient(callServiceBinding, ea);
this.callServiceClient.ClientCredentials.Windows.AllowedImpersonationLevel = TokenImpersonationLevel.Impersonation;

Getting the Kerberos bits running

First remember that this is a preconfigured test lab where the whole domain, including the SP2010 instance, is already set up for Kerberos authentication. These notes just detail the bits we need to alter or check.

To make sure our new WCF service works in this environment we needed to do the following. All this editing can be done on the domain controller:

  1. Using ADSIEdit, make sure the computer running the WCF web service, ProjIIS75, has an entry in its servicePrincipalName attribute for the correct protocol and port i.e. HTTP/projiis75.proj.local:8081
    Update note: If using a service account as opposed to the machine account (Network Service), you make the same servicePrincipalName edits but to the service account proj\appserviceaccount.
    You should only add an SPN entry in one place; if you enter it in two, nothing will work. So make sure the SPN is applied to the account the AppPool will run as, whether that be the machine account (if you are using Network Service) or the service account (if a domain account is being used).

  2. Using the Active Directory Users and Computers tool, make sure the computer running the WCF web service, ProjIIS75, is set to allow delegation
    Update note: If using a service account as opposed to the machine account (Network Service), you make the same edits but to the service account proj\appserviceaccount.

  3. Using the Active Directory Users and Computers tool, make sure the service account running the SharePoint web application, in our case proj\sp2010_share, is set to allow Kerberos delegation to the SPN set in step 1, HTTP/projiis75.proj.local:8081. To do this you press the Add button, select the correct server, then pick the SPN from the list.
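If you prefer the command line to ADSIEdit, the SPN registration in step 1 can also be done with the setspn tool from the domain controller. A sketch, assuming the machine account scenario (swap in the service account if you use one):

```powershell
# Register the SPN against the machine account the AppPool runs as
# (-S checks for duplicates before adding, unlike the older -A)
setspn -S HTTP/projiis75.proj.local:8081 'PROJ\ProjIIS75$'

# Search the forest for duplicate SPNs -- a duplicate breaks Kerberos silently
setspn -X
```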


IMPORTANT: Now you would expect that you could just set ‘Trust the user for delegation to any service’; however, we were unable to get this to work. This might just be something we set wrong, but if so I don’t know what it was.

Once this was all set we did an IIS reset on ProjSP2010 and reloaded the SharePoint page and it all leapt into life.

How to try to debug when it does not work

There is no simple answer to how to debug this type of system; if it fails it just seems not to work and you are left scratching your head. The best option is plenty of in-product logging, which I tend to surface using DebugView. WCFStorm can also be useful to check the WCF service is up.
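One concrete check that does help: from the SharePoint box you can request a ticket for the service SPN directly, which proves the AD side of the setup independently of the application code. A sketch using the built-in klist tool, assuming the SPN from earlier:

```powershell
# Purge cached tickets so the next request is fresh
klist purge

# Request a service ticket for the WCF service's SPN; if this fails,
# the problem is in AD (SPN/delegation), not in the application
klist get HTTP/projiis75.proj.local:8081
```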


So I hope I find this post useful when I next need to rebuild this system. Maybe someone else will find it useful too.

Speaking at Gravitas’s Tech Talk #3

I am speaking at Gravitas’s Tech Talk #3 - "TFS & Visual Studio 2013 vs The Rest" on Tuesday, March 4th, about:

"Microsoft's Application Lifecycle product has had a lot of changes in the past couple of years. In this session we will look at how it can be used to provide a complete solution from project inception through development, testing and deployment, for projects using both Microsoft and other vendor technologies."

Hope to see you there.