But it works on my PC!

The random thoughts of Richard Fennell on technology and software development

How long is my TFS 2010 to 2013 upgrade going to take?

Update 27 Jun 2013 See the updated version of this post with more data.

I seem to be involved with a number of TFS 2010 to 2013 upgrades at present. I suppose people are looking at TFS 2013 in the same way as they have historically looked at the first service pack for a product, i.e. the time to upgrade, when most of the main issues have been addressed. That said, TFS 2013 is not TFS 2012 SP1!

A common question is: how long will the process take to upgrade each Team Project Collection (TPC)? The answer is that it depends, which is a good consultant's answer. Factors include the number of work items, the size of the code base, the number of changesets, the volume of test results; the list goes on. The best I have been able to come up with is to record timings from previous upgrades and use that data to make an educated guess.

In an upgrade of a TPC from TFS 2010 to 2013 there are 793 steps to be taken. Not all of these take the same length of time; some are very slow, as can be seen in the chart. I have plotted the points where the upgrade seems to pause the longest. These are mostly towards the start of the process, where I assume the main DB schema changes are being made.

[Chart: time taken to reach the slowest upgrade steps for each server]

To give some more context

  • Client C was a production-quality, multi-tier setup and took about 3 hours to complete.
  • Client L, though with a similar-sized DB to Server A, was much slower to upgrade, at around 9 hours. However, it was on a slower single-tier test VM and also had a lot of historic test data attachments (70%+ of the DB contents).
  • Demo VM was my demo/test TFS 2010 VM; this had 4 TPCs, and the timings are for the largest, at 600 MB. In reality this server had little ‘real’ data. It is also interesting to note that though there were four TPCs, the upgrade did three in parallel and, when the first finished, started the fourth. This is worth remembering if you are planning an upgrade of many TPCs.

Given this chart, if you know how long it takes to get to Step 30 of 793, you can get an idea of which of these lines most closely matches your system.

I will continue to update this post as I get more sample data. I hope it will be of use to others to gauge how long upgrades may take, but remember your mileage may vary.

Fix for intermittent connection problem in lab management – restart the test controller

Just had a problem with a TFS 2012 Lab Management deployment build. It was working this morning, deploying two web sites via MSDeploy and a DB via a DacPac, then running some CodedUI tests. However, when I tried a new deployment this afternoon it kept failing with the error:

The deployment task was aborted because there was a connection failure between the test controller and the test agent.

[Screenshot: the failing deployment build report]

If you watched the build deployment via MTM you could see it start OK, then the agent went offline after a few seconds.

Turns out the solution was the old favourite: a reboot of the test controller. I would like to know why it was giving this intermittent problem though.

Update 14th Jan An alternative solution to rebooting is to add a hosts file entry, on the VM running the test agent, for the IP address of the test controller. It seems the problem is name resolution, but I am not sure why it occurs.
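For illustration, the hosts entry on the test agent VM takes the usual form; the IP address and machine name below are made-up examples, not values from the original setup.

# C:\Windows\System32\drivers\etc\hosts on the test agent VM
# (example values only - substitute the real IP and FQDN of your test controller)
192.168.10.20    mytestcontroller.corp.example.com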

Fix for Media Center library issue after Christmas tree lights incident

Twas the night before Christmas and….

To cut a long story short, the PC that runs my Windows Media Center (MCE) got switched on and off at the wall twice whilst the Christmas tree lights were being put up.

Now the PC is running Windows 8.1 on modern hardware, so it should have been OK, and mostly it was. However I found a problem: MCE was not showing any music, video or pictures in its libraries, but the recorded TV library was fine. I suspected the issue was that my media is on an external USB3 RAID unit, so there was a chance that on one of the unintended reboots the drives had not spun up in time and MCE had ‘forgotten’ about the external drive.

So I tried to re-add the missing libraries via MCE > Tasks > Settings > Media Libraries. The wizard ran OK, allowing me to select the folders on the external disk, but when I got to the end the final dialog closed virtually instantly. I would normally have expected it to count up all the media files as they were found. Also, if I went back into the wizard I could not see the folder I had just added.

A bit of searching on the web told me that MCE shares its libraries with Windows Media Player, and there was a good chance they were corrupted. In fact running the Windows Media Player troubleshooter told me as much. So I deleted the contents of the %LOCALAPPDATA%\Microsoft\Media Player folder as suggested. It had no useful effect on the problem. The only change was that the final dialog in the wizard did now appear to count the media files it found, taking a few minutes before it closed. But the results of the scan were not saved.

So I switched my focus to Windows Media Player (WMP). I quickly saw it was showing the same problems. If I selected WMP > Organise > Manage libraries, no dialog was shown for music, video or pictures. However, the dialog did appear for Recorded TV, which we know was working in MCE.

[Screenshot: WMP Organise > Manage libraries menu]

Also, if I selected WMP > Organise > Options… > Rip Music, there was no rip location set, and you could not set one by pressing the Change button.

[Screenshot: WMP Rip Music options with no rip location set]

The web quickly showed me I was not alone with this problem, as shown in this post and others on the Microsoft forums. It is worth noting that this thread, and the others, do seem to focus on Windows 7 or Vista. Remember I was on a PC that was a new install of Windows 8, in-place upgraded to 8.1 via the Windows Store, but I don’t think that was the issue.

Anyway, I tried everything I could find in the posts:

  • Restarted services
  • Deleted the WMP databases (again)
  • Uninstalled and re-installed WMP via the Windows Control Panel > Install Products > Windows features
  • Checked the permissions on the folder containing the media

Everything seemed to point to a missing folder. The threads talked about WMP being set to use a Rip folder that it could not find. As my data was on an external RAID this seemed reasonable. However, on checking [HKEY_CURRENT_USER\Software\Microsoft\MediaPlayer\Preferences\HME\LastSharedFolders] there were no paths that could not be resolved.

So I decided to have a good look at what was going on under the covers with Sysinternals Procmon, but I could see nothing obvious: no missing folders, no failed registry key reads.

In the end the pointer to the actual fix was on page 8 of the thread, from Tim de Baets. It turns out the issue was with the media libraries in C:\Users\<your username>\AppData\Roaming\Microsoft\Windows\Libraries. If I tried to open any of these in Windows Explorer I got an error dialog of the form “‘Music.library-ms’ is no longer working”. So I deleted the Pictures, Music and Video libraries in C:\Users\<your username>\AppData\Roaming\Microsoft\Windows\Libraries, which was not a problem as they were all empty.

When I reloaded WMP I could now open the WMP > Organise > Manage libraries dialogs and re-add the folders on my RAID disk, and I could also set the Rip folder.

As these settings were shared with MCE my problem was fixed, ready for a Christmas of recording TV, looking at family photos and playing music.

Whether it was the power outages that caused the problem I have my doubts, as power cuts have not been an issue in the past. Maybe it is some strange permissions hangover from the upgrade from Windows 8 to 8.1; I doubt I will ever find out.

Getting the domain\user when using versionControl.GetPermissions() in the TFS API

 

If you are using the TFS API to get a list of users who have rights in a given version control folder, you need to be careful, as you don’t get back the domain\user name you might expect from the GetPermissions(..) call. You actually get the display name. Now that might be fine for you, but I needed the domain\user format as I was trying to populate a people picker control.

The answer is that you need to make a second call, to the TFS IIdentityManagementService, to get the name in the form you want.

This might not be the best code, but it shows the steps required:

private List<string> GetUserWithAccessToFolder(IIdentityManagementService ims, VersionControlServer versionControl, string path)
{
    var users = new List<string>();

    // GetPermissions returns entries keyed by display name, not domain\user
    var perms = versionControl.GetPermissions(new string[] { path }, RecursionType.None);
    foreach (var perm in perms)
    {
        foreach (var entry in perm.Entries)
        {
            // Look the identity up by its display name to get the unique (domain\user) name
            var userIdentity = ims.ReadIdentity(IdentitySearchFactor.DisplayName,
                                                entry.IdentityName,
                                                MembershipQuery.None,
                                                ReadIdentityOptions.IncludeReadFromSource);

            users.Add(userIdentity.UniqueName);
        }
    }

    return users;
}
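For context, this is roughly how the two services might be obtained and the method called. It is just a sketch; the collection URL and folder path below are placeholders, not values from my setup.

using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Framework.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

// Connect to the team project collection (placeholder URL)
var tpc = new TfsTeamProjectCollection(new Uri("http://myserver:8080/tfs/DefaultCollection"));

// The two services used by the method above
var versionControl = tpc.GetService<VersionControlServer>();
var ims = tpc.GetService<IIdentityManagementService>();

// Placeholder version control folder path
var users = GetUserWithAccessToFolder(ims, versionControl, "$/Scrum/Main");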

A hair in the gate

My Arc mouse started behaving strangely today, very jumpy; it felt like the cursor was being pulled left. It turns out the problem was a tiny hair caught in the LED sensor slot.

[Photo: the hair caught in the mouse’s LED sensor slot]

You could see there was a problem as the LED was flashing a lot; it is normally solidly on if you turn the mouse over and look into the slot.

Once I got it out, all was fine again.

Fix for 0xc00d36b4 error when playing MP4 videos on a Surface 2

Whilst in the USA last week I bought a Surface 2 tablet. Upon boot it ran around 20 updates, as you would expect, but unfortunately one of these seemed to remove its ability to play MP4 videos, giving a 0xc00d36b4 error whenever you tried. A bit of a pain, as one of the main reasons I wanted a tablet was for watching training videos and PluralSight on the move.

After some fiddling and hunting on the web I found I was not alone, so I added my voice to the thread, and eventually an answer appeared. It seems the Nvidia audio enhancements were the problem; I guess they got updated within the first wave of updates.

So the fix, according to the thread, is as follows:

  1. Go to the desktop view on your Surface
  2. Tap and hold the volume icon. 
  3. Select Sounds from the pop-up menu - I only had to go this far, as a dialog appeared asking if I wished to disable audio enhancements (maybe it found it was corrupt)
  4. Go to the playback tab
  5. Highlight the speakers option
  6. Select properties
  7. Go to the enhancements tab
  8. Check the "Disable all enhancements" box
  9. Tap OK.

And videos should now play.

Update 2 Dec 2013 It seems you have to make this change for each audio device; this means speakers AND headphones.

Fixing a WCF “authentication schemes configured on the host ('IntegratedWindowsAuthentication') do not allow those configured on the binding 'BasicHttpBinding' ('Anonymous')” error

Whilst testing a WCF web service I got the error:

The authentication schemes configured on the host ('IntegratedWindowsAuthentication') do not allow those configured on the binding 'BasicHttpBinding' ('Anonymous'). Please ensure that the SecurityMode is set to Transport or TransportCredentialOnly. Additionally, this may be resolved by changing the authentication schemes for this application through the IIS management tool, through the ServiceHost.Authentication.AuthenticationSchemes property, in the application configuration file at the <serviceAuthenticationManager> element, by updating the ClientCredentialType property on the binding, or by adjusting the AuthenticationScheme property on the HttpTransportBindingElement.

Now this sort of made sense, as the web service was meant to be secured using Windows Authentication, so the IIS setting was correct: anonymous authentication was off.

[Screenshot: IIS Authentication settings with Anonymous Authentication disabled]

Turns out the issue was, as you might expect, an incorrect web.config entry:

  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="windowsSecured"> <!—this was the problem –>
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Windows" />
          </security>
        </binding>
      </basicHttpBinding>
    </bindings>
    <services>
      <service behaviorConfiguration="CTAppBox.WebService.Service1Behavior" name="CTAppBox.WebService.TfsService">
        <endpoint address="" binding="basicHttpBinding"  contract="CTAppBox.WebService.ITfsService">
          <identity>
            <dns value="localhost"/>
          </identity>
        </endpoint>
        <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange"/>
      </service>
    </services>
    <behaviors>
      <serviceBehaviors>
        <behavior name="CTAppBox.WebService.Service1Behavior">
          <!-- To avoid disclosing metadata information, set the value below to false before deployment -->
          <serviceMetadata httpGetEnabled="true"/>
          <!-- To receive exception details in faults for debugging purposes, set the value below to true.  Set to false before deployment to avoid disclosing exception information -->
          <serviceDebug includeExceptionDetailInFaults="true"/>
        </behavior>
      </serviceBehaviors>
    </behaviors>
  </system.serviceModel>

The problem was that the basicHttpBinding section had a named binding, windowsSecured, and no un-named default. When the service endpoint was bound to basicHttpBinding it did not use the named binding, just the defaults (which are not shown in the config file).

The solution was to remove the name="windowsSecured" attribute; alternatively, we could have referenced the named binding from the service endpoint, as sketched below.
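A minimal sketch of that alternative, assuming the rest of the config above stays the same: the endpoint references the named binding via the bindingConfiguration attribute.

        <endpoint address=""
                  binding="basicHttpBinding"
                  bindingConfiguration="windowsSecured"
                  contract="CTAppBox.WebService.ITfsService">
          <identity>
            <dns value="localhost"/>
          </identity>
        </endpoint>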

When your TFS Lab test agents can’t start, check the DNS

Lab Management has a lot of moving parts, especially if you are using SCVMM-based environments. All the parts have to communicate if the system is to work.

Some of the most common problems I have seen are due to DNS issues. A slowly propagating DNS can cause chaos, as the test controller will not be able to resolve the names of the dynamically registered lab VMs.

The best fix is to sort out your DNS issues, but that is not always possible (some things just take the time they take, especially on large WANs).

An immediate fix is to use the local hosts file on the test controller to define the IP addresses for the lab[guid].corp.domain names created when using network isolation. Once this is done the handshake between the controller and agent is usually possible.
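As a rough illustration, the entries take the usual hosts file form; the IP address and GUID below are invented examples, not values from a real environment.

# C:\Windows\System32\drivers\etc\hosts on the test controller
# (example values only - use the real IP and generated lab name of each agent VM)
192.168.23.15    lab1a2b3c4d-5e6f-7a8b-9c0d-1e2f3a4b5c6d.corp.example.com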

If it isn’t, then you are back to all the usual diagnostics tools.

More on TF215106: Access denied from the TFS API after upgrade from 2012 to 2013

In my previous post I thought I had fixed my problems with TF215106 errors:

"TF215106: Access denied. TYPHOONTFS\\TFSService needs Update build information permissions for build definition ClassLibrary1.Main.Manual in team project Scrum to perform the action. For more information, contact the Team Foundation Server administrator."}

Turns out I had not; actually, I have no idea why it worked for a while! There could well be an API version issue, but I had also missed that I needed to do what the error message said!

If you check MSDN, it tells you how to check the permissions for a given build; on checking, I saw that the Update build information permission was not set for the build in question.

[Screenshot: build definition security settings showing the Update build information permission]

Once I set it for the domain account my service was running as, everything worked as expected.

All I can assume is that there is a change from TFS 2012 to 2013 over defaulting this permission, as I have not needed to set it explicitly in the past.

Changes in SkyDrive access on Windows 8.1

After upgrading to Windows 8.1 on my Media Center PC I have noticed a change in SkyDrive. The ‘upgrade’ process from 8 to 8.1 is really a reinstall of the OS and a reapplication of the Windows 8 applications; some Windows desktop applications are removed. In the case of my Media Center PC the only desktop app installed was the Windows desktop SkyDrive client, which I used to sync photos from my Media PC to the cloud. This is no longer needed, as Windows 8.1 exposes the SkyDrive files linked to the logged-in LiveID as folders under the c:\users\[userid]\documents folder, just like the Windows desktop client used to do.

This means that, though the old desktop SkyDrive client has been removed, my existing timer-based jobs that back up files to the cloud by copying from a RAID5 box to the local SkyDrive folder still work.

A word of warning here though: don’t rely on this model as your only backup. There is a lot of ransomware around at the moment and, if you aren’t careful, an infected PC can infect your automated cloud backup too. Make sure your cloud backup is versioned, so you can roll back to a pre-infected file, and/or that you have a more traditional offline backup too.