BM-Bloggers

The blogs of Black Marble staff

Empty groups not being expanded in a combobox for a TFS work item

A common work item type (WIT) edit in TFS is to limit the list of names shown in a combo to the users assigned to the project, i.e. the members of the Team Project's Contributors and Project Administrators groups.

This is done by editing the WIT, either via your favourite XML editor or the Process Template Editor (part of the Power Tools). You edit the AllowedValues for the field you wish to limit, such as Assigned To, as shown below.

[Screenshot: the AllowedValues rule being edited in the Process Template Editor]

This gives the following XML behind the scenes (for those using XML editors):

<ALLOWEDVALUES expanditems="true" filteritems="excludegroups">
  <LISTITEM value="[Project]\Contributors" />
  <LISTITEM value="[Project]\Project Administrators" />
  <LISTITEM value="Unassigned" />
</ALLOWEDVALUES>

Notice that Expand Items and Exclude Groups are checked. This means that the first two lines in the list will be expanded to contain the names in the groups, not the group names themselves.

A small gotcha here is that if either of the groups is empty you do see the group name in the combobox list, even with Exclude Groups checked. Team Explorer does not expand an empty group to a list with no entries; it shows the group name. So you would see something like the following in the combo:

  • [MyProject]\Contributors
  • John
  • Fred
  • Unassigned

where John and Fred are project administrators and the [MyProject]\Contributors group is empty.

This should not be a serious issue; in most cases why would you have a Team Project with no contributors or administrators? However, it is conceivable that with more complex security models you might see this issue. If so, make sure each group in the list has at least one member; then again, if a group has no members do you really need it in the list?

Tempted by the new Kindle?

I am back on the "should I buy a Kindle?" train of thought. Today's announcements are certainly interesting; I am not talking so much about the new Kindle Fire, but about the new entry-level version and the Touch. For me the tempting features are still the E-Ink display and the battery life.

The point is I have got used to reading on my phone; a Kindle might be easier on the eye, but it is more kit to carry, and I just don't think I want to carry any more things.

Can we Build-IT?

Black Marble’s Robert Hogg, Steve Spencer and Rik Hepworth flew all the way to Anaheim in California in order to bring you the highlights and key announcements around Windows 8!

On 12 October, at The Holiday Inn in Yorkshire, we will bring our take on the news for IT Professionals in the morning, and for developers in the afternoon.

Come for the whole day – a great lunch is provided!

“We reimagined Windows,” said Steven Sinofsky, president of the Windows and Windows Live Division at Microsoft, in his keynote address to the thousands of developers in attendance. “From the chipset to the user experience, Windows 8 brings a new range of capabilities without compromise.”

Just posted VirtualPC activity documentation for TFS 2010 Community Build Extensions

I have just posted new VirtualPC activity documentation for the TFS 2010 Community Build Extensions. This has been a really nasty set of documentation to write, as getting this activity running raises a lot of issues over COM security; thanks to Rik and Andy (our SharePoint specialists at Black Marble, who are therefore used to COM problems!) who helped get to the bottom of the issues.

The best thing I can say about this VirtualPC activity (and I wrote much of it) is don’t use it. Much better to use the Hyper-V one, which is far more flexible, allowing control of remotely hosted VMs, or better still use TFS Lab Management.

Upcoming Black Marble event on Windows 8

In case you did not make it to the Microsoft Build Conference, Black Marble are running a pair of free events in Leeds on the 12th of October on Windows 8 and the other announcements made in Anaheim earlier this month.

The morning session is focused on the IT pro side and the afternoon on development, so why not make a day of it?

To get more information, and to register for these free events, have a look at http://www.blackmarble.co.uk/Events

Syncing the build number and assembly version numbers in a TFS build when using the TFSVersion activity

Updated 27 July 2013 - Here is a potential solution

Update 27 Sep 2011 – this seemed such a good idea when I initially tried it out, but after more testing I can see that changing the build number part way through a build causes problems. The key one is that when you queue the next build it is issued the same revision number [the $(Rev:.r) in the BuildNumberFormat] as the just-completed build, so it fails with the error

TF42064: The build number 'BuildCustomisation_20110927.17 (4.5.269.17)' already exists for build definition '\MSF Agile\BuildCustomisation'.

This failed build causes the revision to increment, so the next build is fine, and the cycle continues; in effect only every other build works.

After a bit more thought, the only option I can see to avoid this problem is the one Ewald originally outlined, i.e. a couple of new activities run on the controller prior to everything else: one to get the last assembly version and a second to override the build number generator.

So I have struck out some bits of the post and made it clear where there are issues, but I wanted to leave it available as I think it shows how easy it is to get to a point where you think something is working, only to find there are unexpected problems.

[Original content follows]


I was asked recently if it was possible to make the TFS build number the same format as the assembly version number when you are using the TFSVersion community extension. The answer is yes and no; the issue is that the build drop location in the standard build process template is created before any files are got into the workspace from source control, so at that point you don’t have the new version number, which is a bit of a problem.

A detailed description of one approach to this general problem can be found in Ewald Hofman’s blog on the subject, but to use this technique you end up creating another custom activity to update the build number using your own macro expansion. There are also some major workflow changes needed to make sure directories are created with the correct names.

<Following Does not Work>

The following instructions do not give the desired result, for the reasons outlined at the top of this updated post.

I was looking for a simpler (lazier) solution when using the TFSVersion activity, and the one I came up with was to just update the build number, and to do it after the drop directory had been created. So that I did not end up with two completely different names for the drop folder and the build, I just append the version number to the existing build number. This is done by re-running the Update Build Number activity. I added this new activity to the example workflow logic from the TFSVersion documentation, ending up with something like

[Screenshot: the example workflow with the extra Update Build Number activity added]

where the Update Build Number has the following properties

[Screenshot: the Update Build Number activity’s properties]

So we just append the newly generated version number to the existing build number.

string.Format("{0} ({1})", BuildDetail.BuildNumber , VersionNumber)

Note you can’t use the BuildNumberFormat in this format string, as it contains the $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.r) text that is macro-expanded to generate the build number. The problem is that if this expansion is used in the format string you get the error

The revision number $(Rev:.r) is allowed to occur only at the end of the format string.

So it is easier to use the previously generated build number.
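
For anyone editing the build workflow XAML directly rather than through the designer, the extra step looks roughly like the sketch below. This is only a sketch: the mtbwa namespace prefix (mapping to Microsoft.TeamFoundation.Build.Workflow.Activities in the standard DefaultTemplate) and the exact placement after the drop location has been set are assumptions, so adjust it to suit your own template.

<!-- Sketch only: re-run the Update Build Number activity after the drop folder
     has been created, appending the generated version number to the existing
     build number rather than using the BuildNumberFormat macros -->
<mtbwa:UpdateBuildNumber DisplayName="Append version to build number"
    BuildNumberFormat="[String.Format(&quot;{0} ({1})&quot;, BuildDetail.BuildNumber, VersionNumber)]" />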

This also has the effect that the drop folder retains its original name, but all the reports show the original build number with the generated version number in brackets, and the Open Drop Folder link still works.

[Screenshot: the build report showing the amended build number]

A reasonable compromise I think, with not too much change from a standard template.

Does not work, as only alternate builds work; see the top of the post.

</Following Does not Work>

New Release of the Community TFS 2010 Build Extensions

Mike Fourie has just announced that we’ve shipped the second stable release of the Community TFS 2010 Build Extensions. Well worth a look if you need to customise your TFS 2010 build with any of the following:

  • AssemblyInfo
  • BuildReport
  • BuildWorkspace
  • CodeMetric
  • DateAndTime
  • Email
  • File
  • GetBuildController
  • GetBuildDefinition
  • GetBuildServer
  • GetWebAccessUrl
  • Guid
  • Hello
  • HyperV
  • IIS7
  • ILMerge
  • InvokePowerShellCommand
  • nUnit
  • QueueBuild
  • RoboCop
  • SqlExecute
  • StyleCop
  • TFSVersion
  • VB6
  • VirtualPC
  • VSDevEnv
  • Wmi
  • WorkItemTracking
  • Zip

Session State in Windows Azure

We recently moved a web application that was using session state into Windows Azure. As it was running on a single web server, the session state was set to InProc, but this is no use in a multi-server environment as the session is stored on a specific machine and is therefore not accessible to other machines. There were a number of options:

  1. Use the Windows AppFabric Caching service (http://msdn.microsoft.com/en-us/library/windowsazure/gg278339.aspx)
  2. Use SQL Azure (http://blogs.msdn.com/b/sqlazure/archive/2010/08/04/10046103.aspx)
  3. Use Windows Azure Storage

Windows Azure Storage seemed to be the most cost-effective option, as the site does not currently use SQL Azure and they have already purchased an Azure subscription which includes both transaction and storage costs.

There is a sample ASP.NET session state provider that uses Windows Azure Table Storage as its backing store. The sample can be downloaded from MSDN at

http://code.msdn.microsoft.com/windowsazure/Windows-Azure-ASPNET-03d5dc14

How to use the Azure Storage Session State Provider

Add the following Session State provider config to the web.config file of the project

<!-- SessionState Provider Configuration -->
<sessionState mode="Custom"
              customProvider="TableStorageSessionStateProvider">
  <providers>
    <clear/>
    <add name="TableStorageSessionStateProvider"
         type="Microsoft.Samples.ServiceHosting.AspProviders.TableStorageSessionStateProvider"/>
  </providers>
</sessionState>

Add your Windows Azure storage connection string (DataConnectionString) to each web role that requires session state. (Not setting this will result in an "object reference not set to an instance of an object" exception.)
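
As a rough guide, the configuration involved looks something like the sketch below; the fragments sit inside the relevant role elements, and the account name and key are placeholders for your own storage account details.

<!-- ServiceDefinition.csdef – declare the setting on each web role that needs session state -->
<ConfigurationSettings>
  <Setting name="DataConnectionString" />
</ConfigurationSettings>

<!-- ServiceConfiguration.cscfg – point the setting at your storage account -->
<ConfigurationSettings>
  <Setting name="DataConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkey" />
</ConfigurationSettings>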

Add a reference to the ASPProviders.dll taken from the sample project and make sure that the Copy Local property is set to true. (Not setting this will cause an "unable to load" exception.)

[Screenshot: the ASPProviders.dll reference with Copy Local set to True]

We also added a reference to System.Data.Services.Client and set Copy Local to true on this too (not sure if this is needed).

Once this is set up and running, add multiple instances to your role configuration and run in the debugger. Make sure you can navigate to the page that uses the session data. I put a breakpoint on the action of the page, added a watch on Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.CurrentRoleInstance.Id, and checked whether it changed between requests and, when it did, whether the session data was still visible.

You may well get the following error when using session state, as all the objects put into the Azure Table Storage-backed session need to be serializable.

Unable to serialize the session state. In 'StateServer' and 'SQLServer' mode, ASP.NET will serialize the session state objects, and as a result non-serializable objects or MarshalByRef objects are not permitted. The same restriction applies if similar serialization is done by the custom session state store in 'Custom' mode.

You can see the session data in the Azure Storage Server Explorer.

[Screenshot: the session data shown in the Azure Storage Server Explorer]

We are going to run this for a while to see how well it works, and also to see what debris is left behind in table and blob storage by ended sessions. We might have to add a job that tidies up the expired sessions later.