Yesterday I upgraded our production TFS 2012.2 server to Update 3. All seemed to go OK and it completed with no errors. The process was so much easier now that the update supports SQL 2012 Availability Groups, so there was no need to remove the DBs from the availability group prior to the update.
However, though there were no errors it did report a warning, and on a quick check users could not connect to the upgraded server on our usual HTTPS URL.
On checking the update log I saw:
[Warning@09:06:13.578] TF401145: The Team Foundation Server web application was previously configured with one or more bindings that have ports that are currently unavailable. See the log for detailed information.
[Info @09:06:13.578] +-+-+-+-+-| The following previously configured ports are not currently available... |+-+-+-+-+-
[Info @09:06:13.584] 1 - Protocol : https
[Info @09:06:13.584] - Host : tfs.blackmarble.co.uk
[Info @09:06:13.584] - Port : 443
[Info @09:06:13.584] port: 443
[Info @09:06:13.585] authMode: Windows
[Info @09:06:13.585] authenticationProvider: Ntlm
The issue appears if you use host headers, as we do for our HTTPS bindings. The TFS configuration tool does not understand these, so it sees more than one binding, in our case on 443 (our TFS server VM also hosts a NuGet server on HTTPS 443; we use host headers to separate the traffic). As the tool does not know what to do with host headers, it just deletes the bindings it does not understand.
Anyway, the fix was to manually reconfigure the HTTPS bindings in IIS and all was OK.
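If you prefer the command line to the IIS manager UI, a binding with a host header can be re-added with appcmd; a sketch, assuming the default 'Team Foundation Server' site name and our host name (adjust both to suit your server):

```shell
%windir%\system32\inetsrv\appcmd set site /site.name:"Team Foundation Server" /+bindings.[protocol='https',bindingInformation='*:443:tfs.blackmarble.co.uk']
```

You still need the certificate assigned to the binding as before; this just puts the host-header binding back.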
On checking with Microsoft it seems this is a known issue, and on their radar to sort out in a future release.
After last night's release of new TFS and Visual Studio bits at the Build conference I spent this morning upgrading my demo VMs. First I upgraded to TFS 2012.3, then took a snapshot before going on to 2013 Preview. By switching snapshots I can now demo either version. In both cases the upgrade process was as expected, basically a rerun of the configuration wizard with all the fields bar the password prefilled. Martin Hinshelwood has done a nice post if you want more details on the process.
Looking at the sessions at Build on Channel 9, there are not too many on TFS; to find out more about the new features you are probably better off checking out the TechEd USA or TechEd Europe streams.
Just wasted a bit of time trying to find the build tab on a TFS Team Project hosted on http://tfs.visualstudio.com using a Git repository. I was looking in Team Explorer expecting to see something like
But all I was seeing was the Visual Studio Git Changes option (just the top bit in the left panel above).
It took me ages to realise that the issue was I had cloned the Git repository to my local PC using the Visual Studio Tools for Git. So I was using just the Git tools, not the TFS tools. As far as Visual Studio was concerned this was just some Git repository; it could have been local, GitHub, TFS Service or anything else that hosts Git.
To see the full features of TFS Service you need to connect to the service using Team Explorer (the green bits), not just as a Git client (the red bits).
Of course if you only need Git based source code management tools, just clone the repository and use the Git tooling, whether inside or outside Visual Studio. The Git repository in TFS is just a standard Git repo, so all tools should work. From the server end TFS does not care what client you use; in fact it will still associate your commits, irrespective of client, with TFS work items if you use the #1234 syntax for work item IDs in your comments.
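As a minimal sketch of that comment syntax (a purely local, hypothetical repository; the work item association only happens once the commit reaches a TFS-hosted repository and 1234 is a real work item ID):

```shell
# Create a throwaway local repo to show the commit message convention
git init demo
git -C demo config user.email "demo@example.com"
git -C demo config user.name "Demo"
echo "fix" > demo/fix.txt
git -C demo add fix.txt
# The #1234 in the message is what TFS picks up as a work item ID
git -C demo commit -m "Corrected null check, fixes #1234"
```

Any Git client that can write a commit message can use this, which is the point: the linking is done server side.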
However, if you are using hosted TFS from Visual Studio, it probably makes more sense to use a Team Explorer connection so all the other TFS features light up, such as build. The best bit is that all the Git tools are still there, as Visual Studio knows it is still just a Git repository. Maybe doing this will make things less confusing when I come to try to use a TFS feature!
An interesting change with Lab Management 2012 and SCVMM 2012 is that templates become a lot less useful. In the SCVMM 2008 versions you had a choice when you stored VMs in the SCVMM library. …
- You could store a fully configured VM
- or a generalised template.
When you added the template to a new environment you could enter details such as the machine name, domain to join and product key etc. If you try this with SCVMM 2012 you just see the message ‘These properties cannot be edited from Microsoft Test Manager’
So you are meant to use SCVMM to manage everything about the templates, not great if you want to do everything from MTM. However, is that the only solution?
An alternative is to store a SYSPREP’d VM as a Virtual Machine in the SCVMM library. This VM can be added as many times as is required to an environment (though if added more than once you are asked if you are sure)
This method does however bring problems of its own. When the environment is started, assuming it is network isolated, the second network adaptor is added as expected. However, as there is no agent on the VM it cannot be configured; usually for a template Lab Management would sort all this out, but because the VM is SYSPREP'd it is left sitting at the mini-setup 'Pick your region' screen.
You need to manually configure the VM. The best process I have found is:
- Create the environment with your standard VMs and the SYSPREP'd one
- Boot the environment, the standard ready to use VMs get configured OK
- Manually connect to the SYSPREP’d VM and complete the mini setup. You will now have a PC on a workgroup
- The PC will have two network adapters, neither connected to your corporate network; both are connected to the network isolated virtual LAN. You have a choice:
- Connect the legacy adaptor to your corporate LAN, to get at a network share via SCVMM
- Mount the TFS Test Agent ISO
Either way you need to manually install the Test Agent and run its configuration (just select the defaults; it should know where the test controller is). This will configure the network isolated adaptor onto the 192.168.23.x network. You can then manually join the isolated domain. Reboot the VM (or the environment) and all should be OK.
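The domain join itself can be scripted from an elevated prompt on the VM once the agent has configured the isolated adaptor; a sketch, assuming a hypothetical isolated domain called lab.local (the Test Agent configuration normally sorts out the IP settings, so only the join and reboot are shown):

```shell
netdom join %COMPUTERNAME% /domain:lab.local /userd:lab\administrator /passwordd:*
shutdown /r /t 0
```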
All a bit long winded, but it does mean it is easier to build generalised VMs from MTM without having to play around in SCVMM too much.
I think it would all be a good deal easier if the VM had the agents on it before the SYSPREP. I have not tried this yet, but in my opinion that is true of all VMs used for Lab Management: get the agents on as early as you can, it just speeds everything up.
Martin Hinshelwood did a recent post on moving source code between TFS servers using git tf. He mentioned that you could use the --deep option to get the whole changeset check-in history.
Being fairly new to using Git, in anything other than the simplest scenarios, it took me a while to get the commands right. This is what I used in the end (using the Brian Keller VM for sample data) …
C:\tmp\git> git tf clone http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/Main oldserver --deep
Connecting to TFS...
Cloning $/fabrikamfiber/Main into C:\Tmp\git\oldserver: 100%, done.
Cloned 5 changesets. Cloned last changeset 24 as 8b00d7d
C:\tmp\git> git init newserver
Initialized empty Git repository in C:/tmp/git/newserver/.git/
C:\tmp\git> cd newserver
C:\tmp\git\newserver [master]> git pull ..\oldserver --depth=100000000
remote: Counting objects: 372, done.
remote: Compressing objects: 100% (350/350), done.
96% (358/372), 2.09 MiB | 4.14 MiB/s
Receiving objects: 100% (372/372), 2.19 MiB | 4.14 MiB/s, done.
Resolving deltas: 100% (110/110), done.
* branch HEAD -> FETCH_HEAD
C:\tmp\git\newserver [master]> git tf configure http://vsalm:8080/tfs/fabrikamfibercollection $/fabrikamfiber/NewLocation
C:\tmp\git\newserver [master]> git tf checkin --deep --autosquash
Connecting to TFS...
Checking in to $/fabrikamfiber/NewLocation: 100%, done.
Checked in 5 changesets, HEAD is changeset 30
The key was I had missed the --autosquash option on the final checkin.
Once this was run I could see my check-in history; the process is quick and, once you have the right command line, straightforward. However, just like the TFS Integration Platform, time is compressed (the migrated check-ins are all dated at the time of migration), and unlike the TFS Integration Platform you also lose the ownership of the original edits.
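The huge --depth value is just a way of saying 'not shallow, give me everything'. You can see the effect with a purely local pair of repositories (hypothetical names, standing in for the oldserver/newserver repos above):

```shell
# Build a source repo with five commits, standing in for 'oldserver'
git init oldserver
git -C oldserver config user.email "demo@example.com"
git -C oldserver config user.name "Demo"
for i in 1 2 3 4 5; do
  echo "change $i" > oldserver/file.txt
  git -C oldserver add file.txt
  git -C oldserver commit -m "Changeset $i"
done

# Pull it into an empty repo with a huge depth, as in the transcript above
git init newserver
git -C newserver config user.email "demo@example.com"
git -C newserver config user.name "Demo"
git -C newserver pull ../oldserver --depth=100000000

git -C newserver rev-list --count HEAD   # prints 5 - the full history came across
```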
This all said, another useful tool in the migration arsenal.
I have been doing some editing of the standard Scrum TFS 2012 Sprint Burndown report in SQL 2012 Report Builder. When I ran the report after editing the MDX query in the dsBurndown dataset to return an extra column I got an error:
- on a remote PC it just said there was an error with the dsBurndown dataset
- on the server hosting Reporting Services, or in Report Builder, I got a bit more information: it said the TaskName parameter was not defined.
On checking the state of the dataset parameters before and after my edit I could see that the TaskName parameter had been lost
Manually re-adding it fixed the problem.
Interestingly, which parameters were lost seemed to depend on the MDX query edit I made; I assume something is inferring the parameters from the MDX query.
So certainly one to keep an eye on. I suspect this is a feature of Report Builder; maybe I am better off just using trusty Notepad to edit the .RDL file. Oh how I love to edit XML in Notepad.
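For anyone who does resort to Notepad, the bit that went missing lives in the dataset's query definition in the .RDL; something like this (the element names are standard RDL, but the data source name and values here are illustrative, so check against your own report):

```xml
<DataSet Name="dsBurndown">
  <Query>
    <DataSourceName>TfsOlapReportDS</DataSourceName>
    <QueryParameters>
      <!-- This is the sort of entry that gets dropped; re-adding it fixed the report -->
      <QueryParameter Name="TaskName">
        <Value>=Parameters!TaskName.Value</Value>
      </QueryParameter>
    </QueryParameters>
    <CommandText>...</CommandText>
  </Query>
</DataSet>
```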
Today at TechEd USA Brian Harry announced Visual Studio 2013; have a look at his blog for details of the new ALM features. These include…
- Agile Portfolio Management
- Git source control on premises
- Revised Team Explorer including pop-out windows
- Improvements in code editing and annotation
- Improvements in web based test management
- Team Room – chat like collaboration
- Cloud based web load testing
- The start of adding release management to TFS via the purchase of InRelease
For more info see the various sessions up on Channel 9
Thanks to everyone who attended my session on ‘TFS for Developers’ at the Grey Matter’s ‘Building Applications for the Future’ event today. As you will have noticed my session was basically slide free, so not much to share there.
As I said at the end of my session to find out more have a look at
Also a couple of people asked me about TFS and Eclipse, which I only mentioned briefly at the end. For more on Team Explorer Everywhere, have a look at the video I did last year on that very subject.
A key requirement for any DevOps strategy is the reporting on how your solution is behaving in the wild. PreEmptive Analytics™ for Team Foundation Server (TFS) can provide a great insight in this area, and there is a good chance you are already licensed for it as part of MSDN.
So why not have a look on the UK MSDN site for more details of the free Microsoft-hosted event.
MSDN Webinar: Improve Software Quality, User Experience and Developer Productivity with Real Time Analytics
Tuesday, May 28 2013: 4:00 – 5:00 pm (UK Time)
Also why not sign up for Black Marble’s webinar event in June on DevOps process and tools in the Microsoft space.
If you look at the labels tab in the source control history in Visual Studio 2012 you could be confused by the changeset numbers. How can all the labels added by my different builds, done over many days, be associated with the same changeset?
If you look at the same view in VS 2010 the problem is not so obvious, but that is basically due to the column not being shown.
The answer is that the value shown in the first screenshot is for the root element associated with the label. If you drill into the label you can see all the labelled folders and files with the changeset values they had when the label was created.
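You can get the same per-item detail without the UI using tf.exe; a sketch, with a made-up label name and collection URL:

```shell
tf labels "MyBuild_20130610.2" /format:detailed /collection:http://tfs:8080/tfs/DefaultCollection
```

The detailed format lists each labelled item with its own version, rather than just the label summary.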
So it is just that the initial screen is confusing; drilling in makes it all clearer.