While setting up a Release Management 2015.1 server we came across a strange problem. The installation appeared to go OK. We were able to install the server and, from the client, create a simple vNext release pipeline and run it. However, the release stalled on the ‘Upload Components’ step.
Looking in the event log of the VM running the Release Management server we could see many, many errors, all complaining about invalid XML, all in the general form
Message: Object reference not set to an instance of an object.: \r\n\r\n at Microsoft.TeamFoundation.Release.Data.Model.SystemSettings.LoadXml(Int32 id)
Note: The assembly it was complaining about varied, but all were Release Management Deployer related.
We tried a reinstall on a new server VM, but got the same results.
It turned out the issue was due to the service account the Release Management server was running as; this was the only thing common between the two server VM instances. We swapped to ‘Network Service’ and everything leapt into life. All we could assume was that some group policy or similar setting on the original service account was placing a restriction on assembly or assembly config file loading.
You can use the filePath type for an input in a vNext VSTS/TFS task, e.g.
"label": "Settings File",
"helpMarkDown": "Path to single settings files to use (as opposed to files in project folders)",
to present a file picker dialog in the build editor that allows whoever is editing the build to pick a file or folder in the build’s source repository.
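For context, a complete input definition of this type in a task.json might look something like the following (the name and default value here are illustrative, not taken from a specific task):

```json
{
  "name": "settingsFile",
  "type": "filePath",
  "label": "Settings File",
  "defaultValue": "",
  "required": false,
  "helpMarkDown": "Path to single settings files to use (as opposed to files in project folders)"
}
```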
While doing some task development recently I found that this control did not behave as I had expected:
- If a value is explicitly set then the full local path to the selected file or folder (on the build agent) is returned e.g. c:\agent\_work\3\s\yourfolder\yourfile.txt – just as expected
- If you do not set a value, or set a value then remove your setting when you edit a build, then you don’t get an empty string, as I had expected. You get the path to the BUILD_SOURCESDIRECTORY e.g. c:\agent\_work\3\s – makes sense when you think about it.
So if, as in my case, you want specific behaviour only when this value is set to something other than the repo root, you need to add some guard code
if ($settingsFile -eq $Env:BUILD_SOURCESDIRECTORY)
{
    $settingsFile = ""
}
Once I did this my task behaved as I needed, only running the code when the user had set an explicit value for the settings file.
I have previously posted on how a PowerShell script can be used to run StyleCop as part of a vNext VSTS/TFS build. Now I have more experience with vNext tasks it seemed a good time to convert this PowerShell script into a true task that deploys StyleCop and makes it far easier to expose the various parameters StyleCop allows.
To this end I have written a new StyleCop task that can be found in my vNext Build Repo; it has been built to use the 18.104.22.168 release of StyleCop (so you don’t need to install StyleCop on the build machine, which means it works well on VSTS).
To use this task:
- Clone the repo
- Build the tasks using Gulp
- Upload the task you require to your VSTS or TFS instance
Once this is done you can add the task to your build. You probably won’t need to set any parameters, as long as you have settings.stylecop files defining your StyleCop rulesets in the same folders as your .CSPROJ files (or are happy with the default rulesets).
If you do want to set parameters your options are:
- TreatStyleCopViolationsErrorsAsWarnings - Treat StyleCop violation errors as warnings; if set to False any StyleCop violations will cause the build to fail (default false).
And on the advanced panel
- MaximumViolationCount - Maximum violations before analysis stops (default 1000)
- ShowOutput - Sets the flag so StyleCop scanner outputs progress to the console (default false)
- CacheResults - Cache analysis results for reuse (default false)
- ForceFullAnalysis - Force complete re-analysis (default true)
- AdditionalAddInPath - Path to a folder containing any custom rule sets; the directory cannot be a subdirectory of the current directory at runtime, as that is automatically scanned. This folder must contain your custom DLL plus StyleCop.dll and StyleCop.CSharp.dll, else you will get load errors
- SettingsFile - Path to single settings files to use for all analysis (as opposed to settings.stylecop files in project folders)
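As a reference point, a minimal settings.stylecop file that disables a single rule looks something like this (the rule chosen is just an example; any rule name from the installed analyzers can be used):

```xml
<StyleCopSettings Version="105">
  <Analyzers>
    <Analyzer AnalyzerId="StyleCop.CSharp.DocumentationRules">
      <Rules>
        <Rule Name="ElementsMustBeDocumented">
          <RuleSettings>
            <BooleanProperty Name="Enabled">False</BooleanProperty>
          </RuleSettings>
        </Rule>
      </Rules>
    </Analyzer>
  </Analyzers>
</StyleCopSettings>
```

In practice these files are usually created and edited with the StyleCop settings editor rather than by hand.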
When you run the build with the new task you should expect to see a summary of the StyleCop run on the right
Update 6 Feb 2016 - I have made some major changes to this task to expose more parameters, have a look at this post that details the newer version
Today a good way to pull together all your measures of code quality is to run SonarQube within your automated build; in a .NET world this can show changes in quality over time for tools such as FxCop (Code Analysis) and StyleCop. However, sometimes you might just want to run one of these tools alone as part of your automated build. For Code Analysis this is easy; it is built into Visual Studio, so you just set it as a property on the project. For StyleCop it is a bit more awkward, as StyleCop was not designed to be run from the command line.
To get around this limitation I wrote a command line wrapper that could be used within a build process, see my blog post for details of how this could be used with vNext build.
Well, that was all the best part of a year ago. Now I have more experience with vNext build it seems wrong to use just a PowerShell script when I could create a build task that also deploys StyleCop. I have eventually got around to writing the task, which you can find in my vNextBuild repo.
Once the task is uploaded to your TFS or VSTS instance, the StyleCop task can be added into any build process. The task picks up the file locations from the build environment variables and then hunts for StyleCop settings files (as detailed in my previous post). The only argument that needs to be set is whether the build should fail if there are violations
Once this is all set up the build can be run and the violations will be shown in the build report; whether the build fails or passes is down to how you set the flag for the handling of violations
If you want to run CodeUI tests as part of a build you need to make sure the device running the tests has access to the UI; for remote VMs this means having a logged-in session open and the build/test agent running interactively. The problem is what happens when you disconnect the session. Unless you manage it you will get the error
Automation engine is unable to playback the test because it is not able to interact with the desktop. This could happen if the computer running the test is locked or it’s remote session window is minimized
In the past I would use a standard TFS Lab Management environment to manage this; you just check a box to say the VM/PC is running coded UI tests and it sorts out the rest. However, with the advent of vNext build and the move away from Lab Manager this seems overly complex.
It is not a perfect solution but this works
- Make sure the VM auto-logs in and starts your build/test agents in interactive mode (I used Sysinternals Autologon to set this up)
- I connect to the session and make sure all is OK, but I then disconnect by redirecting the session to the console
- To get my session ID, at the command prompt, I use the command query user
- I then redirect the session with tscon.exe RDP-Tcp#99 /dest:console, where RDP-Tcp#99 is my session ID
- Once I was disconnected my CodeUI tests still ran
I am sure there is a slicker way to do this, but it does fix the immediate issue
This bit of PowerShell code could be put in a shortcut on the desktop to do the job; you will want to run the script as administrator
# Capture the output of query user as a single string
$OutputVariable = (query user) | Out-String
# Pull out the session ID (e.g. rdp-tcp#99) from the output
$session = $OutputVariable.Substring($OutputVariable.IndexOf("rdp-tcp#")).Split(" ")[0]
# Redirect the session to the console so the desktop stays available to the tests
& tscon.exe $session /dest:console
We are currently working on updating a Windows 8 application to be a Windows 10 Universal application. This has caused a few problems on a TFS vNext automated build box. The revised solution builds fine on the developer’s box, and fine on the build VM if opened in Visual Studio, but fails if built via the vNext build CI MSBuild process, showing loads of missing references.
It turned out the issue was due to NuGet versions.
The problem was that as part of the upgrade the solution had gained some new projects. These used the new project.json file to manage their NuGet references, as opposed to the old packages.config file. Visual Studio 2015 handles these OK, hence the build always working in the IDE, but you need NuGet.exe 3.0 or later to handle the new format. The version of NuGet installed as part of my vNext build agent was 2.8, so no wonder it had a problem.
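For illustration, a UWP project’s project.json has this general shape (the package version shown is hypothetical; check the version your project actually references):

```json
{
  "dependencies": {
    "Microsoft.NETCore.UniversalWindowsPlatform": "5.0.0"
  },
  "frameworks": {
    "uap10.0": {}
  },
  "runtimes": {
    "win10-x86": {},
    "win10-x64": {}
  }
}
```

It is this dependencies/frameworks structure, rather than the flat packages.config list, that older NuGet.exe versions cannot restore.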
To test my assumptions I added a Nuget Installer task to my build and set an explicit path to the newest version of Nuget.
Once this was done my build was fine.
So my solution options are
- Don’t use the agents shipped with TFS 2015; get newer ones from VSTS, as these have a current version of NuGet (just make sure your agent/server combination is supported – I have had issues with the latest VSTS agent and a TFS 2015 RTM instance)
- Manually replace the version of NuGet.exe in my build agent tools folder – easy to forget you did it, but it works
- Place a copy of the version of NuGet.exe I want on each build VM and reference its path explicitly (as I did to diagnose the problem)
The first option is the best choice, as it is always a good plan to keep build agents up to date.
Vijay in the Microsoft Release Management product group has provided a nice post on the various methods you can use to migrate from the earlier versions of Release management to the new one in Visual Studio Team Services.
An area where you will find the biggest change in technology is when moving from on premises agent based releases to the new VSTS PowerShell based system. To help in this process the ALM Rangers have produced a command line tool to migrate assets from RM server to VSTS, it exports all your activities as PowerShell scripts that are easy to re-use in a vNext Release Management process, or in Visual Studio Team Services’ Release tooling.
So if you are using agent-based releases, why not have a look at the tools and see how easy they make it to migrate your process to the newer tooling.
Many of the major new features of VSTS are delivered by the new marketplace and extension model, such as code search and package management. However, did you realise that this new way of adding functionality to VSTS is open to you too, not just to Microsoft?
To see what can be done why not have a look at the Visual Studio Team Services Extensions from the ALM Rangers
SonarQube released version 5.2 a couple of weeks ago. This enabled some new features that really help if you are working with MSBuild, or just on a Windows platform in general. These are detailed in the posts
The new ability to manage users with LDAP is good, but one of the most important changes for me is the way 5.2 eases the configuration of SQL in integrated security mode. This is mentioned in the upgrade notes; basically it boils down to the fact you get better JDBC drivers, with better support for SQL clustering and security.
We found the upgrade process mostly straightforward
- Downloaded SonarQube 5.2 and unzipped it
- Replaced the files on our SonarQube server
- Edited the sonar.properties file with the correct SQL connection details for an integrated security SQL connection. As we wanted to move to integrated security we did not need to set the sonar.jdbc.username setting.
Important: one thing was not that clear – if you want to use integrated security you do need the sqljdbc_auth.dll file in a folder on the search path (C:\windows\system32 is an obvious place to keep it). You can find this file on MSDN
- Once the server was restarted we browsed to http://localhost:9000/setup and it upgraded our DBs
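The sonar.properties change for integrated security amounts to something like the following (the server and database names are placeholders; adjust for your environment):

```properties
# JDBC URL for SQL Server using Windows integrated security.
# With integratedSecurity=true there is no need to set
# sonar.jdbc.username or sonar.jdbc.password.
sonar.jdbc.url=jdbc:sqlserver://localhost;databaseName=sonar;integratedSecurity=true
```

Remember this form of URL also relies on sqljdbc_auth.dll being on the search path, as noted above.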
And that was it for the upgrade. We could then use the standard SonarQube update features to upgrade our plug-ins and to add new ones like the LDAP plug-in.
Once the LDAP plug-in was in place (and the server restarted) we were automatically logged into SonarQube with our Windows AD accounts, so that was easy.
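For reference, enabling the LDAP plug-in against Active Directory needs little more than this in sonar.properties (the setting name comes from the LDAP plug-in; against AD most of the other ldap.* settings can usually be left to auto-detection):

```properties
# Delegate authentication to the LDAP plug-in (Active Directory in our case)
sonar.security.realm=LDAP
```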
However we hit a problem with the new SonarQube 5.2 architecture and LDAP. The issue was that with 5.2 there is now no requirement for the upgraded 1.0.2 SonarQube MSBuild runner to talk directly to the SonarQube DB; all communication is via the SonarQube server. Obviously the user account that makes the call to the SonarQube server needs to be granted suitable rights; the point we tripped up on was ‘who is the runner running as?’ I had assumed it ran as the build agent account, but this was not the case. As the connection to SonarQube is a TFS managed service, it has its own security credentials. Prior to 5.2 these credentials (other than the SonarQube server URL) had not mattered, as the SonarQube runner made its own direct connection to the SonarQube DB. Post 5.2, with no DB connection and LDAP in use, these service credentials become important. Once we had set them correctly, to a user with suitable rights, we were able to do new SonarQube analysis runs.
One other item of note: the change in architecture in 5.2 means that more work is done on the server, as opposed to the client runner. The net effect is a short delay between runs completing and the results appearing on the dashboard. Once you expect it, it is not an issue, but it was a worry the first time.
Announced at Connect() today were a couple of new tools that could really help a team with their DevOps issues when working with VSTS and Azure (and potentially other scenarios too).
- DevTest Labs is a new set of tooling within the Azure portal that allows the easy creation and management of test VMs, as well as providing a means to control how many VMs team members can create, thus controlling cost. Have a look at Chuck’s post on getting started with DevTest Labs
- To aid general deployment, have a look at the new Release tooling now in public preview. This is based on the same agents as the vNext build system and can provide a great way to formalise your deployment process. Have a look at Vijay’s post on getting started with the new Release tools