Automating adding issues to Beta GitHub Projects using GitHub Actions

The new GitHub Issues Beta is a big step forward in project management over what was previously possible with the old ‘simple’ form of Issues. The Beta adds many great features such as:

  • Project Boards/Lists
  • Actionable Tasks
  • Custom Fields including Iterations
  • Automation

However, one thing that is not available out of the box is a means to automatically add newly created issues to a project.

Looking at the automations available within a project you might initially think that there is a workflow to do this job, but no.

The ‘item added to project’ workflow triggers when the issue, or PR, is added to the project not when it is created. Now, this might change when custom workflows are available in the future, but not at present.

However, all is not lost. We can use GitHub Actions to do the job. In fact, the beta documentation even gives a sample to do just this job. But, I hit a problem.

The sample shows adding PRs to a project on their creation, but it assumes you are using GitHub Enterprise, as it makes use of the ‘organization’ object to find the target project.

The problem is the ‘organization’ object was not available to me as I was using a GitHub Pro account (but it would be the same for anyone using a free account).

So below is a reworked sample that adds issues to a project when no organization is available, instead making use of the ‘user’ object, which also exposes the projectNext method.
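In outline, such a workflow looks something like the sketch below. The PROJECT_PAT secret name, user login and project number are placeholders you would swap for your own values, and the projectNext/addProjectNextItem names come from the beta GraphQL API, so they may change as the beta evolves:

```yaml
name: Add new issues to project
on:
  issues:
    types: [opened]

jobs:
  add-to-project:
    runs-on: ubuntu-latest
    steps:
      - name: Add issue to beta project
        env:
          GITHUB_TOKEN: ${{ secrets.PROJECT_PAT }}  # PAT with project access; secret name is a placeholder
          USER: my-github-login                     # placeholder login
          PROJECT_NUMBER: 1                         # placeholder project number
        run: |
          # Find the project ID via the 'user' object instead of 'organization'
          PROJECT_ID=$(gh api graphql -f query='
            query($user: String!, $number: Int!) {
              user(login: $user) {
                projectNext(number: $number) { id }
              }
            }' -f user="$USER" -F number=$PROJECT_NUMBER \
            --jq '.data.user.projectNext.id')

          # Add the newly created issue to the project
          gh api graphql -f query='
            mutation($project: ID!, $content: ID!) {
              addProjectNextItem(input: {projectId: $project, contentId: $content}) {
                projectNextItem { id }
              }
            }' -f project="$PROJECT_ID" -f content="${{ github.event.issue.node_id }}"
```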

Making SonarQube Quality Checks a required PR check on Azure DevOps

This is another of those posts to remind me in the future. I searched the documentation for this answer for ages and found nothing, eventually getting the solution by asking on the SonarQube Forum.

When you link SonarQube into an Azure DevOps pipeline that is used for branch protection, the success, or failure, of the PR branch analysis is shown as an optional PR check.

The question was ‘how do I make it a required check?’. Turns out the answer is to add an extra Azure DevOps branch policy status check for the ‘SonarQube/quality gate’.

When you press the + (add) button, the ‘SonarQube/quality gate’ turns out to be available in the drop-down.

Once this change is made, the SonarQube Quality Check becomes a required PR check.

My cancer story – thus far

This is a somewhat different post to my usual technical ones…

In December 2017 I had major surgery. This was to remove an adrenal cortical carcinoma (ACC) that had grown on one of my adrenal glands and then up my inferior vena cava (IVC) into my heart.

Early on I decided, though not hiding the fact I was ill, to not live every detail on social media. So, it is only now that I am back to a reasonable level of health and with some distance that I feel I can write about my experiences. I hope they might give people some hope that there can be a good outcome when there is a cancer diagnosis.

I had known I was ill for a good while before I was diagnosed in May 2017. I had seen my Parkrun times slowing week on week to the point where I could not run at all, and I had also had a couple of failed blood donations due to low haemoglobin levels.

It was clear I was unwell, and getting worse, but there was no obvious root cause. All sorts of things had been considered, from heart to thyroid. Cancer was suspected, but a tumour could not be found. Try as they might, my GP could not find a test that showed anything other than that my blood numbers were not right. I just continued to get weaker; by that spring I was unable to walk more than a few hundred metres without getting out of breath, with my heart beating at well over 170 BPM.

The problem was that ACC is a rare form of cancer, and mine had presented in a hard-to-find way. There are two basic forms of ACC. One shuts down your adrenal system, and you notice this very quickly. The other form shows no symptoms until the tumour starts to physically impact something. This was the form I had. In my case, the tumour was increasingly blocking blood flow in my IVC and heart.

In the end, the tumour was found because of a lower abdominal ultrasound. By the time I had the ultrasound scan it was about the only diagnostic that had not been tried. It was a strange mixture of shock and relief to be immediately told after the scan by the sonographer that ‘the doctor would like a word before you go home’. So, at least I knew the cause of why I felt so ill. I left the hospital that day with a diagnosis of an adrenal tumour that was most likely benign but may be malignant, on blood thinning injections and with a whole set of appointments to find out just how bad it was.

At this point the NHS did what it does best, react to a crisis. Over the next couple of weeks, I seemed to live at the regional cancer centre at St James Hospital in Leeds having all sorts of tests.

My health, and the time I was spending at the hospital, meant there was no way I could continue to work. I was lucky I was able to transition quickly onto long-term sick leave in such a way that meant I did not have the financial worries many cancer patients have to contend with on top of their illness. I would not be seeing work again for over 9 months.

The next phase of diagnostic tests was wide-ranging. Plenty of blood was taken, I had to collect my urine for 48 hours, and there were CT scans and PET scans, all to get a clearer idea of how bad it was. The real clincher test as to whether the tumour was benign or malignant was a biopsy. One of those strangely pain-free tests, due to the local anaesthetics, but accompanied by much poking, pushing and strange crunching noises. Then a six-hour wait flat on my back on a recovery ward before I could sit up, let alone go home.

It was whilst lying down post-test that I had probably my best meal on the NHS. Having just missed the lunch service on the recovery ward, a good move from past experience, a nurse produced a huge pile of toast and jam. A perfect meal for the reclined patient.

It was also during this post test recovery time that I first met other cancer patients and had a chance to have a proper chat with them. No matter how bad your case seems to be you always seem to be meeting people with a worse prognosis. Whilst on the biopsy recovery ward I met a man who told me his story. A check-up because he did not feel well led to the discovery of a large brain tumour which then spread throughout his body. He knew he only had a short time left. The conversation opened my eyes to the reality of my and other patients’ situations.

A couple of weeks later we got the bad news that the cancer was malignant and very advanced. We had clung onto the hope it was benign. The news was delivered in a very matter of fact way, that I probably would not see Christmas unless a treatment plan could be found, and the options were not good. There were tears.

However, there was at least some good news, the tumour was a single mass, it had not spread around my body. The problem was that there was no obvious surgical option due to its size and position. All that could be done was to start chemotherapy to see if the tumour could be shrunk. So, a very ‘old school’, and hence harsh, three cycle course of chemotherapy was started in July 2017.

I dealt with all of this in a very step by step way. People seemed surprised by this, that I was not more emotionally in pieces. I assume that is just my nature. I think this whole phase of my illness was much harder on my partner and family. They had to watch me getting more ill with no obvious route to recovery. For me it was just a case of get up and doing whatever the tasks were for the day. Whether they be tests, treatments or putting things in place like a Lasting Power of Attorney.

Life became a cycle of three-day eight-hour blocks of chemotherapy, then a month to try to recover. On each cycle I recovered less than the previous one.

The chemotherapy ward is strangely like flying business class. The seats look comfortable, but after eight hours they are not. You can’t go to the toilet without issues, on an airplane it is getting out of the row, on the chemotherapy ward it is taking the drip with you. In both cases, the toilet is too small. You feel tired all the time, just like jet lag, and of course, the food is questionable at best.

As I had seen on other wards, there was a strong camaraderie on the chemotherapy ward. Everyone is going through life changing treatment. Some people looked very ill, others as if there is nothing obviously wrong with them, but irrespective of their condition I found the patients, as well as the staff, very supportive. It was far from an unhappy place. Not something I had expected.

In many ways the worst side effect of chemotherapy, beyond the expected weight loss, hair loss, nausea and lack of energy was that my attention span disappeared. For the first time in my adult life I stopped reading. I struggled to make it through a single paragraph without forgetting where I was. I remember one afternoon in a hospital waiting room, whilst waiting for yet more test results, trying to read a page in a novel. I never got to the end of the page, just starting it over and over. It was also at this time I realised I had to stop driving, I felt my attention was too poor and my reactions too slow.

As I said, by this point I was very weak. This made most day-to-day activities very hard, but the strange thing was I found I could still swim. I had had the theory that though my IVC was blocked, hence not bringing blood from the lower half of my body, if I swam with a pull-buoy just using my arms, I would be OK. This turned out to be correct, much to the surprise of the medical professionals. So, I started to do some easy swimming in the recovery phases between chemotherapy cycles when I was able. It turned out the biggest issue was I got cold quickly due to my weight loss. So, swim sessions were limited to 15 to 20 minutes and just a few hundred metres.

After the planned three chemotherapy cycles all the tests were rerun and it was found that the tumour seemed unaffected. It was always a very low chance of success. I had already decided I was unlikely to start a 4th cycle as I felt so ill, it was just no life. I did not want any more chemotherapy when the chance of success was so low. Better to have some quality of life before the end.

This is where I got lucky because I was being treated at a major cancer research centre. I had been told there was no adrenal cancer surgical option for the way my ACC had presented. However, the hospital’s renal cancer surgical team had seen something similar and were willing to operate with the support of the cardiac and vascular teams. A veritable who’s who of senior surgeons at St James as I was informed by the nurse when I was being admitted for the operation in December 2017.

My operation meant stopping the heart, removing the tumour along with an adrenal gland, and a kidney (collateral damage as there was nothing wrong with it other than its proximity to the tumour) and then patching me all back together. Over 10 hours on the operating table and a transfusion of a couple of pints of blood.

When you see a very similar version of your operation on ‘Edge of Life’, the BBC series on cutting-edge surgery, you realise how lucky you are. Just a few years earlier, or living in another city, and the operation would not have been possible.

Given my heart had to be stopped, I was treated as a cardiac patient, and the cardiac department moves you through recovery fast. Most of the people on the ward were having heart bypasses, so I was ‘the interestingly different’ case to many of the staff. I did take longer than the usual 5 days on the ward taken by bypass patients, but I still managed to get out of hospital in 10 days, in time for Christmas. It is surprising how fast you can get over being opened up from the top of your chest to your groin, and how little pain there was.

At this point I was in theory cured, the tumour was removed, blood was flowing again but I was very weak and recovery was going to be a long road. I started with walks of only a few minutes and then the rest of the day resting. The great news was that I could walk again without getting out of breath and my heart rate going through the roof.

So, over the next few months, I gradually regained my health, some weight, some hair and my attention span. I was able to ease back into work part time in the early summer of 2018.

However, the surgery was not the end of my treatment. The surgeons were confident they had got all the tumour they could see. They said it was well defined, so cancerous and normal tissue could be differentiated, but there was always the chance of microscopic cancerous cells remaining. So, I was put on two years of Mitotane tablet-based chemotherapy. This was the treatment with the best evidence base, but that is not saying much. There are not that many research studies into ACC treatment options as it is so rare. My treatment plan was based on a small Italian and German study of 177 people, most of whom did not complete the plan, but it did show a statistically significant reduction in the chance of recurrence after 5 years.

Mitotane stops cell division and I had not realised how hard this would make my recovery and specifically regaining some fitness. I was OK for day to day living, but an activity like running was not possible. I twice started Couch to 5K but had to give up as I could not progress beyond the walking stages.

The mental weight of everything did not catch up with me until a good year or so after surgery, by which time I was back at work and living a ‘normal’ life. Previously people had kept asking ‘how are you doing?’. As I said, I felt they expected me to be in pieces, and I was just going step by step. It was only when the main treatment stopped and life returned to normal that everything that had occurred hit me. A seemingly unrelated family incident, fairly small in the scheme of things, caused it all to come flooding back and completely stopped me in my tracks.

It was at this time I reached out to the support services of the Macmillan charity, and specifically the Robert Ogden Centre at St James, for help. This was something I had not done prior to this time, though my partner had used their family support services earlier in my treatment. With their counselling help, I worked my way through my problems and got back to some form of normal.

In the autumn of 2019 I came off Mitotane and once it was out of my system I could at last try to get fit again. So, it was back to Couch to 5K and with a few repeated weeks I was able to run 5K again. I was back running Parkrun in November 2019. It was great to get back to my local Roundhay Parkrun community, though I had been volunteering whenever my health allowed throughout my illness. I was running much slower than before I was ill, but running.

Since then, I have to say Covid lockdown has helped me, giving me a structure to my training. I have certainly got a reasonable level of endurance back, but any speed seems to elude me.

I have always had a fairly high maximum heart rate, over 200 well into my 40s, and before getting cancer it was still in the 190s. Now, post illness, I struggle to reach 160 and my bike and run maximum heart rates are very similar. I have tried to do a maximum heart rate test, it is as if I get to a heart rate around 150-160 for a tempo run, but it barely goes any higher when I sprint. So, I have a question for anyone with experience of training after cancer and heart surgery. Is it expected after stopping the heart that my maximum heart rate should be way lower? Or is the problem my hormone levels are different due to the lack of one of my adrenal glands? Or is it just I am getting older and have just lost muscle mass? I am not sure I will ever know the answer to that one, it is not exactly a question the NHS is set up to answer. All their post-operative guidance is aimed at day-to-day levels of exertion not the elevated levels caused by sports.

But that is a minor gripe, I am reasonably fit again. I have recently completed my first triathlon in 5 years and between lockdowns walked the 268 miles of the Pennine Way with my partner. I am not as fast as I was, but I am 5 years older and have had major heart surgery. Hell, I am alive.

Like all cancer patients, this is not the end of the road for my treatment. I am still on steroids and have annual CT scans, but all the signs seem good that the surgery got the tumour and there is no reason I should not live to a ripe old age.

I would not have got here without the support of my partner and family, and the unbelievable work of the NHS and the support services I have used. I can’t thank you all enough.

Leeds Hospital Charity – the charity of Leeds Teaching Hospitals
Macmillan Cancer Support – support for cancer patients and their families
NHS Blood Transfusion Service – please consider giving blood, without regular donations surgery like mine is not possible.

But what if I can’t use GitHub Codespaces? Welcome to github.dev

Yesterday GitHub released Codespaces as a commercial offering, a new feature I have been using during its beta phase.

Codespaces provides a means for developers to easily edit GitHub hosted repos in Visual Studio Code on a high-performance VM.

No longer does the new developer on the team have to spend ages getting their local device set up ‘just right’. They can, in a couple of clicks, provision a Codespace that is preconfigured for the exact needs of the project i.e. the correct VM performance, the right VS Code extensions and the debug environment configured. All billed on a pay-as-you-go basis and accessible from any client.

It could be a game-changer for many development scenarios.

However, there is one major issue. Codespaces, at launch, are only available on GitHub Teams or Enterprise subscriptions. They are not available on Individual accounts, as yet.

But all is not lost; hidden within the documentation, but widely tweeted about, is the github.dev editor. You can think of this as Codespaces Lite, i.e. it is completely browser-based, so there is no backing VM resource.

To use this feature, alter your URL from https://github.com/myname/myrepo to https://github.dev/myname/myrepo. Or, when browsing the repo, just press the . (period) key and you will swap into a browser-hosted version of VS Code.

You can install a good number of extensions, just as long as they don’t require external compute resources.

So, this is a great tool for any quick edit that requires multiple files to be touched in the same commit.

I think it is going to be interesting to see how github.dev and Codespaces are used. Maybe we will see the end of massive developer PCs?

Or will that have to wait until the available Codespace VMs offer GPUs?

How I dealt with a strange problem with PSRepositories and dotnet NuGet sources

Background

We regularly re-build our Azure DevOps private agents using Packer and Lability, as I have posted about before.

Since the latest re-build, we have seen all sorts of problems, all related to pulling packages and tools from NuGet-based repositories. These were problems we had never seen with any previous generation of our agents.

The Issue

The issue turned out to be related to registering a private PowerShell repository.

$RegisterSplat = @{
    Name               = 'PrivateRepo'
    SourceLocation     = 'https://psgallery.mydomain.co.uk/nuget/PowerShell'
    PublishLocation    = 'https://psgallery.mydomain.co.uk/nuget/PowerShell'
    InstallationPolicy = 'Trusted'
}

Register-PSRepository @RegisterSplat

Running this command caused the default dotnet NuGet repository to be unregistered, i.e. the command dotnet nuget list source was expected to return

Registered Sources:
  1.  PrivateRepo
      https://psgallery.mydomain.co.uk/nuget/Nuget
  2.  nuget.org [Enabled]
      https://www.nuget.org/api/v2/
  3.  Microsoft Visual Studio Offline Packages [Enabled]
      C:\Program Files (x86)\Microsoft SDKs\NuGetPackages

But it returned

Registered Sources:
  1.  PrivateRepo
      https://psgallery.mydomain.co.uk/nuget/Nuget
  2.  Microsoft Visual Studio Offline Packages [Enabled]
      C:\Program Files (x86)\Microsoft SDKs\NuGetPackages

The Workaround

You can’t call this a solution, as I cannot see why it should be needed, but the following command does fix the problem:

 dotnet nuget add source https://api.nuget.org/v3/index.json -n nuget.org

Porting my Visual Studio Parameters.xml Generator tool to Visual Studio 2022 Preview

As I am sure you are all aware, the preview of Visual Studio 2022 has just dropped, so it is time for me to update my Parameters.xml Generator Tool to support this new version of Visual Studio.

But what does my extension do?

As the Marketplace description says…

A tool to generate parameters.xml files for MSdeploy from the existing web.config file or from an app.config file for use with your own bespoke configuration transformation system.

Once the VSIX package is installed, to use it, right-click on a web.config, or app.config, file in Solution Explorer and the parameters.xml file will be generated using the current entries for both the configuration/applicationSettings and configuration/appSettings sections. The value attributes will contain TAG-style entries suitable for replacement at deployment time.

If the parameters.xml already exists in the folder (even if it is not a file in the project) you will be prompted before it is overwritten.

Currently, the version of the Parameters.xml Generator Tool in the Marketplace supports Visual Studio 2015, 2017 & 2019.

Adding Visual Studio 2022 Support

The process to add 2022 support is more complicated than for past new versions, where all that was usually required was an update to the manifest. This is due to the move to 64-bit.

Luckily the process is fairly well documented, but of course I still had a few problems.

MSB4062: The “CompareBuildTaskVersion” task could not be loaded from the assembly

When I tried to build the existing solution, without any changes, in Visual Studio 2022 I got the error:

MSB4062: The “CompareBuildTaskVersion” task could not be loaded from the assembly D:\myproject\packages\Microsoft.VSSDK.BuildTools.15.8.3253\tools\VSSDK\Microsoft.VisualStudio.Sdk.BuildTasks.15.0.dll. Could not load file or assembly.

This was fixed by updating the package Microsoft.VSSDK.BuildTools from 15.1.192 to 16.9.1050.

Modernizing the Existing VSIX project

I did not modernize the existing VSIX project before I started the migration. When I clicked ‘Migrate packages.config to PackageReference…’ it said my project was not a suitable version, so I just moved on to the next step.

Adding Link Files

After creating the shared code project, which contains the bulk of the files, I needed to add links to some of the resources, i.e. the license file, the package icon and the .VSCT file.

When I tried to add the link, I got an error in the form

 Cannot add another link for the same file in another project

I tried exiting Visual Studio and cleaning the solution; nothing helped. The solution was to edit the .CSPROJ file manually in a text editor, e.g.

 <ItemGroup>
    <Content Include="Resources\License.txt">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </Content>
    <Content Include="..\ParametersXmlAddinShared\Resources\Package.ico">
      <Link>Package.ico</Link>
      <IncludeInVSIX>true</IncludeInVSIX>
    </Content>
    <Content Include="Resources\Package.ico">
      <CopyToOutputDirectory>Always</CopyToOutputDirectory>
    </Content>
    <Content Include="..\ParametersXmlAddinShared\Resources\License.txt">
      <Link>License.txt</Link>
      <IncludeInVSIX>true</IncludeInVSIX>
    </Content>
    <EmbeddedResource Include="Resources\ParametersUppercaseTransform.xslt" />
    <VSCTCompile Include="..\ParametersXmlAddinShared\ParametersXmlAddin.vsct">
      <Link>ParametersXmlAddin.vsct</Link>
      <ResourceName>Menus.ctmenu</ResourceName>
    </VSCTCompile>
  </ItemGroup>

Publishing the new Extension

Once I had completed the migration steps, I had a pair of VSIX files. The previously existing one that supported Visual Studio 2015, 2017 & 2019 and the new Visual Studio 2022 version.

The migration notes say that in the future we will be able to upload both VSIX files to a single Marketplace entry and the Marketplace will sort out delivering the correct version.

Unfortunately, that feature is not available at present. So for now the new Visual Studio 2022 VSIX is published separately from the old one with a preview flag.

As soon as I can, I will merge the new VSIX into the old Marketplace entry and remove the preview 2022 version of the VSIX.

Automating the creation of Team Projects in Azure DevOps

Creating a new project in Azure DevOps with your desired process template is straightforward. However, it is only the start of the job for most administrators. They will commonly want to set up other configuration settings such as branch protection rules, default pipelines etc. before giving the team access to the project. All this administration can be very time consuming and of course prone to human error.

To make this process easier, quicker and more consistent, I have developed a process to automate all of this work. It uses a mixture of the following:

A sample team project that contains a Git repo containing the base code I want in my new Team Project’s default Git repo. In my case this includes:

  • An empty Azure Resource Management (ARM) template
  • A .NET Core Hello World console app with an associated .NET Core Unit Test project
  • A YAML pipeline to build and test the above items, as well as generating release notes into the Team Project WIKI

A PowerShell script that uses both az devops and the Azure DevOps REST API to

  • Create a new Team Project
  • Import the sample project Git repo into the new Team Project
  • Create a WIKI in the new Team Project
  • Add a SonarQube/SonarCloud Service Endpoint
  • Update the YAML file for the pipeline to point to the newly created project resources
  • Update the branch protection rules
  • Grant access privileges as needed for service accounts

The script is far from perfect, and it could do much more, but it covers the core requirements I need.

You could of course enhance it as required, removing features you don’t need and adding code to do jobs such as adding any standard Work Items you require at the start of a project. Or altering the contents of the sample repo to be cloned to better match your most common project needs.

You can find the PowerShell script in AzureDevOpsPowershell GitHub repo, hope you find it useful.
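The script itself is PowerShell, but as a rough illustration of the first step in the list, here is a minimal Python sketch of building the documented ‘create Team Project’ REST call. The organisation name and process template GUID used in the usage example are placeholders:

```python
import json


def create_project_request(org, name, description, process_template_id):
    """Build the URL and JSON body for the documented Azure DevOps call:
    POST https://dev.azure.com/{org}/_apis/projects
    """
    url = f"https://dev.azure.com/{org}/_apis/projects?api-version=6.0"
    body = {
        "name": name,
        "description": description,
        "capabilities": {
            # New projects get a Git repo and the chosen process template
            "versioncontrol": {"sourceControlType": "Git"},
            "processTemplate": {"templateTypeId": process_template_id},
        },
    }
    return url, json.dumps(body)
```

The returned URL and body would then be sent with a PAT in the Authorization header; project creation is asynchronous, so the real script also polls the returned operation until it completes.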

Getting the approver for release to an environment within an Azure DevOps Multi-Stage YAML pipeline

I recently had the need to get the email address of the approver of a deployment to an environment from within a multi-stage YAML pipeline. It turns out it was not as easy as I might have hoped, given the available documented APIs.

Background

My YAML pipeline included a manual approval to allow deployment to a given environment. Within the stage protected by the approval, I needed the approver’s details, specifically their email address.

I managed to achieve this but had to use undocumented API calls. These were discovered by looking at Azure DevOps UI operations using development tools within my browser.

The Solution

The process was as follows:

  • Make a call to the build’s timeline to get the current stage’s GUID – this is a documented API call.
  • Make a call to the Contribution/HierarchyQuery API to get the approver details – this is the undocumented API call.

The code to do this is as shown below. It makes use of predefined variables to pass in the details of the current run and stage.

Note that I had to re-create the web client object between each API call. If I did not do this I got a 400 Bad Request on the second API call – it took me ages to figure this out!
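As a rough sketch of the first step, this Python fragment shows how the stage GUID can be picked out of the timeline payload returned by the documented call GET https://dev.azure.com/{org}/{project}/_apis/build/builds/{buildId}/timeline. The function name and matching logic are my own; the follow-up Contribution/HierarchyQuery call is undocumented, so I have not tried to reproduce its payload here:

```python
def find_stage_guid(timeline, stage_name):
    """Return the GUID of the named stage from a build timeline payload.

    `timeline` is the parsed JSON body of the documented timeline API call.
    The GUID found here is what gets passed to the undocumented
    Contribution/HierarchyQuery call to retrieve the approver details.
    """
    for record in timeline["records"]:
        # Stage records have type == "Stage"; match on either the YAML
        # identifier or the display name, since they can differ
        if record.get("type") == "Stage" and stage_name in (
            record.get("name"),
            record.get("identifier"),
        ):
            return record["id"]
    return None
```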

Loading drivers for cross-browser testing with Selenium

Another post so I don’t forget how I fixed a problem….

I have been making sure some Selenium UX tests that were originally written against Chrome also work with other browsers. I have had a few problems: the browser under test failing to load, or Selenium not being able to find elements.

Turns out the solution is to just use the custom driver start-up options; the default constructors don’t seem to work for browsers other than Chrome and Firefox.

Hence, I now have a helper method that creates a driver for me based on a configuration parameter.

using System.Configuration;
using System.IO;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using OpenQA.Selenium.Edge;
using OpenQA.Selenium.Firefox;
using OpenQA.Selenium.IE;

internal static IWebDriver GetWebDriver()
{
    var driverName = GetWebConfigSetting("webdriver");
    switch (driverName)
    {
        case "Chrome":
            return new ChromeDriver();
        case "Firefox":
            return new FirefoxDriver();
        case "IE":
            // IE needs extra options set to start reliably
            var options = new InternetExplorerOptions
            {
                IgnoreZoomLevel = true,
                EnableNativeEvents = false,
                IntroduceInstabilityByIgnoringProtectedModeSettings = true,
                EnablePersistentHover = true
            };
            return new InternetExplorerDriver(options);
        case "Edge-Chromium":
            // Point the driver service at the msedgedriver.exe alongside the test assembly
            var service = EdgeDriverService.CreateDefaultService(Directory.GetCurrentDirectory(), "msedgedriver.exe");
            return new EdgeDriver(service);
        default:
            throw new ConfigurationErrorsException($"{driverName} is not a known Selenium WebDriver");
    }
}

A first look at the beta of GitHub Issue Forms

Update 10 May 2021: Remember that GitHub Issue Forms are in early beta; you need to keep an eye on the regular new releases as they come out. For example, my GitHub Issue Forms stopped showing last week. This was due to me using now-deprecated lines in the YAML definition files. Once I edited the files to the updated YAML syntax, they all leapt back into life.


GitHub Issues are core to tracking work in GitHub. Their flexibility is their biggest advantage and disadvantage. As a maintainer of projects, I always need specific information when an issue is raised, whether it be a bug or a feature request.

Historically, I have used Issue Templates, but these templates are not enforced. They add a suggestion for the issue text, but this can be ignored by the person raising the issue, and I can assure you they often do.

I have been lucky enough to have a look at GitHub Issue Forms, which is currently in early private beta. This new feature aims to address the problem by making the creation of issues form-based using YML templates.

I have swapped to using them on my most active repos Azure DevOps Pipeline extensions and GitHub Release Notes Action. My initial experience has been very good, the usual YML issue of incorrect indenting, but nothing more serious. They allow the easy creation of rich forms that are specific to the project.
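For illustration, a simple form definition looked something like the sketch below at the time. The schema was still changing during the beta, so the attribute names here may differ from what finally shipped:

```yaml
name: Bug report
description: Report a problem with the extension
labels: [bug]
body:
  - type: input
    id: version
    attributes:
      label: Version
      description: Which version are you using?
    validations:
      required: true
  - type: textarea
    id: repro
    attributes:
      label: Steps to reproduce
    validations:
      required: true
```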

The next step is to see if the quality of the logged issues improves.