I have just released a new Azure DevOps Pipelines extension to update a page in a Git-based wiki.
It has been tested against:
- Azure DevOps WIKI – running as the build agent (so the same Team Project)
- Azure DevOps WIKI – using provided credentials (so any Team Project)
- GitHub – using provided credentials
It takes a string (markdown) input and writes it to a new page, or updates the page if it already exists. It is designed to be used with my Generate Release Notes extension, but you will no doubt find other uses.
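In a YAML pipeline, the task can be dropped in after a step that produces the markdown. The input names below are illustrative placeholders only, a sketch of the shape of the call rather than the extension's actual schema; check the extension's documentation for the real input names:

```yaml
steps:
  # Hypothetical usage sketch - input names are placeholders, not the
  # extension's documented schema
  - task: WikiUpdaterTask@1
    inputs:
      repo: 'dev.azure.com/myorg/myproject/_git/myproject.wiki'
      filename: 'ReleaseNotes.md'
      contents: '$(ReleaseNotesText)'   # the markdown string to write
      user: '$(WikiUser)'
      password: '$(WikiPAT)'
```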
Whilst listening to a recent Radio TFS episode, it was mentioned that TFS Aggregator uses the C# SOAP-based Azure DevOps APIs, and hence needs a major re-write as these APIs are being deprecated.
Did you know that there was a REST API alternative to TFS Aggregator?
My Azure DevOps Services & Server Alerts DSL is out there, and has been for a while, but I don’t think it is used by many people. It aims to do the same job as TFS Aggregator, but is based around Python scripting.
However, I do have to say it is more limited in flexibility, as it has only been developed for my needs (and those of a few of my clients), but it is an alternative that is based on the REST APIs.
Scripts take the following form; this one sets a parent work item to 'Done' when all of its child work items are done:
# Expect 2 args: the event type and the unique ID of the work item
if sys.argv[0] == "workitem.updated":
    wi = GetWorkItem(int(sys.argv[1]))
    parentwi = GetParentWorkItem(wi)
    if parentwi is None:
        LogInfoMessage("Work item '" + str(wi.id) + "' has no parent")
    else:
        LogInfoMessage("Work item '" + str(wi.id) + "' has parent '" + str(parentwi.id) + "'")
        results = [c for c in GetChildWorkItems(parentwi) if c["fields"]["System.State"] != "Done"]
        if len(results) == 0:
            LogInfoMessage("All child work items are 'Done'")
            parentwi["fields"]["System.State"] = "Done"
            msg = "Work item '" + str(parentwi.id) + "' has been set as 'Done' as all its child work items are done"
            SendEmail("firstname.lastname@example.org", "Work item '" + str(parentwi.id) + "' has been updated", msg)
        else:
            LogInfoMessage("Not all child work items are 'Done'")
else:
    LogErrorMessage("Was not expecting to get here")
I have recently done a fairly major update to the project. The key changes are:
- Rename of project, repo, and namespaces to reflect Azure DevOps (the namespace change is a breaking change for existing users)
- The script that is run can now be selected by:
  - A fixed file name for the web instance running the service
  - The event type sent to the service
  - The subscription ID, thus allowing many scripts (new)
- A single instance of the web site running the events processor can now handle calls from many Azure DevOps instances.
- Improved installation process on Azure (well at least tried to make the documentation clearer and sort out a couple of MSDeploy issues)
Full details of the project can be seen on the solution's wiki; maybe you will find it of use. Let me know if the documentation is good enough.
There is a general move in Azure DevOps Pipelines to using YAML, as opposed to the designer, to define your pipelines. This is particularly enforced when using them via the new GitHub Marketplace Azure Pipelines method, where YAML appears to be the only option.
This has shown up a hole in my pipeline tasks documentation: I had nothing on YAML!
So I have added a YAML usage page for each set of tasks in each of my extensions, e.g. the file utilities tasks.
Now, like most developers, I am lazy, and I was not going to type all that information by hand. So I wrote a script to generate the markdown from the respective task.json files in the repo. This script will need some work for others to use, as it relies on some special handling due to quirks of my directory structure, but I hope it will be of use to others.
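The core idea is simple enough to sketch. The snippet below is an illustration of the approach, not the exact script from my repo (which has the directory-structure quirks mentioned above): it walks a tree looking for task.json files and emits a markdown section with a YAML usage snippet for each task found. The output layout is an assumption.

```python
# Sketch of a task.json -> markdown "YAML usage" generator. The section and
# snippet layout here are illustrative assumptions, not the script from my repo.
import json
import os

FENCE = "`" * 3  # markdown code fence


def yaml_usage(task):
    """Build a fenced YAML block showing how to call one task."""
    lines = [FENCE + "yaml",
             "- task: {0}@{1}".format(task["name"], task["version"]["Major"]),
             "  inputs:"]
    for inp in task.get("inputs", []):
        # the first line of the input's help text becomes a trailing comment
        help_text = inp.get("helpMarkDown", "").split("\n")[0]
        lines.append("     {0}: {1} # {2}".format(
            inp["name"], inp.get("defaultValue", ""), help_text))
    lines.append(FENCE)
    return "\n".join(lines)


def generate_docs(root):
    """Return one markdown document covering every task.json under root."""
    sections = []
    for dirpath, _, filenames in sorted(os.walk(root)):
        if "task.json" in filenames:
            with open(os.path.join(dirpath, "task.json")) as f:
                task = json.load(f)
            sections.append("## " + task["friendlyName"] + "\n\n" + yaml_usage(task))
    return "\n\n".join(sections)
```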
Azure DevOps has had some serious issues over the past couple of weeks with availability here in Europe.
A really good, open, and detailed root cause analysis has just been posted by the Azure DevOps team at Microsoft. It also covers the mitigations they are putting in place to make sure these same issues do not occur again.
We all have to remember that the cloud is not magic. Cloud service providers will have problems like any on-premises service; but trying to hide them does nothing to build confidence. So I for one applaud posts like this. I just wish all cloud service providers were as open when problems occur.
It may have passed you by (it had me, as I had not created a PAT for a while), but managing custom security for PATs in Azure DevOps is much easier since Sprint 140.
You now get some help to pick the correct 'limited' rights set via a simple grouping of rights.
We now just need some more detailed documentation on which permissions each option actually maps to, to complete the picture.
When I started creating OSS extensions for Azure DevOps Pipelines (starting on TFSPreview, then VSO, then VSTS, and now named Azure DevOps) I made the mistake of putting all my extensions in a single GitHub repo. I thought this would make life easier; I was wrong, it should have been one repo per extension.
I have considered splitting the GitHub repo, but as a number of people have forked it (over 100 at the last count), I did not want to start a chain of chaos for loads of people.
This initial choice has meant that until very recently I could not use the Pull Request triggers in Azure DevOps Pipelines against my GitHub repo. This was because all builds associated with the repo triggered on a PR for any extension. So I had to trigger builds manually, providing the branch name by hand: a bit of a pain, and prone to error.
I am pleased to say that with the roll out of Sprint 140 we now get the option to add a path filter to PR triggers on builds linked to a GitHub repo; something we have had for Azure DevOps hosted Git repos since Sprint 126.
So now my release process is improved. If I add a path filter as shown below, my build, and hence my release process, triggers on a PR just as I need.
It is just a shame that the GitHub PR checks only the build, not the whole release, before saying all is OK. I hope we see linking to complete Azure DevOps Pipelines in the future.
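For builds defined in YAML, the equivalent of that path filter looks something like the fragment below; the extension folder path is an assumption to illustrate scoping a trigger to one extension's directory:

```yaml
# PR trigger scoped to one extension's folder - the path shown is illustrative
pr:
  branches:
    include:
      - master
  paths:
    include:
      - Extensions/FileUtilities/*
```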
Registration for the new season of Black Marble events has just opened. If you can make it to Yorkshire, why not come to an event (or two)?
If you are stuck in the grim south, why not look out for us at Future Decoded in London at the end of the month?
Whilst I was off work last week, TFS 2018 Update 3 was released. As stated in the 2018.3 release notes, this is the final bug-fix update release of TFS 2018.
The next major release of TFS will not be named TFS 2019, as you might have expected, but will use the new name Azure DevOps Server. You can see the features planned for this next release in the Azure DevOps Features Timeline.
The Azure DevOps (VSTS) team have published the promised postmortem on the outage of the 4th of September.
It gives good detail on what actually happened to the South Central Azure datacenter and how it affected VSTS (as it was then called).
More interestingly it provides a discussion of mitigations they plan to put in place to stop a single datacentre failure having such a serious effect in the future.
Great openness as always from the team
Today Microsoft made a big announcement: VSTS is now Azure DevOps.
The big change is they have split VSTS into 5 services you can use together or independently, including Azure Pipelines for CI/CD – free for open source and available in the GitHub CI marketplace.
An important thing to note is that IT IS NOT JUST FOR AZURE.
Don’t be afraid of the name. There is a wide range of connectors to other cloud providers, such as AWS and Google Cloud, as well as many other DevOps tools.
To learn more, have a look at the official post.