Successful Software Delivery with DevOps

With DevOps best practices and Microsoft’s DevOps tooling, Black Marble can deliver agile planning, source code control, package management, build, testing and release automation to continuously integrate, test, deliver and monitor your application.

It is crucial to not only have the right people in place for your cloud adoption journey, but also to use the right processes and the right tools. A typical DevOps approach consists of cross-functional teams provisioning their own infrastructure, with high degrees of automation using templates, codified rules for security controls and cloud-native architecture.

This is where the core aspects of continuous value delivery meet the demands currently driving companies; an integrated team approach including enterprise agile and cloud computing.


Delivering an Enterprise Cloud Operating Model

There have been some major paradigm shifts in the history of computing with some of the most notable being marked, not only by changes in technology, but by changes in staffing that technology. When the computing standard for mainframe shifted to client/server, the staff model moved from computer operator to system administrator.

The same is true with a move to the cloud.

The cloud fundamentally changes how businesses procure and use technology resources. Traditionally, a business owned and was responsible for every aspect of its technology, from infrastructure to software; the cloud allows it to provision and consume resources only as they are needed. Moving to the cloud can bring increased business agility and significant cost benefits.

However, the journey to the cloud needs to be managed carefully at each stage, not just for delivery but for expectations and ROI. Even more significantly, the cloud opens up access to a range of on-demand services that were unavailable just ten years ago, including hyper-scale infrastructure, AI services and raw computing power, all consumable on a short-term basis to significant benefit.

Combined, these services provide business benefits that only the cloud can offer.

Transforming your business into a cloud-business is more than simply moving your systems and infrastructure into the cloud – your organisation needs a Cloud Operating Model (COM) to adopt a cloud-first mentality. It is important to guide your people away from traditional IT thinking, to ensure they realise business benefits and harness the true potential of the cloud, where adoption drives innovation. This white paper will cover how this can be achieved with the assistance of Black Marble.

For more information on Delivering an Enterprise Cloud Operating Model, get in touch for a copy of the white paper I put together with our CCO, Rik Hepworth.

Delivering an Enterprise Cloud Operating Model White Paper, 2nd Edition.

Logic App Flat File Schemas and BizTalk Flat File Schemas

I recently started working on a new Logic App for a customer using the Enterprise Integration Pack for Visual Studio 2015. I was greeted with a familiar sight: the schema generation tools from BizTalk, but with a new lick of paint.

The Logic App requires the use of Flat File schemas, so I knocked up a schema from the instance I’d been provided and went to validate it against that same instance (which should, of course, validate).

My Flat File was, to be frank, a bit of a pain in that it had ragged endings; that is to say, some rows had an extra column, so sample rows might look a bit like:

1,2,3,4,5,6

1,2,3,4,5

1,2,3,4,5,6

I’ve worked with this sort of file before, but couldn’t quite remember how I’d solved it, other than by tinkering with the element properties.

I generated the schema against the lines with the additional column and erroneously set the last field’s Nillable property to True. When I went to validate the instance, lo and behold, it wasn’t a valid instance, and I had little information about why.

So I fired up my BizTalk 2013 R2 virtual machine (I could have used my 2016 one to be fair if I hadn’t sent it to the farm with old yellow last week) and rinsed and repeated the Flat File Schema Wizard.

This time I got a bit more information: namely that the sample I’d been provided was missing a CR/LF on the final line, and that the Nillable property I’d set on the last column was throwing a wobbler by messing up the parsing of the following lines.

Setting the field’s Nillable property back to False, and its Min Occurs and Max Occurs to 0 and 1 respectively, gave me a valid working schema.
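As a rough sketch, the fix looks like this in the wizard-generated schema (the element name here is hypothetical; yours will come from your own sample file):

```xml
<!-- Hypothetical trailing column in a Flat File schema: not nillable, -->
<!-- but optional, so rows both with and without the final field validate. -->
<xs:element name="Column6" type="xs:string"
            nillable="false" minOccurs="0" maxOccurs="1" />
```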

So I copied the schema back to my Logic Apps VM and attempted to revalidate my file (with its final line CR/LF amended). To my annoyance, invalid instance!

I was quite frankly boggled by this, but some poking around the internet led me to this fellow’s blog:

https://blogs.msdn.microsoft.com/david_burgs_blog/2018/03/26/generate-and-validate-flat-file-native-instances-from-flat-file-schemas/

In short, there’s an attribute on a Flat File schema which denotes the extension class to be used by the schema editor. When the schema is built by the BizTalk Flat File Schema Wizard it’s set to:

Microsoft.BizTalk.FlatFileExtension.FlatFileExtension

When generated by the Enterprise Integration Pack it’s:

Microsoft.Azure.Integration.DesignTools.FlatFileExtension.FlatFileExtension

I changed this attribute value in my BizTalk-generated Flat File schema and, presto, the schema could validate the instance it was generated from.
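Since the class name only appears in that one attribute, the swap can be done with a plain text replacement. A minimal sketch (the two class names are from above; the annotation fragment in the demo is illustrative, not the exact schema markup):

```python
# Swap the schema editor extension class in a BizTalk-generated
# Flat File schema so the Enterprise Integration Pack accepts it.
BIZTALK_CLASS = "Microsoft.BizTalk.FlatFileExtension.FlatFileExtension"
EIP_CLASS = "Microsoft.Azure.Integration.DesignTools.FlatFileExtension.FlatFileExtension"

def convert_schema(xsd_text: str) -> str:
    # The class name only occurs in the extension attribute, so a
    # straight string replace is safe here.
    return xsd_text.replace(BIZTALK_CLASS, EIP_CLASS)

# Demo on a minimal, illustrative annotation fragment.
sample = f'<schemaInfo extensionClass="{BIZTALK_CLASS}" standardName="Flat File" />'
print(convert_schema(sample))
```

Run this over the .xsd file’s contents (or simply make the same edit by hand in a text editor, as I did).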

So, in short: the Schema designer tools in the Enterprise Integration Pack report errors a little differently to their BizTalk ancestor. They give mostly the same information, but in different places:

  • EIP

You only get a generic error message in the output log. Go and examine the errors log for more information.


  • BizTalk

You get the errors in the output log, and in the error log.


In my case, the combination of my two errors (the slightly malformed flat file and my Nillable field change) gave me only a generic “Root element is missing” error in the EIP, which wasn’t particularly helpful; the BizTalk tooling gave a better fault diagnosis.

On the bright side, the two are more or less interchangeable. Something to bear in mind if you’re struggling with a Flat File schema and have a BizTalk development environment on hand.

Postmortem published by the Microsoft VSTS Team on last week’s Azure outage

The Azure DevOps (formerly VSTS) team have published the promised postmortem on the outage of the 4th of September.

It gives good detail on what actually happened to the South Central Azure datacentre and how it affected VSTS (as it was then called).

More interestingly, it discusses the mitigations they plan to put in place to stop a single datacentre failure having such a serious effect in the future.

Great openness, as always, from the team.

Versioning your ARM templates within a VSTS CI/CD pipeline

Updated 3 Feb 2018: also see Versioning your ARM templates within a VSTS CI/CD pipeline with Semantic Versioning.

Azure Resource Manager (ARM) templates allow your DevOps infrastructure deployments to be treated as ‘infrastructure as code’, so infrastructure definitions can be stored in source control.

As with any code it is really useful to know which version you have out in production. Now a CI/CD process and its usage logs can help here, but just having a version string stored somewhere accessible on the production systems is always useful.

In an ARM template this can be achieved using the contentVersion property (see the template documentation for more detail on this property). The question becomes: how best to update this property with a version number?

The solution I used was a VSTS JSON Versioning Task I had already created to update the template’s .JSON definition file. I popped this task at the start of my ARM template’s CI build process, and it sets the value prior to the storage of the template as a build artifact used within the CD pipeline.
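A minimal sketch of what such a versioning step does (the actual task is a VSTS build task; this just illustrates the idea of stamping the build number into contentVersion, which ARM expects as up to four dot-separated numeric parts):

```python
import json

def set_content_version(template_json: str, version: str) -> str:
    """Stamp a version number into an ARM template's contentVersion property."""
    template = json.loads(template_json)
    template["contentVersion"] = version  # e.g. "1.0.42.0", major.minor.patch.build
    return json.dumps(template, indent=2)

# Illustrative minimal template, not a deployable one.
template = (
    '{"$schema": "https://schema.management.azure.com/schemas/'
    '2015-01-01/deploymentTemplate.json#",'
    ' "contentVersion": "1.0.0.0", "resources": []}'
)
print(set_content_version(template, "1.0.42.0"))
```

In the real pipeline the equivalent edit is made in place on the template file before it is published as a build artifact.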
