Digital Economy Dispatch #063 -- How to Balance Speed and Quality in Software Delivery

Digital Economy Dispatch #063 -- 21st November 2021

How to Balance Speed and Quality in Software Delivery

On one of my increasingly frequent visits to London, I was intrigued to see the guy across the carriage pull out a laptop sporting a familiar Linux sticker on the case. In a world dominated by Microsoft Windows, this rare sight always gets my attention. I was even more pleased to see that the first thing he did was fire up his software development tools and start debugging some complex code.

Watching him surreptitiously throughout my hour-long journey, I noticed him bouncing across multiple windows while editing various parts of his software, running a few tests, then uploading it to cloud-based servers. Although I could not see clearly, it is quite possible that by the time we had reached Waterloo he had delivered software changes that were deployed and running anywhere in the world. All in a few clicks from his seat on the 7:40am train to London.

The Change Conundrum

I reflected on this with two completely opposed reactions.

The first is to marvel at the speed with which software updates are delivered in today’s digital world. There have been many changes in software development practices over the four decades since I started coding. One of the most important has been the drive to ease the path from making changes to the code through to its deployment into the hands of users. More and more of our activities rely on software to power the devices we use every day. With software eating the world, it is vital that the software supply chain is effective and efficient. Such agility in software delivery is essential to learn quickly, evolve, and move forward.

Yet, delivering software hasn’t always been this easy. In my early days writing software for a major telecoms company in Liverpool in the 1980s, debugging the code running a large switching system involved writing changes by hand on coding sheets that were gathered each evening and flown to London to be typed into the central computer by a team dedicated to the task. A printout was then sent on the next plane north for review. If all went well, with a 3-day turnaround you would be scheduled for a test run of your changes. Heaven help you if there was fog at Heathrow!

Today the speed at which software changes can be delivered is astounding. In many of the systems we use, the software is updated several times a day, often without us knowing it is happening.

This points to my second reaction. It is hard not to wonder about the safety, security, and quality of the software being deployed today. Writing software from the warm seat of a train may be appropriate for some things, but surely not for all. Consider how you would feel if some of the software managing your bank account were updated this way. Or if the changes were part of the software running a Tesla, upgraded over the air hundreds of times a year.

Of course, much of the focus in software delivery is on ensuring its quality. It is essential to use a range of management and support procedures governing software production, especially where failure has severe economic or human consequences. Even so, there are many concerns about the quality of today’s software. Organizations are struggling to control software delivery processes while addressing the demand for more software, additional features, and greater responsiveness to user needs. How can this balance be maintained?

DevOps and DORA

To respond to this challenge, a key focus in software development is to build the delivery process around a central core of DevOps practices. This alignment between Development and Operations is a recognition that delivering better software demands coordination between development, IT operations, quality engineering, and delivery management teams. They work together using an integrated set of processes, methods, and tools aimed at maximizing alignment and reducing the friction between their activities.

Several ideas are central to a DevOps approach. However, at its core is a well-managed practice of continuous integration and continuous delivery (CI/CD). Building on traditional practices and tools for software configuration management, teams collaborate to ensure changes are released in a controlled, systematic way to reduce the risks from errors and omissions. This CI/CD foundation supports the integration of code changes from multiple contributors into a single software system, allowing developers to merge code changes frequently into a central repository where builds and tests then run. Supporting this approach are automated tools that verify the code’s quality before integration.
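As a rough illustration, the sketch below (in Python, using placeholder build and test commands rather than any particular CI tool) shows the kind of automated gate this implies: a change only moves forward to merge and deployment if every build and test step succeeds.

    import subprocess
    import sys

    # A minimal sketch of a CI/CD gate: run the build and test steps for a
    # proposed change, and only allow it to be merged and deployed if every
    # step succeeds. The commands below are placeholders; a real pipeline
    # would invoke the project's actual build, test, and deployment tooling.
    PIPELINE_STEPS = [
        ["make", "build"],   # compile / package the change
        ["make", "test"],    # run the automated test suite
    ]

    def run_pipeline() -> bool:
        """Return True only if every step in the pipeline succeeds."""
        for step in PIPELINE_STEPS:
            print("Running:", " ".join(step))
            result = subprocess.run(step)
            if result.returncode != 0:
                print("Step failed:", " ".join(step), "-- change is not merged.")
                return False
        return True

    if __name__ == "__main__":
        if run_pipeline():
            print("All checks passed -- change can be merged and deployed.")
            sys.exit(0)
        sys.exit(1)

Because a gate like this runs automatically on every commit, changes can flow from a laptop on a train to production in minutes rather than days.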

Also emerging from this focus is a set of metrics used to benchmark DevOps practices. Created by the DevOps Research and Assessment (DORA) team (now part of Google), these metrics draw on several years of study of over 30,000 engineering professionals to understand the software delivery practices in use. The team’s annual State of DevOps report outlines its findings. It focuses its efforts around four key metrics that characterize the maturity of an organization’s approach to DevOps:

  • Deployment Frequency—How often an organization successfully releases new software to production.

  • Lead Time for Changes—The amount of time it takes a committed change to get into the hands of users.

  • Change Failure Rate—The percentage of deployed changes that cause a failure in production.

  • Time to Restore Service—How long it takes an organization to recover from a failure in production.

These four metrics are significant because they highlight the volatility and misalignment that exist across an organization’s different software development and delivery capabilities. The extensive studies from the DORA team indicate that the metrics are critical in ensuring a smooth transition across these areas and in striking a balance between speed and quality in software delivery.
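To make the four metrics concrete, here is a small sketch (in Python) of how they might be computed from a log of deployments. The record format, field names, and example data are invented for illustration and are not taken from the DORA tooling itself.

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from statistics import mean

    # Illustrative deployment record; the fields are assumptions for this sketch.
    @dataclass
    class Deployment:
        committed_at: datetime    # when the change was committed
        deployed_at: datetime     # when it reached production
        caused_failure: bool      # did it cause a failure in production?
        restore_minutes: float    # time to restore service if it failed, else 0

    def dora_metrics(deployments: list[Deployment], period_days: int) -> dict:
        """Compute rough versions of the four DORA metrics over a period."""
        failures = [d for d in deployments if d.caused_failure]
        return {
            "deployment_frequency_per_day": len(deployments) / period_days,
            "lead_time_for_changes_hours": mean(
                (d.deployed_at - d.committed_at).total_seconds() / 3600
                for d in deployments
            ),
            "change_failure_rate_percent": 100 * len(failures) / len(deployments),
            "time_to_restore_minutes": (
                mean(d.restore_minutes for d in failures) if failures else 0.0
            ),
        }

    # Example: two deployments in a 7-day window, one of which failed.
    now = datetime(2021, 11, 21, 9, 0)
    history = [
        Deployment(now - timedelta(hours=30), now - timedelta(hours=2), False, 0),
        Deployment(now - timedelta(hours=10), now - timedelta(hours=1), True, 45),
    ]
    print(dora_metrics(history, period_days=7))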

Summary

The current crisis has highlighted that accelerating the pace of change in new software is critical to support new opportunities and to adapt software to emerging usage models. However, the speed of change must be balanced with the demand for high quality in the software that is increasingly part of our lives.

The DevOps focus and DORA metrics are an important part of maintaining this balance. They highlight the integration and alignment that are required to maintain the pace of change across software development teams, while emphasizing the outcomes that drive quality in delivery. Something I hope is clear to the person sitting on the train furiously coding.

Digital Economy Tidbits

Our Relationships with AI: Friend or Foe? Link.

A great deal of the success of any digital transformation comes down to changing attitudes, not pointing out workforce ineptitude. This is highlighted in a recent report on the adoption of AI.

In the near future, AI will drive our cars, allocate public resources, screen job candidates, scan our faces and restock our fridges. Sometimes it already does these things. But as its use grows, so will regulation. While policy makers in the UK, Germany, France and the US are convinced of the need for new rules, there is no consensus on the approach they should take, or how effective regulation might be, according to a survey of 1,000 tech policy experts carried out by YouGov on behalf of Clifford Chance and Milltown Partners.