What Our Industry Is Getting Wrong About DORA Metrics
July 25, 2023

Ori Keren
LinearB

As engineering leaders, we've all become familiar with DORA metrics: deployment frequency, lead time for changes, change failure rate and mean time to restore. Many of us have looked to these metrics as the way we measure the health and success of our engineering orgs.
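For readers who want the four metrics made concrete, here is a minimal sketch of how each could be computed from a deployment log. The record schema and the two-week window are my own illustrative assumptions, not any particular tool's API:

```python
from datetime import datetime, timedelta

# Hypothetical deployment log: (deployed_at, commit_created_at, failed, restored_at)
deployments = [
    (datetime(2023, 7, 3, 10), datetime(2023, 7, 1, 9), False, None),
    (datetime(2023, 7, 5, 14), datetime(2023, 7, 4, 11), True, datetime(2023, 7, 5, 16)),
    (datetime(2023, 7, 10, 9), datetime(2023, 7, 7, 15), False, None),
]
days_observed = 14  # assumed reporting window

# Deployment frequency: deploys per day over the window
deploy_frequency = len(deployments) / days_observed

# Lead time for changes: average commit-to-deploy time
lead_time = sum((dep - commit for dep, commit, _, _ in deployments),
                timedelta()) / len(deployments)

# Change failure rate: share of deployments that caused a failure
failures = [d for d in deployments if d[2]]
change_failure_rate = len(failures) / len(deployments)

# Mean time to restore: average failure-to-recovery time
mttr = sum((restored - dep for dep, _, _, restored in failures),
           timedelta()) / len(failures)
```

Note that all four are backward-looking aggregates over work that already shipped, which is exactly the "lagging indicator" property discussed below.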

In fact, our industry has started to view success through the lens of DORA metrics. That view is incomplete and, worse, often misunderstood. To get a complete picture of how to interpret DORA metrics and use them to improve engineering teams, we need to acknowledge some long-held misinterpretations.

3 Major Shortcomings in How DORA Is Being Used

1. Business leaders don't always understand how DORA metrics translate to business outcomes

If you've ever mentioned cycle time or deployment frequency to a non-technical leader, you'll immediately remember the blank stare you likely received in return. That's because these are still engineering metrics and non-engineering leaders don't necessarily understand — or care to understand — how the release of a tiny block of work to production helps you reach your monthly or quarterly business goals.

When we use DORA metrics, we aren't speaking about business outcomes. Business leaders want to know how quickly you're able to deliver a full feature, and they want reassurance that shipped features are meeting customer demands at the promised time.

2. Engineering improvement doesn't start with a deck

Having been a developer, I can confidently say developers tend to be interested in one thing: doing their job effectively, without idle time interrupting their flow.

When developers are expected to step out of their workflow to make connections between their work and business outcomes (as in when they're handed a deck tracking metrics like revenue and customer churn), it can be extremely frustrating.

We won't see true improvement in engineering organizations until we can quickly make those connections between developer experience and business success.

3. We're overlooking leading indicators of success

DORA metrics are fantastic at measuring lagging indicators of success. As an industry focused on these metrics, though, we're completely overlooking leading indicators — like the review process and pull-request size — that can help us make our teams and workflows quicker.

Bridging the Metrics Gap Between Engineering and Business Outcomes

While it may be difficult to directly connect engineering metrics to business metrics like revenue, there are a few metrics that can act as a bridge between engineering and business teams. These are metrics that focus on the speed at which you deliver new features and your ability to deliver on promises.

Although DORA metrics do not track follow-through or the speed at which new products ship, the following metrics can reveal plenty about your engineering team's effectiveness. Here's how:

1. Investment Carvings

Investment Carvings show how much effort is going into a set of predefined, self-explanatory categories of work. These investment categories cover what developers actually spend time on, such as Keeping the Lights On, Developer Experience, Feature Enhancements and New Capabilities. They can help you understand where your team stands compared to industry benchmarks and steer the organization according to your needs.
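A rough sketch of the idea: tag each unit of work, map the tags into the categories above, and report each category's share of total effort. The label names and issue list here are hypothetical, invented purely for illustration:

```python
from collections import Counter

# Hypothetical mapping from issue-tracker labels to investment categories
CATEGORY_BY_LABEL = {
    "bug": "Keeping the Lights On",
    "infra": "Developer Experience",
    "enhancement": "Feature Enhancements",
    "new-feature": "New Capabilities",
}

# Hypothetical issues closed this cycle: (issue id, label)
issues = [("PAY-1", "bug"), ("PAY-2", "new-feature"),
          ("PAY-3", "bug"), ("PAY-4", "infra")]

carving = Counter(CATEGORY_BY_LABEL[label] for _, label in issues)
share = {category: count / len(issues) for category, count in carving.items()}
```

In this toy data, half the cycle went to Keeping the Lights On, which is the kind of finding you would weigh against industry benchmarks.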

2. Project Allocation

Project Allocation shows you what percent of your team is working on each project. This metric is critical to non-engineering leaders because it demonstrates that the team is prioritizing business-critical projects.

Meanwhile, it's critical to you because it provides the ability to say "no" or "not now" to projects of lower priority if your team's time is already spoken for. On top of that, it gives you ammo to ask for additional headcount if you spot gaps in labor allocation.
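Mechanically, Project Allocation is just the share of tracked effort per project. A minimal sketch, assuming a hypothetical log of engineer-days tagged by project (the project names are made up for illustration):

```python
from collections import Counter

# Hypothetical work log: (project, engineer-days spent)
work_log = [
    ("checkout-revamp", 12),
    ("internal-tooling", 3),
    ("checkout-revamp", 8),
    ("bug-backlog", 7),
]

effort = Counter()
for project, days in work_log:
    effort[project] += days

total = sum(effort.values())
allocation = {project: days / total for project, days in effort.items()}
```

If a business-critical project's share comes out low, that is your evidence for saying "not now" to new requests, or for asking for headcount.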

3. Project Planning Accuracy

Project Planning Accuracy speaks to how often you're able to deliver on promises. This metric is critical to business leaders because it proves they can count on your team to deliver on customer needs. It's critical to you because it shows you where you may be over-promising and under-delivering.

This metric also works in harmony with DORA metrics, with deployment frequency, cycle time, change failure rate and mean time to restore acting as leading indicators of Project Planning Accuracy.
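One simple way to formulate this (my own sketch, not LinearB's exact definition): the fraction of work planned at the start of a cycle that actually shipped by the end of it. The issue IDs are hypothetical:

```python
# Hypothetical sprint data: issues planned at sprint start vs. delivered by sprint end
planned = {"ISSUE-101", "ISSUE-102", "ISSUE-103", "ISSUE-104"}
delivered = {"ISSUE-101", "ISSUE-103", "ISSUE-104", "ISSUE-120"}  # ISSUE-120 was unplanned

# Planning accuracy: planned issues that shipped, over all planned issues
planning_accuracy = len(planned & delivered) / len(planned)
```

Here three of four planned issues shipped, so accuracy is 0.75; unplanned work that sneaks in (ISSUE-120) does not raise the score, which is what makes the metric a measure of follow-through rather than raw output.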

Make Metrics Matter

Our industry has been using DORA metrics wrong. But as you begin measuring a wider breadth of metrics and bridging the gap between engineering success and the business at large, you'll quickly see engineering improvement, less hostility between engineering and other business functions and, ultimately, happier customers and happier developers.

Ori Keren is CEO and Co-Founder of LinearB