Outcomes, Not Outputs: Making Measurement Meaningful
February 08, 2024

Dave Laribee
Nerd/Noir

During British colonial rule in India, authorities in the snake-ridden city of Delhi offered rewards for dead cobras. The offer backfired when people recognized the incentive as a business opportunity and started breeding cobras, resulting in overpopulation when the scheme ended. This came to be known as the "Cobra Effect."

I've seen a similar thing happen when leaders set core metrics to evaluate individuals and product and engineering teams. As Goodhart's Law puts it: when a measure becomes a target, it ceases to be a good measure.

Or they're opaque and arbitrary because developers aren't involved in setting the measurement strategy. Or they're irrelevant: targets better suited to manufacturing work (launch the new model by end of year) than to software development (hit a set number of commits).

No matter the issue, metrics are rarely the best way to understand how the work developers are doing drives the business forward.

Of course, teams need some way to quantify progress. So how can leaders measure and understand how well individuals and teams are working?

Outcomes are a better way to gauge success than outputs. Outcomes shift the focus away from arbitrary numbers like lines of code and toward the real impact on customers and industries.

How Metrics Fall Short

Contrary to what McKinsey consultants claim, we have plenty of well-established metrics for software development.

Most of our established metrics measure productivity (how much a team or individual produces in a given timeframe) or performance (the degree of skill with which tasks are completed). We can use velocity, deployment frequency, or any number of other metrics to illustrate what our teams are doing.
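Deployment frequency, for instance, is straightforward to compute once you have a deploy log. Here's a minimal sketch, assuming a hypothetical list of production deploy dates (the data source and window are illustrative, not prescriptive):

```python
from datetime import date

# Hypothetical deployment log: one date per production deploy.
deploys = [
    date(2024, 1, 2), date(2024, 1, 4), date(2024, 1, 4),
    date(2024, 1, 9), date(2024, 1, 11), date(2024, 1, 16),
]

# Deployment frequency over the observed window, in deploys per week.
window_days = (max(deploys) - min(deploys)).days + 1
per_week = len(deploys) / (window_days / 7)
print(f"{per_week:.1f} deploys/week over {window_days} days")
```

The number is easy to produce; what it can't tell you is why it's high or low, which is the point of the paragraphs that follow.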

But these metrics don't help us understand what's going wrong when the numbers aren't what we expect. Metrics don't tell us if certain code is difficult to work with or if clunky release processes impact productivity.

As the Cobra Effect demonstrates, metrics can also create perverse incentives. Developers may not be breeding snakes to earn a bounty, but they may game metrics if pressured to meet them, such as by taking on only simple tasks to pad velocity.

This is especially true if metrics are employed solely for the benefit of leadership: individuals are expected to carry out a measurement strategy they had no hand in shaping and no avenue to give feedback on.

Worst of all, metrics don't actually tell us whether developer work matters. A team might execute an activity repeatedly with proficiency, even excellence — but if that activity isn't relevant to organizational goals, then what's the point?

Work to Achieve Outcomes, Not to Juice Metrics

When I'm helping a large-scale software development organization transform how they work, one of my main goals is to shift their focus from metrics to outcomes. Leading with outcomes is a more effective way to assess performance and productivity.

An outcome is a desired result, stated as if it will happen — not the lines of code a team will produce, but the effect that code will have on users or the business.

Outcomes help developers understand their work in terms of value. Too often, developers don't understand exactly who they're building for or why. Outcomes contribute to a stronger mental model of a customer or user and show how technology is expected to drive business results.

We're not suggesting throwing out metrics altogether. Instead, we're using outcomes to determine which metrics we should look at based on what we're trying to achieve. Then we use those metrics — which may not be the same for every team in the organization — to gauge progress.

How to Start Measuring in Terms of Outcomes

Outcomes often begin as a hunch. We take a page from the product manager's discovery playbook to get from gut feeling to validated outcome: we ask questions, gather data, and sift through insights to make sure that our outcome is relevant and valuable. The ideal outcome is small and achievable, yielding a path toward larger ones.

For example, say an engineering manager thinks her team could deliver value at a more predictable pace. She validates her hunch with data from project tracking software, which shows a wide variance in story size.

Then she uses a developer experience platform to dig deeper into drivers that influence productivity and assess her team against industry benchmarks.

She discovers that her team has low satisfaction scores on requirements quality and batch size, indicating that they're struggling to release early and often. She scopes her original outcome to something more concrete and immediate: "We work on small stories to ensure a consistent pace of delivery."

Now she can work backward with her team to find the contextual metrics that indicate progress toward those outcomes — like decreased average story size and increased iteration completion percentage.
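Those contextual metrics are simple to derive from iteration data. A minimal sketch, using hypothetical story records (the field names and point values are made up for illustration):

```python
from statistics import mean, pstdev

# Hypothetical iteration data: story point sizes and whether each
# story planned for the iteration was completed.
stories = [
    {"points": 1, "done": True},
    {"points": 2, "done": True},
    {"points": 3, "done": True},
    {"points": 8, "done": False},
    {"points": 13, "done": False},
]

sizes = [s["points"] for s in stories]
avg_size = mean(sizes)        # the team wants this trending down
size_spread = pstdev(sizes)   # wide spread signals unpredictable pace
completion = sum(s["done"] for s in stories) / len(stories) * 100

print(f"avg story size: {avg_size:.1f} pts")
print(f"size spread (std dev): {size_spread:.1f} pts")
print(f"iteration completion: {completion:.0f}%")
```

Tracked iteration over iteration, falling average size and spread alongside rising completion percentage would indicate progress toward the "small stories, consistent pace" outcome.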

Once the initial outcome is achieved, the team can shift to a new outcome — which likely means new metrics. This iterative process facilitates continuous improvement.

A Consensus-Driven Approach to Delivering Real Value

Notice that in the example above, the engineering manager works with her team to develop their strategy for measuring productivity. The shift from metrics to outcomes isn't just about pointing a team toward results that meaningfully help users or businesses. It's about creating a culture of transparency by involving developers in the process of defining what success means for individuals and the team.

That includes educating developers on what outcomes are and why they matter, as well as actively soliciting and responding to feedback. Done right, this shift — from metrics to outcomes, and from top-down mandate to bottom-up empowerment — will give developers a new level of agency in innovating and solving problems.

When developers have the opportunity, resources, and motivation to improve, everyone wins: teams, leaders, and the people and companies who will eventually use our products.

Dave Laribee is Co-Founder and CEO of Nerd/Noir