Value Stream Thinking
March 22, 2021

Andrew Davis
Copado

Beyond the Org Chart

Value stream thinking is an essential overlay on the traditional org chart. Org charts provide an important structure for control and the flow of information through organizations, and they are one example of the hierarchical structures found across human and animal communities.

Because our survival depends on being part of a group, we fear being rejected and crave being indispensable to the group. The most indispensable members of a group are those that others look to for information and example. Those who influence the group are de facto leaders. Thus people naturally crave to become influencers or leaders, and human groups naturally aggregate into hierarchical networks around such leaders.

Our instinctive need to belong to a group leads us to reduce the reasons we might be rejected. This leads to conformity, a fear of sticking out. People in an organization gradually learn to think, speak, and act like one another. This creates a bond within the organization, "organizational culture," but can also cause "groupthink," where people fail to accurately observe, express, or correct problematic ideas or actions.

The org chart is almost always the main "map" to understand an organization. Yet there are many ways to map a system, and each provides a distinct lens and insight. To paraphrase George Box, "all maps are wrong, but some are useful."

Org charts speak only to the internal structure of organizations. They don't express an organization's relationship with the outside world, such as its customers and partners. Nor do they express the organization's most important internal relationships: those through which work gets done.

The function of an org chart is to depict a system of belonging and control, not to provide a model for performance optimization. Yet every year, most organizations go through some type of global reorg, ostensibly to increase performance. Such reorgs disrupt relationships and projects within the company and with external customers. McKinsey reported that 23% of reorgs are deemed unsuccessful in retrospect.

Mapping the Organization in a New Way

Imagine mapping an organization in terms of "how work gets done." This requires that you first understand "what work are we doing?"

We can look at this from different levels of detail. From the company's mission statement down to the daily activities of an individual employee, we can identify different "whats" and "hows."

A map is a stable, visual model that orients viewers to particular features of a physical or conceptual space. A value stream map depicts the sequence of processes required to create value in an organization. Physical maps allow you to overlay measurements such as distance, elevation, and direction; value stream maps allow you to overlay measurements such as time, quality, and the amount of work in progress.

Most of today's work is knowledge work, which is inherently invisible. This makes value stream mapping in knowledge work organizations even more important than in industries like manufacturing that deal in tangible goods. Yet we are early in the journey of mapping our organizational value streams.

Creating and maintaining maps always requires investment in time, energy, and communication. Surveying crews usually need multiple people to observe a terrain from different viewpoints. Similarly, value stream mapping benefits from bringing together distinct perspectives, ideally one or two people from each role involved in a process.

Orienting Around the Customer

Rather than orienting relationships around others in the internal organization, value stream maps orient relationships around the customer. The customer is under no obligation to conform to the organization; rather, the organization needs to conform to the needs and expectations of the customer. This learning process allows the team to optimize its processes to provide increasing value to the customer.

This optimization process is called value stream management: managing the end-to-end value stream to optimize for time, quality, throughput, and the happiness of contributors. Value stream management has been a recurring theme in business journals since the 1990s. Its application to IT has been pushed to the fore only recently by the DevOps movement. The unification of Development and IT Operations into a coherent and collaborative workstream can be seen as a specific instance of value stream thinking.

Value stream thinking has implications that extend beyond Dev and Ops, and can address silos, conflicts, and inefficiencies in any process, including marketing, sales, and finance.

Building the Map

The first step in value stream management is to map the entire value stream, showing the sequence of steps (including parallel processes) required to deliver value. Value stream maps can be used to improve velocity, quality, or both. If the team is focused on velocity, it gathers at least rough metrics on the amount of time work spends in progress at each stage, the time spent waiting between stages, and the total time it takes for work to travel end to end. This alone can help identify the main areas to focus on. Automating a 20-minute manual task down to 1 minute is not likely to help if that task is followed by a 2-day waiting period.
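To make that arithmetic concrete, here is a minimal sketch in Python using hypothetical timings for a three-stage stream (the stage names and numbers are invented for illustration). It shows why wait time tends to dominate lead time: automating the 20-minute task saves only 19 minutes out of a multi-day total.

```python
# Hypothetical stage timings (in minutes) for a three-stage value stream.
# "process" is hands-on time at a stage; "wait" is queue time after it.
stages = [
    {"name": "build",  "process": 20, "wait": 2 * 24 * 60},  # 20-minute task, 2-day wait
    {"name": "test",   "process": 60, "wait": 4 * 60},
    {"name": "deploy", "process": 30, "wait": 0},
]

process_time = sum(s["process"] for s in stages)
wait_time = sum(s["wait"] for s in stages)
lead_time = process_time + wait_time

print(f"Process time: {process_time} min, wait time: {wait_time} min")
print(f"End-to-end lead time: {lead_time} min")
print(f"Flow efficiency: {process_time / lead_time:.1%}")

# Automate the 20-minute build task down to 1 minute and recompute.
stages[0]["process"] = 1
new_lead_time = sum(s["process"] + s["wait"] for s in stages)
print(f"Lead time after automation: {new_lead_time} min (saved {lead_time - new_lead_time} min)")
```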

One powerful way to increase velocity is to improve quality. Work that has to go through long periods of inspection or be repeatedly sent back will naturally delay the process. Quality at each stage of a value stream map is expressed as percent complete and accurate (%C/A): an estimate of how often a stage's output can be used by the next stage as-is, without being sent back for rework.
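As a rough illustration (with invented %C/A figures), the per-stage values can be multiplied together to estimate a "rolled" %C/A: the chance that a work item passes through the whole stream without being sent back anywhere.

```python
# Hypothetical %C/A estimates: the fraction of each stage's output that the
# next stage can use as-is, without sending it back for rework.
pct_complete_accurate = {"design": 0.90, "build": 0.75, "test": 0.85}

# Rolled %C/A: probability a work item avoids rework at every stage.
rolled = 1.0
for stage, ca in pct_complete_accurate.items():
    rolled *= ca

print(f"Rolled %C/A: {rolled:.1%}")  # about 57% of items flow through without rework
```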

Another way to increase velocity is to reduce the amount of work in progress. Multi-tasking is one of the most reliable ways to make everything take longer. Large volumes of work in progress may also point to the need to bring more people in to help in a particular process.
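Little's Law from queueing theory offers a simple way to reason about this: on average, lead time equals work in progress divided by throughput. The sketch below uses hypothetical numbers to show how halving WIP, without changing throughput, roughly halves average lead time.

```python
# Little's Law: average lead time = average WIP / average throughput.
throughput_per_week = 5  # hypothetical: the team finishes 5 items per week

for wip in (30, 15):
    lead_time_weeks = wip / throughput_per_week
    print(f"WIP = {wip:2d} items -> average lead time ≈ {lead_time_weeks:.1f} weeks")
```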

The goal is to gain an accurate enough understanding of the flow and its bottlenecks to correctly target the single most important area for improvement. In any value stream, almost every aspect of the process could potentially be improved; the key is to prioritize strictly, apply effort to relieving the most pressing bottleneck or limitation, and then check whether the improvements were effective.

The purpose of the value stream is to deliver a specific product or service, but improving the value stream is itself work. This means some of a team's capacity must be redirected into improving the way they work. Most organizations are so focused on what they are delivering that they don't take the time to invest in how they're delivering it. Value stream management means putting energy into examining how you work and approaching improvements systematically. Increasing the organization's overall capacity to deliver is one of the wisest investments it can make.

Andrew Davis is Senior Director of Product Marketing at Copado