App Developers Waste Time Debugging Errors Discovered in Production
November 28, 2016

Mohit Bhatnagar
ClusterHQ

Almost half (43 percent) of app developers spend between 10 and 25 percent of their time debugging application errors discovered in production, rather than developing new features, according to ClusterHQ's first Application Testing survey.

The top three challenges reported by survey respondents were:

■ An inability to fully recreate production environments in testing (33%)

■ Interdependence on external systems that makes integration testing difficult (27%)

■ Testing against unrealistic data before moving into production (26%)

With more organizations transitioning to microservices-based applications, it is apparent that legacy testing processes are no longer sufficient to meet the speed and agility requirements of modern businesses.

“Our Application Testing survey highlights that legacy software development practices, like relying on narrow subsets of synthetic data for testing, no longer cut it for teams focused on maximizing the amount of time they spend building features that users value,” said Mark Davis, CEO, ClusterHQ. “Forward looking software developer leaders understand that to deliver innovation to customers they must effectively manage the entire application lifecycle across a diverse range of infrastructures, a process that begins with identifying and eliminating bugs as early as possible so that teams can focus on adding end-user value.”

Errors found in production are costly

Respondents were asked to identify the environment in which bugs are most costly to fix. Unsurprisingly, the majority (62%) selected production as the most expensive stage at which to fix errors, followed by development (18%), staging (7%), QA (7%) and testing (6%).

Next, the survey took a deeper look at how often bugs are being encountered in production as a result of incomplete testing. Findings revealed:

■ Every day: 11%
■ Two to three times per week: 13%
■ Once per week: 12%
■ Every other week: 15%
■ Once a month: 25%
■ Don’t know: 21%
■ Never: 3%

More than a third (36%) of respondents report encountering bugs in production at least once per week. Developing new features should be priority number one in agile software development; however, when asked what percentage of development time is spent debugging application errors discovered in production instead of working on new features, results showed:

■ Less than 10%: 34%
■ Between 10-25%: 43%
■ Between 25-50%: 18%
■ Between 50-75%: 4%
■ More than 75%: 1%

Causes of bugs in production

Testing is a crucial part of an application’s lifecycle, but it’s inherently challenging to ensure that tests done in development will mirror what happens in production. Survey takers were asked to select the top challenge associated with testing that causes bugs to appear in production. They responded:

■ Inability to fully recreate production environments in testing: 33%
■ Interdependence on external systems, making integration testing difficult: 27%
■ Testing against unrealistic data before moving into production: 26%
■ Difficulty sharing test data across different teams: 10%
■ Difficulty creating staging environments for testing: 4%

The inability to fully recreate production environments was cited as the leading cause of bugs appearing in production, supporting the notion that testing on a laptop is like "flying" in a flight simulator: the experience can be very different when you actually get up in the air. This challenge was followed closely by interdependence on external systems, which makes integration testing cumbersome and leads into the third most cited challenge: testing against unrealistic data. At present, data is difficult to move between all the places it is needed, including test infrastructure. As a result, unrealistic mock data sets are often used to test applications. These data sets cannot prepare applications for all real-world variables, and so they cause serious, expensive and time-consuming issues down the line.
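To make that gap concrete, here is a minimal, purely illustrative sketch in Python (the function and records below are hypothetical, not taken from the survey): a check run only against a tidy, hand-written mock record passes and reveals nothing, while the same check run over production-like records containing stray whitespace, unicode, missing fields and zero values surfaces the kinds of failures that would otherwise first appear in production.

```python
# Hypothetical illustration: the same check against synthetic vs. production-like data.

def normalize_order(record):
    """Convert a raw order record into (customer_name, total_cents)."""
    # Implicitly assumes every record has a well-formed 'customer' and a 'total'.
    return record["customer"].strip().title(), int(round(record["total"] * 100))

# Tidy synthetic fixture: the check passes and tells us very little.
MOCK_RECORDS = [{"customer": "alice smith", "total": 19.99}]

# Production-like samples expose real-world variables the mock never covers.
REALISTIC_RECORDS = [
    {"customer": "alice smith", "total": 19.99},
    {"customer": "  Søren Ørsted ", "total": 0.0},  # unicode, stray whitespace, zero total
    {"customer": None, "total": 42.50},             # null customer -> AttributeError
    {"total": 10.00},                               # field missing entirely -> KeyError
]

def count_failures(records):
    failures = 0
    for rec in records:
        try:
            normalize_order(rec)
        except Exception as exc:  # count the bugs this data set actually surfaces
            failures += 1
            print(f"bug surfaced by {rec!r}: {exc!r}")
    return failures

if __name__ == "__main__":
    print("failures with mock data:", count_failures(MOCK_RECORDS))           # 0
    print("failures with realistic data:", count_failures(REALISTIC_RECORDS)) # 2
```

The sketch proves nothing about any particular application; it simply shows why a green test suite built on a narrow synthetic data set can coexist with frequent bugs in production.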

Testing apps against realistic data

A resounding majority of respondents (88%) would like the ability to test with realistic data during application development. However, this is not as easy as it sounds, and there are real challenges that prevent it from being the norm. When asked to identify the biggest challenge to testing against realistic data sets, respondents reported:

■ Keeping test data up-to-date: 23%
■ Moving data to all the places it is needed for testing: 19.5%
■ Keeping track of different versions of data to be used for different purposes: 19.5%
■ Managing access controls to data: 18%
■ Making copies of production data: 14%
■ Storage costs: 6%
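Several of these challenges, such as keeping test data current, limiting storage costs and controlling access to sensitive values, are often tackled by regenerating a small, anonymized sample of production data on a schedule. The sketch below is a hypothetical illustration, not a recommendation from the survey; the file and field names (production_export.csv, email, card_number) are assumptions.

```python
# Minimal sketch: sample and mask a production export so it can be shared safely as test data.
# File and field names are assumptions made for illustration only.
import csv
import hashlib
import random

SENSITIVE_FIELDS = {"email", "card_number"}
SAMPLE_RATE = 0.01  # keep roughly 1% of rows to limit storage cost and copy time

def mask(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()[:12]

def build_test_dataset(src_path: str, dst_path: str, seed: int = 42) -> None:
    rng = random.Random(seed)  # deterministic sampling keeps reruns comparable
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if rng.random() > SAMPLE_RATE:
                continue  # drop most rows so the sample stays small and cheap to move
            for field in SENSITIVE_FIELDS & set(row):
                row[field] = mask(row[field])  # masking eases access-control concerns
            writer.writerow(row)

if __name__ == "__main__":
    # Re-running this on a schedule keeps the test data set up to date.
    build_test_dataset("production_export.csv", "test_dataset.csv")
```

Regenerating the sample regularly addresses the "keeping test data up-to-date" concern, while sampling and masking reduce the storage-cost and access-control burdens respondents cite.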

The full survey polled 386 respondents. Not all respondents answered every question, so results shown are based on the percentage of people who answered each individual question. In this survey, 41 percent of respondents identified as DevOps team members, followed by developers (37%), other (8%), operations (7%), QA (5%) and security (2%). Respondents work for organizations ranging in size from 1-100 employees (49%) and 101-500 employees (20%) to 501-2,500 (10%), 2,501-5,000 (5%), 5,001-10,000 (3%) and 10,000+ employees (13%).

Mohit Bhatnagar is VP Product at ClusterHQ.
