Red Hat introduced Red Hat Enterprise Linux 9.1 and Red Hat Enterprise Linux 8.7.
Jellyfish announced the launch of Jellyfish Benchmarks, a way to add context around engineering metrics and performance by introducing a method for comparison.
Using the in-app benchmarking capability, Jellyfish customers can now understand how they stack up against peers on a percentile basis across numerous metrics on the organization, division, group or team level.
Despite having more access than ever to engineering data, the complex nature of modern software engineering means that strategic decisions can lack proper context. Engineering teams may be performing well historically, but without a threshold to compare against — one that represents the best-performing engineering teams across industries — engineering leaders lack insight into whether their decisions are enabling their organization to achieve elite performance. Jellyfish Benchmarks solve this problem by giving customers a means of comparison drawn from anonymized, cross-industry datasets.
"At Jellyfish, our longstanding goal has been to support customers in building the highest-performing engineering teams possible," said Andrew Lau, CEO of Jellyfish. "Today, we achieved yet another milestone by offering engineering leaders a trustworthy and expansive set of benchmarks derived from Jellyfish's unrivaled customer base, providing the most comprehensive thresholds to measure against in the engineering industry."
Engineering teams that opt in will have their data anonymized and added to the Jellyfish benchmarking pool. For every metric an engineering organization defines as important, customers can access a visualization that compares their performance on a percentile basis against their peers or against their own organization — truly an industry first.
Among the key metrics being announced with Jellyfish Benchmarks are those focused on allocation, delivery, productivity, and collaboration, including key metrics within the DevOps methodology. Allocation benchmarks allow teams to understand how their resource investments compare to other engineering organizations. Delivery can be benchmarked using metrics such as cycle time, productivity using metrics like issues resolved, and collaboration using metrics such as PR reviews. New benchmarking capabilities also enable DevOps teams to track against the DORA metrics: deployment frequency, lead time, mean time to recovery, and change failure rate.
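As a rough illustration of the percentile-basis comparison described above (this is a hypothetical sketch, not Jellyfish's actual methodology; the peer data shown is invented), ranking one team's metric against an anonymized peer pool could look like this:

```python
from bisect import bisect_left

def percentile_rank(peer_values, value):
    """Return the percentile (0-100) of `value` within the peer pool,
    i.e. the share of peers that fall strictly below it."""
    ordered = sorted(peer_values)
    below = bisect_left(ordered, value)  # count of peers below `value`
    return 100.0 * below / len(ordered)

# Hypothetical anonymized peer deployment frequencies (deploys per week)
peers = [1, 2, 3, 5, 8, 10, 14, 20, 25, 40]

print(percentile_rank(peers, 14))   # this team's deployment frequency
```

A team deploying 14 times per week would sit at the 60th percentile of this invented pool, which is the kind of at-a-glance context the benchmarks are meant to provide.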
"By offering the industry's only in-app benchmarking, Jellyfish is empowering engineering leaders to use context driven by data to examine the efficacy of their engineering strategy and their team's operations," said Krishna Kannan, head of product at Jellyfish. "Jellyfish Benchmarks is available for every metric tracked across teams or the entire organization, helping customers understand how their company compares for the metrics that matter most to their engineering function."
Jellyfish Benchmarks also provide data-driven insights with which to inform and influence non-engineering executives, driving better strategic decisions and aligning engineering more closely with the wider business. The announcement of Jellyfish Benchmarks enhances the existing capabilities of the Jellyfish Engineering Management Platform (EMP) to measure engineering alignment against strategic business priorities and the performance and operations of engineering teams.