In this age of software eating the world, implementation speed is of the essence. Organizations that cannot update their systems quickly and effectively will find themselves left behind by more nimble competitors and become easy prey for cybercriminals.
A key limiting factor in software deployment is that the same application often must run in different locations and on different devices, and in some cases serve different use cases. This requires multiple configurations to accompany the application code.
Too Many Trees in the Forest
From a development perspective, maintaining a separate asset for each configuration is cumbersome and inefficient: every variation of the configuration produces its own artifact, and each of those artifacts must be built, tested and verified separately, which slows down the software release process.
A famous example of this problem was the firmware for HP's LaserJet line of products, which includes printers, copiers, scanners and other peripherals. With about two dozen different products to support, each with its own software and multiple versions, release cycles began to slow down dramatically, and development teams had to devote the vast majority of their attention to integrating and testing code rather than developing new features. To top it off, development costs jumped on the order of 250% in one year.
Ultimately, the company solved the problem by decoupling configuration data management from the application code, each under its own version control. This allowed the development team to create a single software release for all products while allowing the much smaller configuration data to differ for each device.
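To make the idea concrete, here is a minimal sketch in Python, purely illustrative rather than a description of HP's actual firmware, of a single codebase that reads a small device-specific configuration file at startup. The directory layout, file names and keys are assumptions for the example.

```python
# Minimal sketch (illustrative only, not HP's actual firmware design): a single
# codebase that selects per-device behavior from a small configuration file
# instead of shipping a separate build for every product variant.
# The directory layout, file names and keys below are assumptions.
import json
from pathlib import Path

CONFIG_DIR = Path("config")  # e.g. config/device-a.json, config/device-b.json


def load_device_config(device_model: str) -> dict:
    """Load the small, device-specific configuration for one product variant."""
    with (CONFIG_DIR / f"{device_model}.json").open() as f:
        return json.load(f)


def start_application(device_model: str) -> None:
    cfg = load_device_config(device_model)
    # One shared codebase; only the configuration data differs per device.
    print(f"Starting {device_model}: duplex={cfg.get('duplex', False)}, "
          f"max_dpi={cfg.get('max_dpi', 600)}")


if __name__ == "__main__":
    start_application("device-a")
```

The point is that supporting a new product variant means adding one small data file rather than building, testing and verifying another software artifact.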
This new way of developing naturally requires some changes on the part of the developer. For one thing, configuration needs its own version control, with a separate deployment process for releasing configuration changes. And because this is decoupled from the application deployment process, it follows its own development and release cadence.
Having to change much smaller pieces of configuration makes the versioning process more efficient and less error prone, which in turn shortens cycle times by producing fewer bugs and allowing software to be released at a faster pace. At the same time, it makes development more flexible and dynamic because the software no longer has to be redeployed to every device for each configuration change.
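The same separation is what lets a configuration change reach devices without a full software deployment. The sketch below, again illustrative and assuming a simple file-polling approach, shows a running service picking up a replaced configuration file on its own.

```python
# Illustrative sketch: because configuration lives outside the application,
# a change can be rolled out by replacing one small file and letting the running
# service pick it up, with no application redeployment.
# The polling approach and file path are assumptions for the example.
import json
import time
from pathlib import Path

CONFIG_PATH = Path("config/device-a.json")


def watch_config(poll_seconds: float = 5.0):
    """Yield the parsed configuration each time the file changes on disk."""
    last_mtime = None
    while True:
        mtime = CONFIG_PATH.stat().st_mtime
        if mtime != last_mtime:
            last_mtime = mtime
            yield json.loads(CONFIG_PATH.read_text())
        time.sleep(poll_seconds)


if __name__ == "__main__":
    for cfg in watch_config():
        # Apply the new settings to the running service without restarting it.
        print("Applying configuration version", cfg.get("version", "unknown"))
```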
Consolidated Code
In HP's case, the results were dramatic. Build cycle times dropped from more than a week to less than three hours, resulting in the completion of 10 to 15 builds per day. Commits jumped from one per day to more than 100, and regression test cycle time was cut from six weeks to less than 24 hours. Most importantly, the time that developers spent developing new features grew from 5% to 40% of their total working hours.
In addition to creating a single codebase, HP instituted several other key measures that greatly improved its development performance. The first was the creation of self-testing builds, using automated unit, acceptance and integration tests that could be run continually against the main code base. While this was a fairly heavy feat (six server racks holding 500 physical servers, each running four virtual machines), it allowed the team to cut upwards of 15,000 hours of testing per day and push much of the physical testing upstream, where it is cheaper and easier to perform.
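As a rough illustration of what a self-testing build might run on every change, the following pytest-style check, with assumed paths and keys, validates that every per-device configuration file parses and carries the required fields.

```python
# Minimal sketch of the kind of automated check a self-testing build might run on
# every change: a pytest-style test that verifies each per-device configuration
# file parses and contains the required keys. Paths and keys are assumptions.
import json
from pathlib import Path

import pytest

CONFIG_DIR = Path("config")
REQUIRED_KEYS = {"duplex", "max_dpi"}


@pytest.mark.parametrize("config_file", sorted(CONFIG_DIR.glob("*.json")),
                         ids=lambda p: p.name)
def test_device_config_is_valid(config_file):
    cfg = json.loads(config_file.read_text())
    missing = REQUIRED_KEYS - cfg.keys()
    assert not missing, f"{config_file.name} is missing keys: {missing}"
```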
HP also created a fast feedback loop capable of alerting developers to bugs within a few hours rather than the typical full day, by running a full regression test for each new change. This was coupled with a new rule that brought the development cycle to a full stop whenever someone broke the build or the test suite, since there was zero tolerance for broken builds persisting and interfering with new development. This was supplemented by a chat room that allowed any problem to be resolved quickly and the development pipeline to be restored at a higher quality than before. No one was allowed to leave work unless the build was green, and no code commits were allowed on a red build.
Out with the Bad
Further, the team started using gated code commits, which allow new code to integrate with the main code base only after the change has passed all of the tests and checks, and not before. This effectively isolated the main development process from potential harm: instead of 200 programmers sitting idle because of bad code or bad tests, a broken change would affect maybe 25.
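A minimal sketch of such a gate, assuming Git and a pytest test suite and not reflecting HP's actual tooling, might look like this: the candidate change is merged onto a throwaway integration branch, the full suite runs, and only a green result is allowed to land on the main branch.

```python
# Illustrative gated-commit check (not HP's actual tooling): a candidate change
# reaches the main code base only after the full test suite passes against the
# combined result. Branch names and the test command are assumptions.
import subprocess
import sys


def run(cmd: list[str]) -> None:
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


def gated_merge(candidate: str, main: str = "main") -> None:
    # Build a throwaway integration branch of main plus the candidate change.
    run(["git", "checkout", "-B", "integration-check", main])
    run(["git", "merge", "--no-ff", candidate])
    try:
        run(["python", "-m", "pytest"])  # the gate: the whole suite must be green
    except subprocess.CalledProcessError:
        sys.exit(f"Tests failed; {candidate} was not integrated into {main}.")
    # Only a passing result is allowed to land on the main branch.
    run(["git", "checkout", main])
    run(["git", "merge", "--ff-only", "integration-check"])


if __name__ == "__main__":
    gated_merge(sys.argv[1])
```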
Company-wide, this led to dramatic improvements in the quality of the LaserJet product line. Previously, the marketing team had basically given up asking for new features because they knew developers would deliver on maybe two of them, and even that would take six to 12 months, an eternity in such a competitive, highly dynamic market. Under the new system, everything is accelerated, from the development of new features to the repair of existing ones and even the removal of tools no longer favored by users.
HP's experience can be replicated across a wide range of devices, literally anything that runs code. Point of sale solutions in particular would benefit greatly from a wide range of new services, such as automated purchasing and data collection that can assist both shoppers and sales associates while lowering costs.
As can be expected, however, this kind of conversion requires a fair amount of expertise, not just in software development but in infrastructure services as well. Quite often, companies lack the visibility into their own development processes needed to determine where the bottlenecks are, let alone to devise a set of best practices for improving flexibility and agility.
This means an audit is necessary to determine the current state of the software development process and the application deployment architecture, as well as the procedures in place to support them. Few organizations maintain these capabilities internally, which is why it is good practice to bring in a specialist.
In software development, as in business, simpler is usually better and less costly. Removing complex arrays of code branches can do wonders for productivity and reliability, and better satisfies the digital economy's constant demand for updates and new services. And most importantly, it lifts a huge weight off the shoulders of developers and finally allows them to do what they do best: innovate.
Industry News
Sonar announced two new product capabilities for today’s AI-driven software development ecosystem.
Redgate announced a wide range of product updates supporting multiple database management systems (DBMS) across its entire portfolio, designed to support IT professionals grappling with today’s complex database landscape.
Elastic announced support for Google Cloud’s Vertex AI platform in the Elasticsearch Open Inference API and Playground.
SmartBear has integrated the load testing engine of LoadNinja into its automated testing tool, TestComplete.
Check Point® Software Technologies Ltd. announced the completion of its acquisition of Cyberint Technologies Ltd., a highly innovative provider of external risk management solutions.
Lucid Software announced a robust set of new capabilities aimed at elevating agile workflows for both team-level and program-level planning.
Perforce Software announced the Hadoop Service Bundle, a new professional services and support offering from OpenLogic by Perforce.
CyberArk announced the successful completion of its acquisition of Venafi, a provider of machine identity management, from Thoma Bravo.
Inflectra announced the launch of its AI-powered SpiraApps.
The former Synopsys Software Integrity Group has rebranded as Black Duck® Software, a newly independent application security company.
Check Point® Software Technologies Ltd. announced that it has been recognized as a Visionary in the 2024 Gartner® Magic Quadrant™ for Endpoint Protection Platforms.
Harness expanded its strategic partnership with Google Cloud, focusing on new integrations leveraging generative AI technologies.
OKX announced the launch of OKX OS, an onchain infrastructure suite.