Software engineers are currently caught between a rock and a hard place.
The rock? They're under record pressure to produce and release new software.
The hard place? They're increasingly expected to account for the safety, security and provenance of every single software asset they use in those builds. That's demonstrated in the rise of the Software Bill of Materials (SBOM).
These two clashing requirements are a source of great anxiety for software engineers, who are now forced to learn a new discipline while being simultaneously expected to do their existing jobs faster.
The Rise of the SBOM
SBOMs have risen to prominence because of a growing need for transparency in the software development process.
It's easy to understand how this has come about. The supply chain clearly needs greater transparency, and it continues to be the source of much insecurity and technology failure. In July 2024, for example, a faulty update to the widely used CrowdStrike Falcon security software caused outages for millions of machines, paralyzing businesses and educational institutions alike. In January 2025, another faulty update caused outages for many organizational Slack users, who found they couldn't use one of their most fundamental business communication tools.
As the world becomes more tightly locked in the digital realm, we become ever more vulnerable to risks in the supply chain. Furthermore, there's a growing body of regulation, including a variety of presidential Executive Orders and National Institute of Standards and Technology (NIST) guidelines, that compels organizations to account for those risks or face compliance penalties.
All of this has led to the rise of the SBOM, in which the components and dependencies of a piece of software are cataloged for the inspection of clients, channel partners, regulators and subsequent links in the supply chain.
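To make that concrete, here is a minimal, illustrative sketch of the kind of record an SBOM holds, loosely modeled on the CycloneDX JSON format. The component names and versions are invented for illustration, not taken from any real release.

```python
import json

# A minimal, CycloneDX-style SBOM fragment (illustrative only):
# each component records what was shipped, which version, and a
# package URL (purl) that downstream consumers can look up.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "requests",            # hypothetical dependency
            "version": "2.31.0",
            "purl": "pkg:pypi/requests@2.31.0",
        },
        {
            "type": "library",
            "name": "openssl",             # hypothetical dependency
            "version": "3.0.13",
            "purl": "pkg:generic/openssl@3.0.13",
        },
    ],
}

with open("sbom.json", "w") as f:
    json.dump(sbom, f, indent=2)
```

Anyone downstream, whether a customer, a regulator or a later link in the supply chain, can read that file and see exactly which components went into the release.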
While this is a broadly positive development, it is also a cause of great worry for software engineers who are now being forced to account for every single component used within their releases.
"That's Not My Job"
It's important to note that software engineers are not security professionals, but in some important ways, they are now being asked to be.
Software engineers pick and choose from various third-party and open source components and libraries. They do so, for the most part, with little analysis of the security of those components. Those components can be, or can become, vulnerable in a whole variety of ways: once-reliable code repositories can become outdated or vulnerable, zero-days can emerge in trusted libraries, and malicious actors can, and often do, infect the supply chain. On top of that, risk profiles can change overnight, turning what was a well-considered design choice into a vulnerable one.
Software engineers rarely had to consider these things before, yet the arrival of the SBOM is forcing them to. Customers can now scrutinize their releases and potentially reject them or send them back for fixing, resulting in even more work on short notice and piling on pressure. If the risk profile of a particular component changes between the creation of an SBOM and a customer's review of it, the release might be rejected.
This is understandably the cause of much frustration for software engineers who are often already under great pressure. The structural conditions that are now bearing down on software engineers can't be dismissed, but they can be accommodated while taking the stress off already-stressed development teams.
Assisting Engineers Around the Learning Curve
Principally, software dev teams need to know how the build decisions they make become vulnerable, so they can design with security in mind and create reliable SBOMs.
That will mean integrating security insight into the development pipelines that software engineers work within. For example, dev teams should be able to change and track dependencies and know immediately whether a particular choice will create software failures or security vulnerabilities down the line. It should also tell them when the components and libraries they once thought reliable have become vulnerable or outdated.
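As a sketch of what that pipeline integration might look like, the snippet below compares the SBOM from the previous build against the one from the current build and reports any component that was added, removed or changed, so a risky dependency swap is visible the moment it happens. The file names and the CycloneDX-style structure are assumptions for illustration, not a prescription for any particular tool.

```python
import json

def load_components(path):
    """Map component name -> version from a CycloneDX-style SBOM file."""
    with open(path) as f:
        bom = json.load(f)
    return {c["name"]: c.get("version", "unknown")
            for c in bom.get("components", [])}

# Hypothetical file names: the SBOM from the last release and the one
# produced by the build that is running right now.
previous = load_components("sbom-previous.json")
current = load_components("sbom-current.json")

added = current.keys() - previous.keys()
removed = previous.keys() - current.keys()
changed = {name for name in current.keys() & previous.keys()
           if current[name] != previous[name]}

for name in sorted(added):
    print(f"ADDED    {name} {current[name]}")
for name in sorted(removed):
    print(f"REMOVED  {name} (was {previous[name]})")
for name in sorted(changed):
    print(f"CHANGED  {name} {previous[name]} -> {current[name]}")

# A CI job could fail the build here if anything unexpected appears,
# prompting a review before the change reaches the release.
```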
Threat detection scans can add another layer of insight, offering a look at the source code level and providing a risk profile of both a release's behavior and its dependencies. It's this kind of insight during development that will allow software engineers to climb the steep learning curve the SBOM presents. It will also furnish development teams with proof points that they designed a given release securely, even if that release has since become outdated or vulnerable. The ability to track and catalog changes in the risk profile of components and dependencies must also extend past the date of release, so that engineers can stay involved in the continuing security of their creations.
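One way to keep that risk profile current after release is to periodically re-check the SBOM's components against a public vulnerability database. The sketch below queries the OSV.dev API for each component listed in a CycloneDX-style SBOM; it assumes the components are Python packages and that the sbom.json file from the earlier sketch exists, so treat it as an outline rather than a finished scanner.

```python
import json
import urllib.request

def known_vulnerabilities(name, version, ecosystem="PyPI"):
    """Query the public OSV.dev database for advisories affecting a package."""
    payload = json.dumps({
        "package": {"name": name, "ecosystem": ecosystem},
        "version": version,
    }).encode()
    req = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("vulns", [])

with open("sbom.json") as f:
    components = json.load(f).get("components", [])

# Re-running this on a schedule catches components whose risk profile
# changed after the release shipped, not just at build time.
for component in components:
    vulns = known_vulnerabilities(component["name"], component["version"])
    status = f"{len(vulns)} known advisories" if vulns else "no known advisories"
    print(f'{component["name"]} {component["version"]}: {status}')
```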
SBOMs are here to stay, and rightly so. We've reached a global level of digital complexity at which we have to know what kinds of components and technologies we're dealing with at every stage of the supply chain. A lot of that added responsibility is now falling to software engineers, and they need ways to ease the mounting pressure. The SBOM shouldn't be resisted; it should be accommodated, and building SBOM creation into the development process can ease that pressure and help engineers adapt to the new reality.
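As one example of building SBOM creation into the process itself, a build script can generate the SBOM automatically rather than leaving it as a manual chore. The sketch below shells out to Syft, one commonly used open source generator; the command line and output handling reflect Syft's documented conventions but should be verified against whichever tool your team actually standardizes on.

```python
import subprocess

# Generate a CycloneDX SBOM for the current project directory with Syft.
# Assumes the `syft` CLI is installed and on PATH; swap in whatever
# SBOM generator your toolchain uses.
result = subprocess.run(
    ["syft", "dir:.", "-o", "cyclonedx-json"],
    capture_output=True,
    text=True,
    check=True,
)

with open("sbom-current.json", "w") as f:
    f.write(result.stdout)

print("SBOM written to sbom-current.json")
```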
Industry News
CodeRabbit is now available on the Visual Studio Code editor.
The integration brings CodeRabbit’s AI code reviews directly into Cursor, Windsurf, and VS Code at the earliest stages of software development—inside the code editor itself—at no cost to the developers.
Chainguard announced Chainguard Libraries for Python, an index of malware-resistant Python dependencies built securely from source on SLSA L2 infrastructure.
Sysdig announced the donation of Stratoshark, the company’s open source cloud forensics tool, to the Wireshark Foundation.
Pegasystems unveiled Pega Predictable AI™ Agents that give enterprises extraordinary control and visibility as they design and deploy AI-optimized processes.
Kong announced the introduction of the Kong Event Gateway as a part of their unified API platform.
Azul and Moderne announced a technical partnership to help Java development teams identify, remove and refactor unused and dead code to improve productivity and dramatically accelerate modernization initiatives.
Parasoft has added Agentic AI capabilities to SOAtest, featuring API test planning and creation.
Zerve unveiled a multi-agent system engineered specifically for enterprise-grade data and AI development.
LambdaTest, a unified agentic AI and cloud engineering platform, has announced its partnership with MacStadium, the industry-leading private Mac cloud provider enabling enterprise macOS workloads, to accelerate its AI-native software testing by leveraging Apple Silicon.
Tricentis announced a new capability that injects Tricentis’ AI-driven testing intelligence into SAP’s integrated toolchain, part of RISE with SAP methodology.
Zencoder announced the launch of Zen Agents, delivering two innovations that transform AI-assisted development: a platform enabling teams to create and share custom agents organization-wide, and an open-source marketplace for community-contributed agents.
AWS announced the preview of the Amazon Q Developer integration in GitHub.
The OpenSearch Software Foundation, the vendor-neutral home for the OpenSearch Project, announced the general availability of OpenSearch 3.0.