Docker, Inc.® announced Docker Hardened Images (DHI), a curated catalog of security-hardened, enterprise-grade container images designed to meet today’s toughest software supply chain challenges.
Leading organizations around the world are adopting cloud-native technologies to build next-generation products and achieve the agility they need to stay ahead of their competition. Cloud native and Kubernetes are highly disruptive technologies, but another is arguably the most disruptive of our generation: artificial intelligence (AI) and its subset, machine learning (ML).
We already see AI in digital assistants like Siri and Alexa, in chatbots on websites, and in recommendation engines on retail sites. In the near future, AI will be embedded in almost all the products that surround us, from self-driving cars to next-generation medical devices.
Organizations that are building cloud-native applications today will need to evolve their capabilities to manage AI workloads because the next generation of cloud-native applications will have AI at their core. We call those "smart cloud-native" applications because they have AI built in.
Kubernetes: A Perfect Match for AI
Kubernetes has become the enterprise cloud-native platform of choice and is a natural fit for running AI and ML workloads for a number of reasons:
■ Kubernetes can easily scale to meet the resource needs of AI/ML training and production workloads.
■ Kubernetes enables sharing of expensive and limited resources, such as graphics processing units (GPUs), among developers to speed up development and lower costs (a minimal sketch of requesting a GPU follows this list).
■ Kubernetes provides a layer of abstraction that enables data scientists to access the services they require without worrying about the details of the underlying infrastructure.
■ Kubernetes gives organizations the agility to deploy and manage AI/ML operations across public clouds, private clouds, on-premises environments, and secure air-gapped locations, and to change or migrate deployments without incurring excess cost.
■ A smart cloud-native business application consists of a number of components, including microservices, data services, and AI/ML pipelines. Kubernetes provides a single consistent platform on which to run all of these workloads, rather than running them in silos, which simplifies deployment and management and minimizes cost.
■ As an open-source cloud-native platform, Kubernetes enables organizations to apply cloud-native best practices and take advantage of continuous open-source innovation. Many of the modern AI/ML technologies are open source as well and come with native Kubernetes integration.
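To illustrate the GPU-sharing point above, here is a minimal sketch, not taken from the article, of how a data science team might request a single GPU for a training job using the official Kubernetes Python client. The image name, namespace, and GPU count are illustrative assumptions; the device plugin exposes GPUs as an extended resource, so they are scheduled alongside CPU and memory.

```python
# Minimal sketch: submit a one-off training pod that requests one GPU.
# Image name, namespace, and labels are hypothetical placeholders.
from kubernetes import client, config


def submit_training_pod():
    config.load_kube_config()  # reads the local kubeconfig

    container = client.V1Container(
        name="trainer",
        image="example.registry.local/ml-trainer:latest",  # hypothetical image
        command=["python", "train.py"],
        resources=client.V1ResourceRequirements(
            # GPUs are requested like any other resource limit.
            limits={"nvidia.com/gpu": "1"},
        ),
    )
    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(
            name="ml-training-job", labels={"team": "data-science"}
        ),
        spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
    )
    client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)


if __name__ == "__main__":
    submit_training_pod()
```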
Smart Cloud-Native Challenges
Organizations that want to build smart cloud-native apps must also learn how to deploy those workloads in the cloud, in data centers, and at the edge. AI as a field is relatively young, so best practices for putting AI applications into production are few and far between. The good news is that many of the existing best practices for putting cloud-native applications into production transfer easily to AI applications.
However, AI-driven smart cloud-native applications pose additional challenges for operators once in production because AI and ML pipelines are complex workloads made up of many components that run elastically and need to be updated frequently. This means that organizations need to start building operational capabilities around those AI workloads.
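As one concrete illustration of that elasticity, the sketch below, which assumes a hypothetical model-server Deployment already running in the default namespace, shows how an operations team might attach a HorizontalPodAutoscaler with the official Kubernetes Python client so an inference component scales out and back in with demand.

```python
# Minimal sketch: autoscale a hypothetical "model-server" Deployment
# between 1 and 10 replicas based on CPU utilization.
from kubernetes import client, config


def autoscale_model_server():
    config.load_kube_config()

    hpa = client.V1HorizontalPodAutoscaler(
        metadata=client.V1ObjectMeta(name="model-server-hpa"),
        spec=client.V1HorizontalPodAutoscalerSpec(
            scale_target_ref=client.V1CrossVersionObjectReference(
                api_version="apps/v1", kind="Deployment", name="model-server"
            ),
            min_replicas=1,
            max_replicas=10,
            target_cpu_utilization_percentage=70,  # scale out above 70% CPU
        ),
    )
    client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
        namespace="default", body=hpa
    )


if __name__ == "__main__":
    autoscale_model_server()
```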
Cloud-native technologies have been around for about a decade, and enterprises are increasingly moving their most mission-critical workloads to cloud-native platforms like Kubernetes. This creates a slew of new challenges for organizations:
■ First, because those workloads are so mission-critical, they place a much higher burden on operations teams to keep them running 24/7 while making sure they are resilient, scalable, and secure.
■ Second, those workloads tend to include more sophisticated technologies like data workloads, AI workloads, and machine learning workloads, which have their own operational challenges.
■ Third, modern cloud-native applications tend to run on a broad range of infrastructures, from one or more cloud providers to data centers and edge deployments.
A Firm and Future-Proof Foundation
Organizations that want to adopt cloud-native technology must figure out how to address these challenges. To do this, they need to change their workflows and culture to take full advantage of cloud native’s potential. They must learn how to build applications in a cloud-native way and adopt the technologies that enable them to put those applications into production in a resilient and repeatable way.
The speed of innovation in the cloud-native ecosystem is unparalleled. Organizations that can keep pace with that innovation and learn how to adopt cloud-native and AI technologies will be able to build highly differentiated products that can put them ahead of their competition. They will be able to build their next-generation products much faster and in a more agile way, and they will be able to leverage AI to build smarter products.
Industry News
GitHub announced that GitHub Copilot now includes an asynchronous coding agent, embedded directly in GitHub and accessible from VS Code—creating a powerful Agentic DevOps loop across coding environments.
Red Hat announced its integration with the newly announced NVIDIA Enterprise AI Factory validated design, helping to power a new wave of agentic AI innovation.
JFrog announced the integration of its foundational DevSecOps tools with the NVIDIA Enterprise AI Factory validated design.
GitLab announced the launch of GitLab 18, including AI capabilities natively integrated into the platform and major new innovations across core DevOps, security, and compliance workflows, available now, with further enhancements planned throughout the year.
Perforce Software is partnering with Siemens Digital Industries Software to transform how smart, connected products are designed and developed.
Reply launched Silicon Shoring, a new software delivery model powered by Artificial Intelligence.
CIQ announced the tech preview launch of Rocky Linux from CIQ for AI (RLC-AI), an operating system engineered and optimized for artificial intelligence workloads.
The Linux Foundation, the nonprofit organization enabling mass innovation through open source, announced the launch of the Cybersecurity Skills Framework, a global reference guide that helps organizations identify and address critical cybersecurity competencies across a broad range of IT job families, extending beyond cybersecurity specialists.
CodeRabbit is now available in the Visual Studio Code editor. The integration brings CodeRabbit’s AI code reviews into Cursor, Windsurf, and VS Code at the earliest stages of software development, inside the code editor itself, at no cost to developers.
Chainguard announced Chainguard Libraries for Python, an index of malware-resistant Python dependencies built securely from source on SLSA L2 infrastructure.
Sysdig announced the donation of Stratoshark, the company’s open source cloud forensics tool, to the Wireshark Foundation.
Pegasystems unveiled Pega Predictable AI™ Agents that give enterprises extraordinary control and visibility as they design and deploy AI-optimized processes.
Kong announced the introduction of the Kong Event Gateway as a part of their unified API platform.
Azul and Moderne announced a technical partnership to help Java development teams identify, remove and refactor unused and dead code to improve productivity and dramatically accelerate modernization initiatives.