The Future Is Smart: Cloud Native + AI
April 21, 2022

Tobi Knaup
D2iQ

Leading organizations around the world are adopting cloud native technologies to build next-generation products and achieve the agility they need to stay ahead of their competition. Although cloud native and Kubernetes are highly disruptive technologies, another technology is probably the most disruptive of our generation: artificial intelligence (AI) and its subset, machine learning (ML).

We already see AI in digital assistants like Siri and Alexa, chatbots on websites, and recommendation engines on retail sites. In the near future, AI will be embedded in almost all the products that surround us, from self-driving cars to next-generation medical devices.

Organizations that are building cloud-native applications today will need to evolve their capabilities to manage AI workloads because the next generation of cloud-native applications will have AI at their core. We call those "smart cloud-native" applications because they have AI built in.

Kubernetes: A Perfect Match for AI

Kubernetes has become the enterprise cloud-native platform of choice and is a natural fit for running AI and ML workloads for a number of reasons:

■ Kubernetes can easily scale to meet the resource needs of AI/ML training and production workloads.

■ Kubernetes enables sharing of expensive and limited resources, such as graphics processing units (GPUs), among developers to speed up development and lower costs.

■ Kubernetes provides a layer of abstraction that enables data scientists to access the services they require without worrying about the details of the underlying infrastructure.

■ Kubernetes gives organizations the agility to deploy and manage AI/ML operations across public clouds, private clouds, on-premises data centers, and secure air-gapped locations, and to easily change and migrate deployments without incurring excess cost.

■ A smart cloud-native business application consists of a number of components, including microservices, data services, and AI/ML pipelines. Kubernetes provides a single consistent platform on which to run all of these workloads, rather than running them in silos, which simplifies deployment and management and minimizes cost.

■ As an open-source cloud-native platform, Kubernetes enables organizations to apply cloud-native best practices and take advantage of continuous open-source innovation. Many of the modern AI/ML technologies are open source as well and come with native Kubernetes integration.
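The GPU-sharing and abstraction points above come together in an ordinary pod manifest. The sketch below is illustrative only: the image name is a placeholder, and the `nvidia.com/gpu` resource assumes a cluster where the NVIDIA device plugin exposes GPUs as an extended resource. A data scientist declares what the workload needs; the scheduler finds a suitable node.

```yaml
# Illustrative training pod; assumes the NVIDIA device plugin
# advertises GPUs as the extended resource "nvidia.com/gpu".
apiVersion: v1
kind: Pod
metadata:
  name: train-job
spec:
  restartPolicy: Never
  containers:
  - name: trainer
    image: registry.example.com/ml/trainer:latest  # placeholder image
    resources:
      limits:
        nvidia.com/gpu: 1  # scheduler places the pod on a node with a free GPU
```

Because the GPU request is declarative, the same manifest runs unchanged on any conformant cluster that has GPU nodes, whether in a public cloud, a private data center, or an air-gapped environment.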

Smart Cloud-Native Challenges

Organizations that want to build smart cloud-native apps must also learn how to deploy those workloads in the cloud, in data centers, and at the edge. AI is a relatively young field, so best practices for putting AI applications into production are few and far between. The good news is that many of the existing best practices for putting cloud-native applications into production transfer easily to AI applications.

However, AI-driven smart cloud-native applications pose additional challenges for operators once in production because AI and ML pipelines are complex workloads made up of many components that run elastically and need to be updated frequently. This means that organizations need to start building operational capabilities around those AI workloads.
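What "elastic and frequently updated" means operationally can be sketched with standard Kubernetes objects. The fragment below is a hypothetical example, not a prescribed setup: a model-serving Deployment that rolls out new model versions with zero downtime, paired with a HorizontalPodAutoscaler that scales replicas with load.

```yaml
# Illustrative model-serving workload: rolling updates plus autoscaling.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-server
spec:
  replicas: 2
  selector:
    matchLabels:
      app: model-server
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0  # keep serving while a new model version rolls out
  template:
    metadata:
      labels:
        app: model-server
    spec:
      containers:
      - name: server
        image: registry.example.com/ml/model-server:v2  # placeholder image
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: model-server
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: model-server
  minReplicas: 2
  maxReplicas: 10
  metrics:
  - type: Resource
    resource:
      name: cpu
      target:
        type: Utilization
        averageUtilization: 70  # add replicas when average CPU exceeds 70%
```

The operational capabilities referred to here amount to owning objects like these across every component of an AI/ML pipeline, and keeping them healthy as models, data, and code all change on their own schedules.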

Cloud-native technologies have been around for about a decade, and enterprises are increasingly moving their most mission-critical workloads to cloud-native platforms like Kubernetes. This creates a slew of new challenges for organizations:

■ First, because those workloads are mission-critical, operations teams bear a much higher burden to keep them running 24/7 while ensuring they are resilient, scalable, and secure.

■ Second, those workloads tend to include more sophisticated technologies like data workloads, AI workloads, and machine learning workloads, which have their own operational challenges.

■ Third, modern cloud-native applications tend to run on a broad range of infrastructures, from a cloud provider or multiple cloud providers to data centers and edge deployments.

A Firm and Future-Proof Foundation

Organizations that want to adopt cloud-native technology must figure out how to address these challenges. To do this they need to change their workflows and culture to take full advantage of cloud native’s potential. They must learn how to build applications in a cloud-native way and to adopt the technologies that enable them to put those applications into production in a resilient and repeatable way.

The speed of innovation in the cloud-native ecosystem is unparalleled. Organizations that can keep pace with that innovation and learn how to adopt cloud-native and AI technologies will be able to build highly differentiated products that can put them ahead of their competition. They will be able to build their next-generation products much faster and in a more agile way, and they will be able to leverage AI to build smarter products.

Tobi Knaup is Co-Founder and CEO of D2iQ