2024 DevOps Predictions - Part 8
December 14, 2023

Industry experts offer thoughtful, insightful, and often controversial predictions on how DevOps and related technologies will evolve and impact business in 2024. Part 8 covers AI's impact on DevOps and development.

Start with: 2024 DevOps Predictions - Part 1

Go to: 2024 DevOps Predictions - Part 2

Go to: 2024 DevOps Predictions - Part 3

Go to: 2024 DevOps Predictions - Part 4

Go to: 2024 DevOps Predictions - Part 5

Go to: 2024 DevOps Predictions - Part 6

Go to: 2024 DevOps Predictions - Part 7

AI CHANGES EXPECTATIONS OF DEVELOPER PRODUCTIVITY

Automation tools will make a more visible impact on developer velocity and on how developers' work is measured. This year's explosion of AI and ML is driving an unparalleled shift in business productivity expectations. In 2024, broader access to AI- and ML-driven automation tools will continue to raise the benchmarks for code quality, reliability, and security, in response to the escalating demand for faster software delivery.
Sairaj Uddin
SVP of Technology, The Trade Desk

FRONTRUNNERS WILL CRACK THE PRODUCTIVITY CODE

We keep talking about first-mover advantage, but in 2024 we'll see that advantage disappear as best practices emerge by mid-year and gain widespread adoption later in the year. By the end of 2024, companies that have adopted AI-assisted code tools and cracked the code on how to use AI well will be outperforming those that have not. For DevOps teams already on this bandwagon, information sharing and increased productivity will become standard, enabling them to monetize and scale their successes.
Wing To
GM of Intelligent DevOps, Digital.ai

AI ADVANCES ABSTRACTION IN PROGRAMMING

We will see a significant leap in how AI advances abstraction for developers. As developers have looked to increase efficiencies, they have abstracted out the common and mundane tasks. Each new language, framework, and SDK that comes along abstracts another level of tasks that developers don't need to worry about. AI will take abstraction to the next level. AI-powered reference architectures will give developers a jump on starting new projects or lend a hand when solving complex problems. Developers will no longer begin with a blank slate. Instead, AI will help remove the intimidation of an empty page to jump start projects and streamline workflows.
Scott McAllister
Principal Developer Advocate, ngrok

AI TRANSFORMS KUBERNETES INTO THE AUTOMATIC TRANSMISSION OF THE CLOUD-NATIVE ERA

AI is poised to redefine how businesses utilize Kubernetes for application deployment and managing their infrastructure. Similar to how automatic transmissions streamlined driving, AI will become the automatic transmission for Kubernetes. AI will serve as the bridge between Kubernetes' inherent complexity and accessibility so that even entry-level team members will be able to efficiently navigate and manage Kubernetes environments. AI will act as an intelligent guide, simplifying intricate operations and offering real-time insights. It will not only automate issue detection but also empower less experienced staff to operate Kubernetes proficiently. This empowerment will optimize the workforce, reducing the need for extensive training or specialized knowledge. Consequently, businesses will be able to streamline their operations, reduce human intervention, and significantly cut operational costs, making the adoption of Kubernetes even more feasible and economical.
Mohan Atreya
SVP Product and Solutions, Rafay
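
To make the "intelligent guide" idea concrete, here is a minimal sketch (not from the prediction itself) of an AI assist for Kubernetes troubleshooting. It assumes kubectl is configured and an OpenAI-compatible chat endpoint is available; the model name, prompt wording, and pod names are illustrative placeholders.

```python
# Hypothetical sketch: ask an LLM to explain why a pod is unhealthy.
# Assumes kubectl is configured and OPENAI_API_KEY points at an
# OpenAI-compatible chat endpoint; model name and prompts are placeholders.
import subprocess
from openai import OpenAI


def describe_pod(name: str, namespace: str = "default") -> str:
    """Collect the raw `kubectl describe` output for one pod."""
    result = subprocess.run(
        ["kubectl", "describe", "pod", name, "-n", namespace],
        capture_output=True, text=True, check=True,
    )
    return result.stdout


def diagnose(name: str, namespace: str = "default") -> str:
    """Ask the model for a plain-language diagnosis and safe next steps."""
    client = OpenAI()
    report = describe_pod(name, namespace)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "You are a Kubernetes troubleshooting assistant. "
                        "Explain problems simply and suggest safe next steps."},
            {"role": "user", "content": report},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(diagnose("checkout-7d9f8", namespace="shop"))  # hypothetical pod
```

A wrapper like this is the kind of glue that lets less experienced staff ask "why is this pod failing?" without first mastering Kubernetes internals.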

APIS PLAY AN INCREASING ROLE IN AI

APIs will play an increasing role in the growth of AI. They will grow into the de-facto mechanism that AI agents use to increase their access to the world — both to integrate data from myriad sources, and to operate on the world around them.
Abhijit Kane
Co-Founder, Postman
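
One common way an API becomes an AI agent's "mechanism" today is the tool-calling pattern: the API is described to the model as a callable tool, the model decides when to call it, and ordinary HTTP does the real work. The sketch below illustrates that flow; the endpoint URL, tool name, and model are hypothetical, and it assumes the model chooses to call the tool.

```python
# Hypothetical sketch: exposing a REST API to an AI agent as a "tool".
# The weather endpoint, tool name, and model are placeholders; the flow
# is the standard chat-completions tool-calling pattern.
import json

import requests
from openai import OpenAI

WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Fetch current weather for a city from a REST API.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}


def get_weather(city: str) -> dict:
    """Call the (hypothetical) weather API on the agent's behalf."""
    resp = requests.get("https://api.example.com/weather", params={"city": city})
    resp.raise_for_status()
    return resp.json()


client = OpenAI()
first = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder
    messages=[{"role": "user", "content": "Is it raining in Oslo right now?"}],
    tools=[WEATHER_TOOL],
)

# Assumes the model decided to use the API; a real agent loop would check.
call = first.choices[0].message.tool_calls[0]
args = json.loads(call.function.arguments)   # e.g. {"city": "Oslo"}
print(get_weather(**args))                   # the API does the real work
```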

GENAI ACCELERATES OPEN SOURCE DEVELOPMENT

Generative AI will accelerate the impact of small, open source software teams. History shows that tiny teams — sometimes just one person! — can have outsized impact with open source. Generative AI is going to amplify this "open source impact effect" to incredible new levels. When we look at the cost of developing open source, actually writing the code itself isn't the expensive part. It's the documentation, bug handling, talking to people, responding to requests, checking examples of code on GitHub and more — all of which is very human-intensive. The open source community will benefit from generative AI for the same reason so many other efforts will: efficient elimination of tiresome human tasks. By helping with all of that, large language models will accelerate open source development this upcoming year, making smaller teams even more powerful.
Adrien Treuille
Director of Product Management and Head of Streamlit, Snowflake

AI ENABLES DEVOPS TO BE MORE PREDICTIVE

AI will enable DevOps and SecOps to become more predictive in nature. With the explosion of AI, DevOps teams will proactively adjust when they release and update software. In particular, DevOps and SecOps will work more closely together to anticipate and remediate threats through predictive DevOps.
Ed Frederici
CTO, Appfire
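
The prediction does not name a specific technique, but one simple form of "predictive" behavior in a pipeline is flagging an anomalous metric trend before a rollout proceeds. The sketch below uses a basic z-score check; the metric values and threshold are invented for illustration.

```python
# Hypothetical sketch: a tiny "predictive" gate for a release pipeline.
# Flags a deployment when the latest error rate deviates sharply from its
# recent baseline. The values are made up; in practice they would come
# from a monitoring system.
from statistics import mean, stdev


def looks_anomalous(error_rates: list[float], threshold: float = 3.0) -> bool:
    """Return True if the newest sample is an outlier versus the baseline."""
    baseline, latest = error_rates[:-1], error_rates[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > threshold


recent_error_rates = [0.4, 0.5, 0.4, 0.6, 0.5, 2.8]  # % failed requests per hour
if looks_anomalous(recent_error_rates):
    print("Hold the rollout: error-rate trend predicts trouble.")
else:
    print("Trend looks normal: proceed with the release.")
```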

ESTABLISHED FRAMEWORKS HAVE THE EDGE

More established frameworks are going to have a leg up compared to newer frameworks on which AI hasn't been as deeply trained. Which frameworks frontend devs choose to use and the unique skills they bring to the table will impact the job market and how development processes change.
Rita Kozlov
Senior Product Director, Cloudflare

DEV-TO-OPS RATIO INCREASES

The dev-to-ops headcount ratio will increase in 2024. Thanks to advances in AI co-pilots, developers are more efficient than ever. However, the Ops side remains slow to adopt and benefit from GenAI technologies. Unlike development, Ops functions require accurate and predictable outcomes — not a point of strength for LLMs. Many organizations have worked to apply AI models to production troubleshooting. Yet, none of these efforts have borne fruit, as production systems like Kubernetes and AWS are far too complex to be made fully autonomous. As development teams increasingly benefit from co-pilots and Ops teams become further mired in increasing complexity, the dev-to-ops headcount ratio has declined (i.e., fewer developers per Ops professional). This is highly problematic for the industry. Thankfully, we'll see a reversal of fortune in 2024 as AI models improve, producing greater efficiencies in Ops workflows, and as investment in DevOps platforms increases, creating better tooling and higher levels of abstraction.
Sheng Liang
Co-Founder and CEO, Acorn Labs

VECTOR DATABASES PLAY KEY ROLE IN TECH STACK

As new applications get built from the ground up with AI, and as LLMs become integrated into existing applications, I believe vector databases will play an increasingly important role in the tech stack, just as application databases have in the past. Teams will need scalable, easy-to-use, and operationally simple vector data storage as they seek to create AI-enabled products with new LLM-powered capabilities.
Avthar Sewrathan
GM for AI and Vector, Timescale
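
For readers new to the category, the toy below shows in miniature what a vector database does for LLM-powered features: store documents as embedding vectors and return the nearest neighbors to a query. The embeddings here are random stand-ins; a real system would use an embedding model and a dedicated vector store for persistence, indexing, and scale.

```python
# Hypothetical sketch: what a vector database does, in miniature.
# Documents are stored as embedding vectors and retrieved by cosine
# similarity. The embeddings are random stand-ins for illustration.
import numpy as np

rng = np.random.default_rng(0)
documents = ["reset a password", "rotate an API key", "deploy to staging"]
embeddings = rng.normal(size=(len(documents), 384))   # fake 384-dim embeddings


def top_k(query_vector: np.ndarray, k: int = 2) -> list[str]:
    """Return the k documents whose embeddings are closest to the query."""
    norms = np.linalg.norm(embeddings, axis=1) * np.linalg.norm(query_vector)
    scores = embeddings @ query_vector / norms        # cosine similarity
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]


query = rng.normal(size=384)   # would be an embedded user question in practice
print(top_k(query))
```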

MLOPS INTEGRATES WITH DEVOPS

In 2024, MLOps will increasingly integrate with DevOps to create more streamlined workflows for AI projects. The combination of MLOps and DevOps creates a set of processes and automated tools for managing data, code, and models that enhances the efficiency of machine learning platforms. Data scientists and software developers will be free to move on to high-value projects without having to manually oversee models. The trend is driven by the push to streamline the delivery of models to production and reduce time-to-value.
Haoyuan Li
Founder and CEO, Alluxio
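
A minimal sketch of what that integration can look like in practice (not taken from the prediction): a CI stage that trains, evaluates, and only promotes a model artifact if it clears a quality bar, the same way a failing test blocks a build. The dataset, threshold, and output path are illustrative.

```python
# Hypothetical sketch: an MLOps-style quality gate inside a CI pipeline.
# Train, evaluate, and promote the model artifact only if accuracy clears
# a threshold; otherwise fail the build like a failing test would.
import sys

import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
accuracy = model.score(X_test, y_test)

ACCURACY_GATE = 0.90  # illustrative quality bar
if accuracy < ACCURACY_GATE:
    print(f"Model rejected: accuracy {accuracy:.2f} below gate {ACCURACY_GATE}")
    sys.exit(1)                       # fail the pipeline

joblib.dump(model, "model.joblib")    # the artifact a CD stage would deploy
print(f"Model promoted with accuracy {accuracy:.2f}")
```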

THE RISE OF AI-AS-A-SERVICE

It's already possible to use OpenAI's ChatGPT in your own applications, but being able to model responses based on your own proprietary datasets will bring much more value to businesses. This raises issues of data sovereignty and confidentiality, which will drive the rise of not just cloud-based AI services but also the ability to run them in siloed cloud environments.
Ben Dechrai
Developer Advocate, Sonar
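
One common way teams address that sovereignty concern today is to run an OpenAI-compatible model server inside their own environment and point standard clients at it. The sketch below assumes such a self-hosted endpoint (for example, one served by vLLM in a private VPC); the hostname, model name, and question are placeholders.

```python
# Hypothetical sketch: keeping prompts and proprietary data in-house by
# pointing a standard client at a self-hosted, OpenAI-compatible endpoint.
# The hostname, model name, and question are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.internal.example.com/v1",  # siloed, company-controlled endpoint
    api_key="not-a-real-key",                        # auth handled inside the silo
)

response = client.chat.completions.create(
    model="in-house-llama",                          # whatever model the silo serves
    messages=[
        {"role": "system", "content": "Answer using only internal policy documents."},
        {"role": "user", "content": "What is our data retention period for EU customers?"},
    ],
)
print(response.choices[0].message.content)
```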

EVOLVING BEYOND THE CHATBOT

The breakout star of generative AI has been ChatGPT, so in 2023 most interfaces to generative AI were chat-based. As designers and developers work with the technology, and as more specialized LLMs are produced, AI will fade into the background even as more powerful applications are built upon it. Right now, chatbots are hammers and everything looks like a nail; to use AI to its full potential, we will need to move beyond that.
Phil Nash
Developer Advocate, Sonar
