6 Kubernetes Pain Points and How to Solve Them - Part 1
March 05, 2018

Kamesh Pemmaraju
ZeroStack

Companies want to implement modern applications that can be used anytime, anywhere by always-connected users who demand instant access and improved services. Developing and deploying such applications requires development teams to move fast and deploy software efficiently, while IT teams have to keep pace and also learn to operate at large scale.

While the concept has been around for a couple of decades, containers staged a comeback in the last 3-4 years because they are ideally suited to the new world of massively scalable cloud-native applications. Containers are extremely lightweight, start much faster than virtual machines, and use a fraction of the memory required to boot an entire operating system. More importantly, they abstract applications from the environment in which they actually run. Containerization provides a clean separation of concerns: developers focus on application logic and dependencies, while IT operations teams focus on deployment and management without needing to know application details.

Deploying and managing containers is still a significant challenge, however. In the past couple of years, Kubernetes burst onto the scene and became the de facto standard open-source orchestrator for deploying and managing containers at scale. The hype has reached such a peak that there are now as many as 30 Kubernetes distribution vendors and over 20 Container-as-a-Service companies, and all the major public clouds (AWS, Azure, and Google Cloud) offer a Container-as-a-Service based on Kubernetes.

With more than 30 Kubernetes solutions in the marketplace, it's tempting to think Kubernetes and the vendor ecosystem have solved the problem of operationalizing containers at scale. Far from it. There are six major pain points that companies experience when they try to deploy and run Kubernetes in their complex environments, and there are best practices companies can use to address them.

Pain Point 1 - Enterprises have diverse infrastructures

Bringing up a single Kubernetes cluster on homogeneous infrastructure is relatively easy with the current solutions in the market. But the reality is that organizations have diverse infrastructures built on different server, storage, and networking vendors. In this situation, automating infrastructure deployment and consistently setting up, configuring, and upgrading Kubernetes is not easy.

One way to address this challenge is to deploy a unifying platform that abstracts the diversity of the underlying infrastructure (physical servers, storage, and networking) and offers standard, open API access to infrastructure resources. This greatly reduces the IT burden when it comes to provisioning Kubernetes.
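As a rough illustration (not tied to any specific product), the Python sketch below shows what such an abstraction layer might look like: cluster bring-up code is written once against a provider-neutral interface, and each vendor's specifics live behind a pluggable implementation. All class and method names here are hypothetical assumptions, not a real API.

```python
# Hypothetical sketch of an infrastructure-abstraction layer: provisioning code
# talks to one provider-neutral interface, and vendor-specific details live in
# pluggable implementations. Names and methods are illustrative only.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class NodeSpec:
    cpu_cores: int
    memory_gb: int
    storage_gb: int


class InfrastructureProvider(ABC):
    """Provider-neutral API that a unifying platform might expose."""

    @abstractmethod
    def provision_nodes(self, count: int, spec: NodeSpec) -> list[str]:
        """Create machines and return their addresses."""

    @abstractmethod
    def attach_storage(self, node: str, size_gb: int) -> None:
        """Attach a block volume to a node."""


class VendorXProvider(InfrastructureProvider):
    """One concrete backend; other vendors plug in behind the same interface."""

    def provision_nodes(self, count: int, spec: NodeSpec) -> list[str]:
        # The vendor's own API would be called here; the caller never sees it.
        return [f"10.0.0.{i}" for i in range(count)]  # placeholder addresses

    def attach_storage(self, node: str, size_gb: int) -> None:
        pass  # vendor-specific volume call would go here


def deploy_kubernetes_cluster(provider: InfrastructureProvider, workers: int) -> None:
    """Cluster bring-up written once against the neutral interface."""
    spec = NodeSpec(cpu_cores=8, memory_gb=32, storage_gb=200)
    nodes = provider.provision_nodes(workers, spec)
    for node in nodes:
        provider.attach_storage(node, size_gb=spec.storage_gb)
    # ...bootstrap the control plane and join the workers (e.g. with kubeadm)...


if __name__ == "__main__":
    deploy_kubernetes_cluster(VendorXProvider(), workers=3)
```

The point of the design is that adding a new hardware vendor means writing one more provider class, not rewriting the Kubernetes provisioning workflow.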

Pain Point 2 - One Kubernetes cluster doesn't address all needs

Organizations have diverse application teams, application portfolios, and sometimes conflicting user requirements. One Kubernetes cluster is not going to meet all of those needs. Companies will need to deploy multiple, independent Kubernetes clusters, possibly with different underlying CPU, memory, and storage footprints. If deploying one cluster on diverse hardware is hard enough, doing so with multiple clusters is going to be a nightmare!

To address this pain point, the IT team should be able to set up logical business units that can be assigned to different application teams. This way, each application team gets full self-service capability within quota limits imposed by the IT team, and each team can automatically deploy its own Kubernetes cluster with a few clicks, independently of other teams.
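The sketch below illustrates that idea under the assumption of a hypothetical self-service layer: IT defines a quota per business unit, and a team's cluster request is admitted only if it fits within the remaining allocation. The classes and names are illustrative, not any vendor's actual API.

```python
# Minimal sketch, assuming a hypothetical self-service layer: each business
# unit has an IT-imposed quota, and a cluster request is admitted only if it
# fits inside the unit's remaining allocation. Illustrative names only.
from dataclasses import dataclass


@dataclass
class Quota:
    cpu_cores: int
    memory_gb: int
    storage_tb: float


@dataclass
class ClusterRequest:
    team: str
    cpu_cores: int
    memory_gb: int
    storage_tb: float


class BusinessUnit:
    """A logical business unit with an IT-imposed resource quota."""

    def __init__(self, name: str, quota: Quota) -> None:
        self.name = name
        self.quota = quota
        self.used = Quota(cpu_cores=0, memory_gb=0, storage_tb=0.0)

    def request_cluster(self, req: ClusterRequest) -> bool:
        """Admit a new cluster only if it fits inside the remaining quota."""
        fits = (
            self.used.cpu_cores + req.cpu_cores <= self.quota.cpu_cores
            and self.used.memory_gb + req.memory_gb <= self.quota.memory_gb
            and self.used.storage_tb + req.storage_tb <= self.quota.storage_tb
        )
        if fits:
            self.used.cpu_cores += req.cpu_cores
            self.used.memory_gb += req.memory_gb
            self.used.storage_tb += req.storage_tb
            # ...hand off to the provisioning layer to create the cluster...
        return fits


if __name__ == "__main__":
    analytics = BusinessUnit("analytics", Quota(cpu_cores=64, memory_gb=256, storage_tb=10.0))
    ok = analytics.request_cluster(ClusterRequest("analytics", 16, 64, 2.0))
    print("cluster admitted" if ok else "quota exceeded")
```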

Read 6 Kubernetes Pain Points and How to Solve Them - Part 2

Kamesh Pemmaraju is VP of Product at ZeroStack