Kubernetes is increasingly important to organizations' DevOps journeys as they look to manage cloud-native container implementation. The 2021 Annual Survey conducted by the Cloud Native Computing Foundation (CNCF) revealed that 96 percent of respondents reported using or evaluating Kubernetes. Although the uptake of Kubernetes is unprecedented, the learning curve is steep. Organizations know they will benefit from Kubernetes adoption, but they don't necessarily have the skills and technical knowledge to get started.
With so much at stake in delivering better software faster, it is important to set up for Kubernetes success from the very beginning. I asked some industry experts if they had tips on how to get started, and here are the top tips I received:
Parveen Kr. Arora, Co-Founder & Director, VVnT SeQuor
There's a lot to learn about Kubernetes. A good starting point is to use Kubernetes' own vocabulary, which people can develop proficiency with over time. The glossary in the official documentation can help anyone get up to speed on the lingo. There are also plenty of other readily available ways to learn Kubernetes, such as articles, books, courses and more. From there, one can build expertise by pursuing professional certifications.
Erez Barak, VP of Observability, Sumo Logic, SKILup Day Sponsor
Today, Kubernetes is a technology that holds huge promise, but it has a steep learning curve and is still in its early stages of maturity, with some serious barriers to mainstream adoption. For organizations to get started with Kubernetes, leaders should first allocate time and investment for continuing education to give team members the time and space to up-level their skills. This continuing education gives employees a great growth opportunity and is a wonderful way to build a bench of skills inside your organization.
The other way to get started is to identify a project where your team can experiment and "play safely" with the new technology. As part of that experimentation, teams can help determine how Kubernetes will impact the rest of the organization (e.g., the processes and tooling required to deliver, run and monitor that software).
Vishnu Vasudevan, Head of Product Engineering & Management, Opsera
There are two ways to look at getting started with Kubernetes. If a person is looking into how to help their organization get started, several questions need to be answered beforehand, such as: Is containerized app development suitable for the company?
Are we directly deploying and managing Kubernetes ourselves or leveraging a Platform-as-a-Service (PaaS) approach?
Who needs to be involved, and what standard practices do we need to develop as a team?
Suppose a person is approaching how to get started with Kubernetes themselves. In that case, the first step is gaining a basic understanding of the cluster orchestration system through learning materials or online tutorials. For example, Kubernetes.io provides an interactive tutorial, organized into six learning modules, that helps an individual gain a basic understanding of how to deploy a containerized application on a cluster, scale the deployment, update the containerized application with a new software version, and debug the containerized application. There are plenty of free online resources the Kubernetes community has produced to help people get started, as well as dedicated workshops and local meetups to help even the most novice of practitioners learn how it works and why it is important.
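To make those steps concrete, here is a rough sketch of that workflow using kubectl. The deployment name (hello-app), the image (example.com/hello-app) and the version tags are placeholders for illustration, not the tutorial's actual examples:

    # Deploy a containerized application on the cluster
    kubectl create deployment hello-app --image=example.com/hello-app:1.0
    kubectl expose deployment hello-app --type=NodePort --port=8080
    kubectl get pods

    # Scale the deployment to three replicas
    kubectl scale deployment hello-app --replicas=3

    # Update the application with a new software version and watch the rollout
    kubectl set image deployment/hello-app hello-app=example.com/hello-app:2.0
    kubectl rollout status deployment/hello-app

    # Debug the application (substitute a real pod name from "kubectl get pods")
    kubectl describe pod <pod-name>
    kubectl logs <pod-name>

Working through a sequence like this on a local cluster (for example, with minikube or kind) is a low-risk way to internalize the concepts the tutorial modules cover before touching production workloads.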
There are many considerations before adopting Kubernetes. Learning the terminology, investing in education, experimenting, and leveraging available online resources are all practical ways to get started. For more insights about what you need for Kubernetes success, join us for a full day of "how-to" learning during SKILup Day: Enterprise Kubernetes on March 17, 2022.
Industry News
JFrog announced the addition of JFrog Runtime to its suite of security capabilities, empowering enterprises to seamlessly integrate security into every step of the development process, from writing source code to deploying binaries into production.
Kong unveiled its new Premium Technology Partner Program, a strategic initiative designed to deepen its engagement with technology partners and foster innovation within its cloud and developer ecosystem.
Kong announced the launch of the latest version of Kong Konnect, the API platform for the AI era.
Oracle announced new capabilities to help customers accelerate the development of applications and deployment on Oracle Cloud Infrastructure (OCI).
JFrog and GitHub unveiled new integrations.
Opsera announced its latest platform capabilities for Salesforce DevOps.
Progress announced it has entered into a definitive agreement to acquire ShareFile, a business unit of Cloud Software Group that provides SaaS-native, AI-powered, document-centric collaboration, focusing on industry segments including business and professional services, financial services, healthcare and construction.
Red Hat announced the general availability of Red Hat Enterprise Linux (RHEL) AI across the hybrid cloud.
Jitterbit announced its unified AI-infused, low-code Harmony platform.
Akuity announced the launch of KubeVision, a feature within the Akuity Platform.
Couchbase announced Capella Free Tier, a free developer environment designed to empower developers to evaluate and explore products and test new features without time constraints.
Amazon Web Services, Inc. (AWS), an Amazon.com, Inc. company, announced the general availability of AWS Parallel Computing Service, a new managed service that helps customers easily set up and manage high performance computing (HPC) clusters so they can run scientific and engineering workloads at virtually any scale on AWS.
Dell Technologies and Red Hat are bringing Red Hat Enterprise Linux AI (RHEL AI), a foundation model platform built on an AI-optimized operating system that enables users to more seamlessly develop, test and deploy artificial intelligence (AI) and generative AI (gen AI) models, to Dell PowerEdge servers.