Kubiya.ai announces the launch of its DevOps Digital Agents.
Nearly 70% of DevOps and security professionals want to cut their tech stack, according to GitLab's Global DevSecOps Survey. As many teams experience tech stack sprawl and tightening budgets, paring down seems ideal. But be wary — cutting too much can be counterproductive.
IT leaders must walk the line between streamlining for efficiency and stripping away crucial functionality through over-consolidation. Not all tech tools increase complexity — some streamline and improve your stack. In the case of DevOps, efficient and reliable code deployments require a well-defined, functional toolchain, which automation capabilities enhance.
The Benefits of Consolidation
Many teams use between six and ten tools, leading to a third of developers spending at least half their time on toolchain integration and maintenance, according to the GitLab survey. Using many applications with similar functionalities hinders productivity because development teams must maintain knowledge of each. A streamlined toolset mitigates some software development complexity. By reducing processes and user interfaces, developers have more time to write code. Smaller tech stacks are also more agile and scalable.
Consolidation drives cost savings, a pressing need for many decision-makers. Additionally, a large tool stack can compromise security by creating more opportunities for bad actors.
However, there is no such thing as an "all-in-one" solution, so you'll never prune your roster down to one. A tech stack that's too lean creates the same problems as one that's too complex. If IT leaders remove necessary tools or replace them with less effective solutions, development teams will find themselves executing more manual tasks, detracting from coding time.
Consolidation Considerations
The organization's end goal should be a smooth software lifecycle that produces quality applications and scores well on DORA metrics. Disparate tools add steps and complexity, but a certain number are required to facilitate an effective process. In this situation, adding a pipeline orchestration solution, such as a continuous deployment platform, is the right move.
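The DORA metrics mentioned above are straightforward to compute once deployments are tracked. A minimal sketch of two of them — deployment frequency and change failure rate — using entirely hypothetical deployment records:

```python
from datetime import date

# Hypothetical deployment log: (deploy date, whether the change failed)
deploys = [
    (date(2024, 1, 2), False),
    (date(2024, 1, 9), True),
    (date(2024, 1, 16), False),
    (date(2024, 1, 23), False),
]

# Days covered by the observation window, inclusive of both endpoints
days_observed = (deploys[-1][0] - deploys[0][0]).days + 1

deployment_frequency = len(deploys) / days_observed          # deploys per day
change_failure_rate = sum(failed for _, failed in deploys) / len(deploys)

print(f"Deployment frequency: {deployment_frequency:.2f}/day")
print(f"Change failure rate: {change_failure_rate:.0%}")
```

Real pipelines would pull these records from the CD platform's API rather than a hard-coded list, but the arithmetic is the same.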
While this appears to be layering yet another tool, you're actually providing process control, cohesion and optimized functionality. Each run of the pipeline saves your company money because it automates tasks that would otherwise require manual time and effort. The key difference between new application spend and automation spend is that new applications create ongoing maintenance costs, while well-designed automation makes existing functions more cost-effective and drives business value.
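The cost argument is simple break-even arithmetic. A sketch with entirely assumed figures — run counts, time saved, rates and platform spend are placeholders, not data from the survey:

```python
# Hypothetical monthly figures for one automated pipeline
runs_per_month = 120          # assumption: pipeline executions per month
minutes_saved_per_run = 15    # assumption: manual effort each run replaces
hourly_rate = 60.0            # assumption: blended engineering cost ($/hr)
automation_cost = 500.0       # assumption: monthly automation/platform spend

# Manual effort the automation avoids, priced at the blended rate
manual_cost = runs_per_month * minutes_saved_per_run / 60 * hourly_rate
net_savings = manual_cost - automation_cost

print(f"Manual effort avoided: ${manual_cost:.0f}/month")
print(f"Net savings: ${net_savings:.0f}/month")
```

Swapping in your own numbers makes it easy to see when automation spend pays for itself.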
Continuous deployment tools are, in a sense, subtraction by addition. The process validates each change and integrates the deployment toolchain so code moves seamlessly from one stage to the next without manual intervention. Teams can more easily test code in a production-like environment and roll back unhealthy or poor-performing software before it crashes the system. Enforcing consistent validation of production changes protects your company from revenue lost to downtime caused by bad changes.
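The deploy-validate-rollback loop described above can be sketched in a few lines. This is a hedged illustration, not any particular CD product's API — the three callables stand in for whatever your pipeline actually invokes:

```python
def deploy_with_rollback(deploy, health_check, rollback):
    """Deploy a release, validate it, and roll back automatically on failure.

    deploy, health_check, and rollback are callables supplied by the
    pipeline; health_check returns True when the release is healthy.
    """
    deploy()
    if health_check():
        return "released"
    rollback()
    return "rolled back"

# Usage with stand-in callables simulating an unhealthy release
log = []
result = deploy_with_rollback(
    deploy=lambda: log.append("deployed"),
    health_check=lambda: False,   # validation fails
    rollback=lambda: log.append("rolled back"),
)
```

The point of the pattern is that the rollback path is exercised by the pipeline itself, not left to an operator paged at 3 a.m.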
Continuous deployment also saves cloud technology costs by enabling automated tagging so you can understand who is responsible for what infrastructure. This insight enables you to relay optimization advice to the teams that can activate it.
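The tagging idea can be enforced as a simple policy check at deploy time. A minimal sketch, assuming a hypothetical tag policy (the required tag names below are illustrative, not a standard):

```python
# Hypothetical cost-attribution policy: every resource must carry these tags
REQUIRED_TAGS = {"owner", "team", "cost-center"}

def missing_tags(resource_tags):
    """Return which cost-attribution tags a resource lacks."""
    return REQUIRED_TAGS - resource_tags.keys()

# A resource tagged automatically by the pipeline vs. one created by hand
tagged = {"owner": "alice", "team": "platform", "cost-center": "eng-42"}
untagged = {"owner": "bob"}

print(missing_tags(tagged))    # empty set: fully attributable
print(missing_tags(untagged))  # the tags still needed for attribution
```

With every resource attributable, optimization advice can be routed to the team that can actually act on it.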
More than half of surveyed teams report deployment challenges caused by too many manual steps and a lack of consistency. Continuous deployment makes releasing updates simple, reliable, predictable and repeatable with less manual effort, improving customer experience, software performance and business value.
Consolidation is not the only mechanism for cost savings, and cutting too many tools returns your team to a clunky, inefficient process. Focus on building an efficient, productive stack that maximizes the tools' value and your team's output.
Industry News
Aviatrix® introduced Aviatrix Distributed Cloud Firewall for Kubernetes, a distributed cloud networking and network security solution for containerized enterprise applications and workloads.
Stride announced the general availability of Stride Conductor, its new autonomous coding product that transforms the software development landscape.
CircleCI unveiled CircleCI releases, which enables developers to automate the release orchestration process directly from the CircleCI UI.
Fermyon™ Technologies announced Fermyon Platform for Kubernetes, a WebAssembly platform for Kubernetes.
Akuity announced a new offering targeted at enterprises and businesses where security and compliance are key.
New Relic launched new capabilities for New Relic IAST (Interactive Application Security Testing), including proof-of-exploit reporting for application security testing.
OutSystems announced AI Agent Builder, a new solution in the OutSystems Developer Cloud platform that makes it easy for IT leaders to incorporate generative AI (GenAI) powered applications into their digital transformation strategy, as well as govern the use of AI to ensure standardization and security.
Mirantis announced significant updates to Lens Desktop that make working with Kubernetes easier by simplifying operations, improving efficiency, and increasing productivity. Lens 2024 Early Access is now available to Lens users.
Codezero announced a $3.5 million seed-funding round led by Ballistic Ventures, the venture capital firm dedicated exclusively to funding entrepreneurs and innovations in cybersecurity.
Prismatic launched a code-native integration building experience.
Check Point® Software Technologies Ltd. announced its Check Point Infinity Platform has been ranked as the #1 Zero Trust Platform in the latest Miercom Zero Trust Platform Assessment.
Tricentis announced the launch and availability of SAP Test Automation by Tricentis as an SAP Solution Extension.
Netlify announced the general availability of its AI-enabled deploy assist.
DataStax announced a new integration with Airbyte that simplifies the process of building production-ready GenAI applications with structured and unstructured data.