DevOps experts — analysts and consultants, users and the top vendors — offer thoughtful, insightful, and sometimes controversial predictions on how DevOps and related technologies will evolve and impact business in 2018. Part 6 covers analytics and data.
Start with 2018 DevOps Predictions - Part 1
Start with 2018 DevOps Predictions - Part 2
Start with 2018 DevOps Predictions - Part 3
Start with 2018 DevOps Predictions - Part 4
Start with 2018 DevOps Predictions - Part 5
DEVOPS FOCUS ON BUSINESS METRICS
With digital transformation, DevOps becomes central to the business — rather than just a discipline IT adopts to improve its own departmental performance. C-level management must therefore hold their digital leaders accountable in highly tangible and concrete ways. That's why — in addition to improving the performance of your DevOps/Continuous Delivery pipelines — you must also capture metrics that quantitatively prove how you're improving that performance. Low-level operational metrics won't do the trick. The metrics you present to upper management must tie back to business value. Executives aren't interested in a bunch of numbers that show how much code you're getting out the door. They want to know how quickly you got something with quantifiable value to market — preferably quantified in terms of revenue dollars gained by adding revenue days. To meet this need, DevOps teams will adopt digital experience monitoring and analytics solutions that correlate data all the way from the point of customer engagement to back-end business processes.
VP of DevOps Solution Marketing and Management, CA Technologies
DEVOPS FOR THE DATABASE
2018 will be the year of the database as far as DevOps is concerned. While Continuous Delivery is already widely adopted for application development, databases have been left behind — with only 37 percent of organizations adopting similar technologies and tools for their database environments. With pressure on development lead times expected to continue and grow in 2018, databases will have to catch up with the database release automation tools now available on the market.
Co-Founder and CTO, DBmaestro
The DBA will take on a critical role in DevOps. 2017 was the year of DevOps, no question about it, and upstream development and continuous integration were the focus for many organizations. In 2018, we'll see a focus on downstream testing, release and deployment, and continuous delivery processes. Due to their complex nature and common data movement challenges, databases tend to pose major bottlenecks for DevOps teams and processes. The DBA will play an integral role in alleviating these challenges and in how businesses enable DevOps-driven digital transformation.
Executive Director, Software Engineering, Quest Software
DevOps for applications has already moved from the backroom to the boardroom. Its advantages are spreading across every business sector, and companies and organizations are now recognizing the database needs to be included as well. The benefits of releasing new features faster by introducing practices like continuous integration and automated deployments cannot be truly realized if, at the end of the process, database updates remain manual and problematic. Fortunately, database development and deployment tools are now emerging that integrate with and plug into the infrastructure already in place for applications. So rather than bringing in unfamiliar and unwelcome changes to the development process, DevOps for the database can be included by extending the use of the tools already in place. That way, everyone in Dev and Ops will gain the rewards that DevOps offers.
Relational database providers are making enterprise-grade database technology available to DevOps teams without the need for a DBA. With new support for Linux and containers, combined with AI-inspired self-healing and self-tuning capabilities, we will see relational database technology rise to prominence in DevOps teams, creating a new generation of database-skilled developers.
VP of Product Management, Idera
Fewer and fewer companies will forget the database with DevOps. The database is the hardest part of the application stack to manage, so it just doesn't make sense that it's always the forgotten piece of the puzzle. IT teams have been so focused on time-to-market and getting development to push out applications at the speed of light, yet they still manually manage the change process for databases that contain massive amounts of information. The good news is that as more enterprises continue to modernize and adopt DevOps processes, it'll become harder to ignore the database. This is because DevOps is a process, an algorithm. It's not static and it can't be done some of the time. The whole purpose is to change and evolve over time. DevOps is about identifying friction that is slowing down software releases. Sometimes, it's the testing team setting up environments manually. It's time to automate environment creation to solve not just this one problem, but similar problems across the IT department. It's time to stop having DBAs perform manual SQL script review prior to a release and start automating the review so that they can continue to innovate and bring strategic value to their organization. In 2018, more and more IT teams will start to see the benefit of bringing DevOps to other areas such as security and the database, and many will start to treat the database as a first-class citizen. Unfortunately, as this realization sets in, companies will regret that they didn't tackle the problem sooner.
CTO and Co-Founder, Datical
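The automated SQL review described above can start very simply. As an illustrative sketch (not any vendor's tooling, and with a deliberately incomplete pattern list), a pre-release check might scan migration scripts for statements that normally warrant DBA sign-off:

```python
import re

# Illustrative patterns that commonly warrant DBA attention before a
# release. A real review gate would use a far richer rule set.
RISKY_PATTERNS = {
    "drop_table": re.compile(r"\bDROP\s+TABLE\b", re.IGNORECASE),
    "drop_column": re.compile(
        r"\bALTER\s+TABLE\b.*\bDROP\s+COLUMN\b", re.IGNORECASE | re.DOTALL
    ),
    "delete_without_where": re.compile(
        r"\bDELETE\s+FROM\s+\w+\s*;", re.IGNORECASE
    ),
    "truncate": re.compile(r"\bTRUNCATE\b", re.IGNORECASE),
}

def review_sql(script: str) -> list:
    """Return the names of risky patterns found in a migration script."""
    return [name for name, pat in RISKY_PATTERNS.items() if pat.search(script)]

migration = """
ALTER TABLE orders ADD COLUMN region VARCHAR(32);
DELETE FROM sessions;
"""
print(review_sql(migration))  # ['delete_without_where']
```

A check like this can run in the CI pipeline and block a release (or route it to a DBA) only when a flagged pattern appears, reserving human review for the changes that need it.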
We'll start hearing more about DataOps in 2018. Where DevOps aligns developers and IT teams to accelerate software delivery and infrastructure changes, DataOps is all about streamlining the preparation of data so developers can leverage it during the application-building process. While the application of DataOps processes and strategies is still in its early stages, we'll start to hear this term used more and more among the database community in 2018.
Executive Director, Software Engineering, Quest Software
In 2018 predictable DevOps for "Fast Data" will become a critical requirement. Developers are dealing with more data, increasingly closer to real-time, and the concept of "fast data" represents this shift that's been driven by a number of concepts with overlapping definitions — real-time, data in-motion, streaming data, etc. Everything from framework choices to application design are supporting that constant flow of data through the application. We're seeing an evolution where applications aren't just being integrated with big data, the applications are increasingly being built around the data use cases. And in 2018 the focus will sharpen on how DevOps supports this entirely new class of applications and systems.
MACHINE LEARNING AND AI
Artificial Intelligence will be the next big thing in DevOps. Artificial intelligence (AI) holds great promise for DevOps. As humans, we learn from trial and error, and we share our tribal lore with less experienced members of our tribe. That is exactly the promise of AI and machine learning. We prize our database administrators (DBAs) with 20 years of experience because they have vast experience in what has (and has not) worked in the past and because they can see patterns in the issues they deal with daily. However, humans are limited in the amount of data they can consume. Enter machine learning: if we can collect vast amounts of data on application changes and their corresponding impact on our customers and systems, then identifying patterns in that data is a well-understood problem. In turn, we can prevent bad behavior and encourage good behavior, all without having to wake up at 2 a.m. to respond to an on-call issue.
CTO and Co-Founder, Datical
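The pattern-finding idea above can be illustrated with a deliberately tiny stand-in for real machine learning: given a history of deployments tagged with whether each one caused an incident, even simple per-category incident rates surface which kinds of change are riskiest. (The change-type names and data here are invented for illustration.)

```python
from collections import defaultdict

def incident_rates(history):
    """history: list of (change_type, caused_incident) pairs collected
    from past deployments. Returns the incident rate per change type."""
    counts = defaultdict(lambda: [0, 0])  # change_type -> [incidents, total]
    for change_type, incident in history:
        counts[change_type][0] += int(incident)
        counts[change_type][1] += 1
    return {t: inc / total for t, (inc, total) in counts.items()}

# Hypothetical deployment history
history = [
    ("schema_change", True), ("schema_change", True), ("schema_change", False),
    ("config_change", False), ("config_change", False),
    ("code_only", False), ("code_only", True),
]
rates = incident_rates(history)
print(max(rates, key=rates.get))  # schema_change
```

A real system would learn from far richer features (time of day, diff size, services touched), but the principle is the same: accumulate more change-and-impact data than any one engineer can hold in their head, then let the model flag risky patterns before the 2 a.m. page.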
In 2018, simplifying the life of the DevOps team will be an underlying theme. The use of machine learning, predictive and prescriptive analytics will assist in streamlining real-time data ingestion with the goal of creating a collaborative, agile enterprise operation. Rule-based and manual "subject matter expertise-based" approaches to problems and resolutions will transform due to machine learning technology and provide teams with predictive and prescriptive DevOps management.
AVP and Head of Product of StreamAnalytix, Impetus Technologies
AI will cause a revolution for companies that use continuous deployment and integration. It's impossible to track the multiple new microservice releases shipped daily using traditional dashboards, and the missing insights can lead to missed problems. For example, suppose you're running A/B tests released to certain geographic locations and not others. One of the tests may break the checkout process, but it may take days or weeks to find since it only affects the one location where the test was running. These updates are happening constantly, making them impossible to track manually. AI can keep track of every possible combination of end-user device, OS, geographic region, page, event, and more, and even notify the appropriate team if something goes wrong anywhere — from product updates to external issues.
The general trend in log analytics is the transition to "machine data analytics." As organizations move to more modern DevOps-style processes and use a microservices approach to application development, the artificial divide between log analytics, infrastructure performance management, and application performance management is dissolving. What this means is that as developers become increasingly responsible for the operational management of their applications, they will now also be responsible for the instrumentation of those applications. As a result, they'll have to apply the practices of log analytics — custom instrumentation using open formats — to their efforts to monitor their applications and infrastructure. This makes it possible to see a comprehensive, integrated view of the application, from top-level business KPIs to operational metrics to low-level logs — all in one place.
Principal Product Manager, Sumo Logic
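"Custom instrumentation using open formats" often means emitting structured (e.g. JSON) log events that carry business and operational fields side by side. As a minimal sketch using only the Python standard library (the field names are invented for illustration):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each record as one JSON object so a machine-data analytics
    backend can index business and operational fields together."""
    def format(self, record):
        payload = {
            "level": record.levelname,
            "message": record.getMessage(),
        }
        # Application-supplied structured fields, attached via `extra`.
        payload.update(getattr(record, "fields", {}))
        return json.dumps(payload)

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("checkout")
log.addHandler(handler)
log.setLevel(logging.INFO)

# One event carries a business KPI (order_value) alongside an
# operational metric (latency_ms) in the same structured record.
log.info("order placed", extra={"fields": {"order_value": 42.50, "latency_ms": 87}})
```

Because every event is self-describing, the same stream can feed a revenue dashboard, a latency alert, and low-level debugging — the "one place" view the prediction describes.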
Read 2018 DevOps Predictions - Part 7, covering DevOps and the cloud.