Getting the Mainframe Up to DevOps Speed
January 23, 2017

Chris O'Malley
Compuware

Until recently, many IT leaders still believed they could allow their mainframe environments to languish in two-code-drops-a-year waterfall mode, while they embraced DevOps and Agile across their distributed and cloud environments.

This so-called "Bimodal IT" strategy has proven to be dangerously flawed. The fact is, if your business has a mainframe, that's probably where your most important applications and data live. As such, there's no way your business can remain competitive unless you can quickly adapt your use of those applications and data to keep pace with rapidly and relentlessly evolving market demands.

That's especially true given the fact that your customer-facing mobile and web systems of engagement almost universally leverage your back-end mainframe systems of record.

So how do you actually get your mainframe environment up to speed? Given that your existing mainframe dev/test processes and tools are deeply entrenched, how can you integrate the platform into a truly nimble and unified cross-platform enterprise DevOps environment?

Different organizations will take different approaches to this challenge. But here are three principles to bear in mind as you go about the difficult but ultimately extremely rewarding work of bringing your mainframe into the DevOps fold:

1. Transform the developer workspace

Most mainframe dev, test and ops work is still performed in "green screen" TSO/ISPF environments that require specialized knowledge, constrain productivity, and are extremely off-putting to the kind of skilled, ambitious programmers who are the lifeblood of Agile and DevOps transformation. It is therefore essential to migrate to more modern, graphical tools within a preferred DevOps toolchain that empower staff at all experience levels to perform mainframe tasks in much the same manner as they do other non-mainframe work.

Also, mainframe applications are typically large, complex and poorly documented. These attributes are a major impediment to mainframe transformation — and they tend to make enterprise IT highly dependent on the personal/tribal knowledge of senior mainframe staff.

To overcome the skills-and-knowledge gap, it's not enough to just make mainframe workspaces more graphical. You also need tools that enable new participants in mainframe DevOps to quickly and easily "read" existing application logic, program interdependencies, and data structures.
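
To make this concrete, here is a toy Python sketch of how program-to-program call relationships might be represented and queried so a newcomer can ask which programs are touched by a change. The program names and call graph are invented for illustration; real tooling derives this information by analyzing the code base itself.

    # Hypothetical example: caller -> callees, as might be extracted from COBOL CALL statements.
    from collections import defaultdict

    calls = {
        "CUSTMAIN": ["CUSTUPDT", "ADDRVAL"],
        "CUSTUPDT": ["DB2IO01"],
        "BILLCALC": ["DB2IO01", "RATETBL"],
    }

    # Invert the graph to answer: "if I change this program, who depends on it?"
    called_by = defaultdict(list)
    for caller, callees in calls.items():
        for callee in callees:
            called_by[callee].append(caller)

    def impacted_callers(program, seen=None):
        """Walk callers transitively to find every program affected by a change."""
        seen = set() if seen is None else seen
        for caller in called_by.get(program, []):
            if caller not in seen:
                seen.add(caller)
                impacted_callers(caller, seen)
        return seen

    print(impacted_callers("DB2IO01"))  # every program that directly or indirectly calls DB2IO01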

Recent innovations in mainframe workspace technology can also give developers on-the-fly feedback on any new bugs and quality issues they inject into their code. By investing in these tools, IT can empower even mainframe-inexperienced developers to quickly produce quality work that fits within the daily requirements of an Agile process. In addition, the latest mainframe development dashboard solutions enable managers to track defects, program complexity and technical debt so they can better pinpoint issues requiring additional coaching or training.
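
As a rough illustration of the dashboard idea, here is a minimal Python sketch, not any vendor's product, that flags programs whose quality metrics exceed agreed thresholds. The program names, metric values and limits are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class ProgramMetrics:
        name: str                    # e.g. a COBOL program
        open_defects: int
        cyclomatic_complexity: int
        tech_debt_hours: float       # rough remediation estimate

    def needs_attention(p, max_defects=5, max_complexity=50):
        """Flag programs whose defect count or complexity exceeds simple thresholds."""
        return p.open_defects > max_defects or p.cyclomatic_complexity > max_complexity

    portfolio = [
        ProgramMetrics("CUSTUPDT", open_defects=7, cyclomatic_complexity=62, tech_debt_hours=40.0),
        ProgramMetrics("BILLCALC", open_defects=1, cyclomatic_complexity=18, tech_debt_hours=6.5),
    ]

    for program in sorted(portfolio, key=lambda p: p.tech_debt_hours, reverse=True):
        status = "REVIEW" if needs_attention(program) else "OK"
        print(f"{program.name}: defects={program.open_defects} "
              f"complexity={program.cyclomatic_complexity} debt={program.tech_debt_hours}h [{status}]")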

2. Remodel mainframe processes

Once you've built a better working environment for the mainframe, you can start to aggressively shift your process from a traditional waterfall model with large sets of requirements and long project timelines to a more incremental model that allows teams to quickly collaborate on so-called user "stories" and "epics." By estimating the size of these stories and assigning them their appropriate priority, your teams can start engaging in scrums that allow them to quickly iterate towards their goals.
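
For teams new to this way of working, the planning mechanics can be reduced to a simple sketch: estimate each story, rank it, and pull the highest-priority work that fits the sprint's capacity. The stories, point estimates and capacity in the Python below are purely illustrative.

    from dataclasses import dataclass

    @dataclass
    class Story:
        title: str
        points: int      # estimated size
        priority: int    # 1 = highest

    backlog = [
        Story("Expose customer balance via REST API", points=5, priority=1),
        Story("Refactor batch billing job for restartability", points=8, priority=2),
        Story("Add unit tests around interest calculation", points=3, priority=1),
    ]

    def plan_sprint(backlog, capacity_points):
        """Pull the highest-priority (then smallest) stories until capacity is used up."""
        sprint, remaining = [], capacity_points
        for story in sorted(backlog, key=lambda s: (s.priority, s.points)):
            if story.points <= remaining:
                sprint.append(story)
                remaining -= story.points
        return sprint

    for story in plan_sprint(backlog, capacity_points=10):
        print(f"[P{story.priority}] {story.title} ({story.points} pts)")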

The move from large-scale waterfall projects to Agile scrumming represents a significant change in work culture for most mainframe teams. Training in Agile process and work culture is therefore a must. You may also want to build your initial Agile mainframe team by choosing select mainframe developers and pairing them with Agile-experienced developers from other platforms to work collaboratively on user stories and epics.

You'll obviously also need the right enabling technologies for this shift. Key requirements include project management software that supports Agile methodology — as well as Agile-enabled Source Code Management (SCM). The latter is especially pivotal, since traditional mainframe SCM environments are inherently designed for waterfall development and are thus incapable of providing essential Agile capabilities such as parallel development work on user stories.

When engaged in this re-tooling, it is generally wiser to leverage best-in-class tools rather than fall into a monolithic approach that requires all SDLC activities to be performed within a single vendor's solution set. That's because best-in-class tools allow you to avoid vendor lock-in while taking advantage of the latest innovations in Agile management.

3. Integrate mainframe workflows into the cross-platform enterprise DevOps toolchain

The target state of mainframe transformation is ultimately a de-siloed enterprise DevOps environment where the mainframe is "just another platform" — albeit an especially scalable, reliable, high-performing, cost-efficient and secure one — that whoever is available can quickly and appropriately modify to meet the needs of the business.

This requires integration between mainframe and distributed tools (typically via REST APIs) so that DevOps teams have a single point-of-control for all changes across z/OS, Windows, Unix and other platforms. An effective cross-platform toolchain will also provide cross-platform impact analysis — so your developers can see how the code they're working on in one tier of an application (e.g. a mobile app server) may potentially affect another application tier (e.g. a DB2 database).
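
As a hypothetical sketch of what that integration could look like, the Python step below promotes a mainframe change set by calling a REST API exposed by the mainframe SCM, invoked from the same pipeline that builds and deploys the distributed tiers. The URL, endpoint path, payload fields and identifiers are invented for illustration and do not correspond to any specific product's API.

    import os
    import requests

    MAINFRAME_SCM_URL = "https://mainframe-scm.example.com/api/v1"  # hypothetical endpoint

    def promote_change(assignment_id, from_level, to_level):
        """Ask the mainframe SCM to promote a change set to the next lifecycle level."""
        response = requests.post(
            f"{MAINFRAME_SCM_URL}/assignments/{assignment_id}/promote",
            json={"fromLevel": from_level, "toLevel": to_level},
            headers={"Authorization": f"Bearer {os.environ['SCM_API_TOKEN']}"},
            timeout=30,
        )
        response.raise_for_status()
        print(f"Promoted {assignment_id}: {from_level} -> {to_level}")

    if __name__ == "__main__":
        # Invoked as one step in the cross-platform pipeline, giving the team a
        # single point of control for changes on z/OS and distributed platforms alike.
        promote_change("PLAY001234", from_level="DEV", to_level="QA")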

The de-siloing of your mainframe can also lead to unified IT service management (ITSM) for both mainframe and non-mainframe applications. This unified ITSM model is especially valuable for companies with large numbers of multi-tier applications that are critical to their financial performance.

Of course, it takes budget, hard work and strong leadership to turn these principles into in-the-trenches realities. It is important for mainframe users to align themselves with partners who don't just pay lip service to bringing the mainframe into the DevOps fold, but who are committed to doing what it takes to get it done and have experience doing it within their own organizations. Getting the mainframe up to DevOps speed is possible — and is being done by enterprises that recognize the need to complement their existing advantages of scale with new advantages of speed. And the alternative of "bimodal," "two-speed," or "multi-speed" IT is simply untenable.

Every company with a mainframe therefore needs to get with the mainframe DevOps program. Now.

Chris O'Malley is CEO of Compuware.
