How to Prepare Data Now for a Cloud Migration in the Future
November 15, 2021

Tony Perez

Migrating applications to the cloud is a complex process. It becomes even more complicated if the applications in question are business-critical, have been heavily customized, or run on IBM POWER hardware. Fortunately, IT can reduce risk during the migration process by planning and preparing data and applications in advance. Here's how.

Planning for a Future Migration

Before getting sucked into the details of a cloud migration, IT should specify the goals for that migration and ensure that they line up with larger business initiatives or digital transformation projects. These goals might be reducing data center costs, improving business agility, modernizing software development processes, or improving disaster recovery and backup options.

Next, they should choose their overall approach to the migration. Is it better to redesign software entirely, or lift and shift it to the cloud "as-is"?

While redesigning software to use cloud-native components is usually necessary to get the full flexibility and scalability benefits of the cloud, it might make more sense to lift and shift for scenarios like disaster recovery or migrating development workloads. IBM i and AIX applications (which run on Power processors) used to be incompatible with the cloud unless they were refactored onto x86 components, which added a great deal of time, work, and risk to the process. In practice, that usually resulted in these applications staying locked in the data center.

Over the last few years, however, solutions have emerged to lift and shift these applications to the cloud as well, reducing the risk and complexity of such a migration.

A third option is to lift and shift applications to the cloud initially, and then refactor them slowly, piece by piece, instead of all at once. This typically decreases the risk and complexity of the migration and allows organizations to get some cloud benefits while taking their time to refactor.

6 Steps to Prepare Data for the Cloud

Once these important decisions have been made, IT can get to work on the nuts and bolts of the transition. Here is a series of steps to prepare data and applications for cloud migration.

1. Assess existing workloads: First, IT should examine all workloads to be moved to the cloud and determine their technical and business requirements. The most important of these is the capacity each will need (always-on, bursting, variable, or pay-as-you-go). There is a trade-off between cost and capacity, so selecting lower-capacity options where possible will save money.
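The cost/capacity trade-off in step 1 can be sketched as a simple break-even comparison. Everything here is illustrative: the rates, the hours, and the two capacity models are stand-ins, not real cloud pricing.

```python
# Hypothetical break-even sketch: compare an always-on (reserved) rate
# against a pay-as-you-go rate for a single workload.
# Rates and usage hours are made-up examples, not real pricing.

HOURS_PER_MONTH = 730  # average hours in a month

def monthly_cost(rate_per_hour: float, hours_billed: float) -> float:
    """Cost of a workload billed for the given number of hours."""
    return rate_per_hour * hours_billed

def cheaper_option(reserved_rate: float, on_demand_rate: float,
                   hours_used: float) -> str:
    """Pick the cheaper capacity model for this utilization level."""
    reserved = monthly_cost(reserved_rate, HOURS_PER_MONTH)  # billed 24/7
    on_demand = monthly_cost(on_demand_rate, hours_used)     # billed per hour used
    return "always-on" if reserved <= on_demand else "pay-as-you-go"

# A dev partition used ~200 hours/month favors pay-as-you-go,
# while a 24/7 production workload favors the reserved rate.
print(cheaper_option(reserved_rate=1.00, on_demand_rate=1.60, hours_used=200))
print(cheaper_option(reserved_rate=1.00, on_demand_rate=1.60, hours_used=730))
```

The crossover point depends entirely on the provider's actual rates, but running this kind of calculation per workload is what makes "select lower capacity where possible" concrete.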

2. Pick data center locations: Next, IT should decide which data centers should host their workloads. Compliance requirements, latency, and the need for redundancy within a region are all factors to consider, particularly for disaster recovery and backup workloads.

3. Measure workload sizing and capacity: Figure out the scale, OS, and external connections each workload will require. A cloud provider can often help measure CPU, memory, and storage requirements. Using cloud infrastructure makes it possible to eliminate idle resources and over-provisioning, resulting in a significant reduction in concurrently running LPARs or virtual machines.
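One way to approach the sizing exercise in step 3 is a worksheet that totals per-LPAR requirements and drops idle partitions. The LPAR names, figures, and the 5% idle threshold below are invented for the example.

```python
# Illustrative sizing worksheet: aggregate per-LPAR requirements and
# retire idle partitions to see the consolidated cloud footprint.
# All names and numbers are hypothetical.

lpars = [
    {"name": "prod-erp",  "cpus": 8, "mem_gb": 64, "avg_util": 0.70},
    {"name": "dev-test",  "cpus": 4, "mem_gb": 32, "avg_util": 0.15},
    {"name": "old-batch", "cpus": 4, "mem_gb": 16, "avg_util": 0.02},
]

IDLE_THRESHOLD = 0.05  # below this average utilization, treat as idle

def consolidated_footprint(partitions, threshold=IDLE_THRESHOLD):
    """Total CPU/memory needed once idle partitions are retired."""
    active = [p for p in partitions if p["avg_util"] >= threshold]
    return {
        "cpus": sum(p["cpus"] for p in active),
        "mem_gb": sum(p["mem_gb"] for p in active),
        "retired": [p["name"] for p in partitions if p["avg_util"] < threshold],
    }

# old-batch is flagged as idle; only prod-erp and dev-test are sized.
print(consolidated_footprint(lpars))
```

Real utilization figures would come from monitoring data (or the cloud provider's assessment tooling), but the structure of the exercise is the same: measure first, then size only what is actually in use.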

4. Check licensing compliance: Licensing compliance in the cloud can be difficult; licenses for major applications such as ERP systems are often not portable, or are tied to a specific hardware serial number or to the ID of a logical partition. If this is the case for your workloads, your cloud provider may offer workarounds, such as VM host pinning.

5. Plan how to transfer the data: The right approach will vary with the workloads in question, the timeline, and the volume of data to be migrated. Options for actually moving data to the cloud include secure FTP, database replication, an encrypted physical hard drive, or cloud features such as Azure ExpressRoute or IBM Cloud Mass Data Migration.
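A quick way to choose between those options in step 5 is to estimate how long a network transfer would actually take. The helper below is a back-of-the-envelope sketch; the 70% efficiency factor, data volume, and link speed are assumptions, not measurements.

```python
# Back-of-the-envelope estimate: days to move a dataset over the network,
# to help decide between online replication and shipping an encrypted
# drive or using a bulk-migration service. All figures are illustrative.

def transfer_days(data_tb: float, link_mbps: float,
                  efficiency: float = 0.7) -> float:
    """Days to move data_tb terabytes over a link_mbps link.

    efficiency accounts for protocol overhead and link contention
    (0.7 is an assumed, not measured, value).
    """
    bits = data_tb * 8 * 10**12                      # decimal TB -> bits
    seconds = bits / (link_mbps * 10**6 * efficiency)
    return seconds / 86400

# 50 TB over a 1 Gbps link at 70% efficiency takes roughly a week,
# which may argue for a physical-drive or bulk-migration option instead.
print(round(transfer_days(50, 1000), 1))
```

If the estimate runs to weeks, a dedicated link or a physical-media service is usually the better fit; if it is hours, ordinary replication over the network will do.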

6. Plan how to manage access: Consider how access to the cloud will be managed once the migration is complete. Users and teams can easily create their own environments from pre-built templates, rather than relying on a cloud team to build them (this has the added benefit of eliminating bottlenecks and speeding up release cycles considerably). If management is concerned that a lack of oversight will lead to unanticipated costs, it may make sense to set up usage quotas for groups or departments and track usage so that no one runs up the bill by accident.
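The quota-tracking idea in step 6 can be sketched in a few lines. Most cloud platforms expose budgets and alerts natively; the department names, dollar figures, and 90% warning threshold here are purely hypothetical stand-ins for that kind of policy.

```python
# Minimal sketch of per-department usage quotas with an early-warning
# threshold. Department names, budgets, and spend are hypothetical.

quotas = {"engineering": 5000.0, "qa": 1500.0}  # monthly budget (dollars)
usage = {"engineering": 4600.0, "qa": 300.0}    # spend so far this month

def over_threshold(quotas, usage, warn_at=0.9):
    """Return departments that have used warn_at or more of their quota."""
    return sorted(
        dept for dept, limit in quotas.items()
        if usage.get(dept, 0.0) / limit >= warn_at
    )

print(over_threshold(quotas, usage))  # engineering is at 92% of its budget
```

In practice this logic would live in the cloud platform's own budgeting or tagging features rather than a script, but the policy question (who gets warned, and at what fraction of budget) is the part worth deciding during planning.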

As with all large projects, resist the urge to rush through the planning and preparation phases of a cloud migration; they are vital to ensuring that the whole project runs smoothly. Working through these steps will set IT up for success, no matter how complex the applications being migrated are.

Tony Perez is a Cloud Solutions Architect at Skytap