Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 2
May 12, 2020

Rich Weber
Panzura

There are multiple steps an organization can take to adapt to the new normal of WFH, a new normal that will change IT forever: the way it is used, implemented, and valued. This is the frontline where remote working solutions and cloud platforms will be forged in the flames of necessity and demand.

Start with: Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 1

Start with the data

Data has gravity. It takes a long time to move and manage, so you need solutions that make getting the data into the cloud easy by moving it there in steps. Consider a hybrid model that puts the data in the cloud but accesses it through on-prem technology. A hybrid model can also allow the business to move selective workflows, applications, or users into the cloud gradually. It doesn't have to be a petabyte of data dumped into the cloud in one fell swoop; it can be moved steadily, one byte at a time. But the first bytes must be dictated by a clearly defined strategy that allows the business to eventually move everything else.
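As a rough illustration of migrating in steps rather than all at once, the sketch below batches workflows into priority tiers and moves one tier at a time. The tier names and directory layout are hypothetical, and a local copy stands in for a real object-store upload; this is a sketch of the staged approach, not any specific product's tooling.

```python
import shutil
from pathlib import Path

# Hypothetical priority tiers: which workflows move to the cloud first.
MIGRATION_TIERS = {
    1: ["shared-docs"],    # broadly used, low risk: move first
    2: ["design-files"],   # large design/media data: move next
    3: ["archives"],       # cold data: move last
}

def migrate_in_steps(on_prem: Path, cloud: Path, up_to_tier: int) -> list[str]:
    """Copy workflow folders to the 'cloud' one tier at a time.

    `cloud` is a local stand-in for an object store; a real migration
    would upload via the provider's SDK instead of shutil.copytree.
    """
    moved = []
    for tier in sorted(MIGRATION_TIERS):
        if tier > up_to_tier:
            break  # later tiers wait for a future migration step
        for workflow in MIGRATION_TIERS[tier]:
            src = on_prem / workflow
            if src.exists():
                shutil.copytree(src, cloud / workflow, dirs_exist_ok=True)
                moved.append(workflow)
    return moved
```

Running it with `up_to_tier=1` today and `up_to_tier=2` next quarter is the "one byte at a time" idea: each step is small, but the tier table is the clearly defined strategy that eventually moves everything.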

Preserve your workflow

Don't change your security paradigm or how users access and authenticate to the data. Preserving that model is absolutely critical, as is ensuring the data itself is secure: it has to be locked down, encrypted, and managed with the risks factored in.

Ensure data availability

Business users work with unstructured data, and the challenge is ensuring they have access to the same data at home as they did in the office. This requires getting the data into the cloud and using technology that pushes it closer to the end user, whether that is a site or a cloud region near where users work. Then, you need to extend those desktops to a home or remote location in a performant way. The data needs a ubiquitous access layer that makes it accessible across multiple geographies and time zones. This means that whether the demand is a simple document retrieval or a massive design file pulled into a high-performance collaborative workstation, the performance is the same.

First, consider cloud VDI to manage the high-performance requirements, as it allows you to extend a powerful workstation to a tablet in a coffee shop; the technology is there. To make the data a ubiquitous access layer, you can use a cloud file solution that makes the same data accessible in real time. The combination of the two creates the sweet spot for remote work. You have to make access to data fast, or you'll only solve one problem while creating others, such as data collisions and difficulties with search.

Put a filer into any compute cloud

This allows users and applications to have access to the same data in any compute cloud. When you move applications and workflows into the cloud, that cloud is no different from a hybrid on-premises site. When the enterprise reads data between clouds, it's the same as reading it over the internet, which can incur high storage and usage costs. A filer caches everything locally, which removes the need for remote cloud reads, immediately saving money on charges and reducing latency.
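The cost-saving mechanism here is a read-through cache. The minimal sketch below is a generic illustration of that pattern, not Panzura's implementation: the `fetch_from_cloud` callable stands in for an object-store GET, and counting remote reads shows how repeat reads of the same file stop incurring egress charges and round-trip latency once the file is cached locally.

```python
from pathlib import Path

class FilerCache:
    """Read-through cache: serve files locally after the first cloud read."""

    def __init__(self, cache_dir: Path, fetch_from_cloud):
        self.cache_dir = cache_dir
        self.fetch_from_cloud = fetch_from_cloud  # stand-in for a remote GET
        self.remote_reads = 0                     # tracks billable cloud reads

    def read(self, key: str) -> bytes:
        local = self.cache_dir / key
        if local.exists():
            return local.read_bytes()       # cache hit: no cloud charge
        data = self.fetch_from_cloud(key)   # cache miss: one remote read
        self.remote_reads += 1
        local.parent.mkdir(parents=True, exist_ok=True)
        local.write_bytes(data)             # keep a local copy for next time
        return data
```

However many times users in that compute cloud open the same file, the provider sees only one cross-cloud read; every subsequent access is local.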

Reconsider your reluctance when it comes to a cloud-first strategy

If you didn't have this strategy to begin with, because you assumed the sun would always shine at your company, you may have to reconsider and start implementing post-haste.

WFH Status: It’s Complicated

Remote working isn't new. Traveling workers, full-time remote workers, part-time telecommuters — these roles have been steadily evolving and compounding year-on-year because organizations could see the advantages in terms of access to talent and employee productivity. However, until recently, most companies didn't have 100% of their workforce working from home, as they do today. Maybe 10-20% were granted that golden ticket. The infrastructure was in place for that 20%, and few corporate business continuity plans asked: what happens if we send everybody at every global location home at the same time?

Why would they? Disasters are typically localized. That has fundamentally changed. Today, the business has to look at its continuity plan and say, "I need a contingency for a global shutdown because this can happen again."

However, building that contingency to support 100% of the workforce changes the investment parameters. The business has had to ensure that its entire workforce can work from home and has invested in resources that allow for it. Now, what happens when the pandemic subsides? If the business drops back to 20% remote working, then it's a sunk investment.

Companies that have proven the WFH model works are very likely to adopt progressive remote working plans that leverage the architecture and the benefits that working from home brings. It may seem a dire and costly outlook in light of the economy and lost income, but these investments can help organizations save money. If integration is accessible, replication is designed for redundancy, and data is consolidated intelligently, then your business has invested in resiliency and in technology that will pay for itself.

Rich Weber is President of Panzura
