Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 2
May 12, 2020

Rich Weber
Panzura

There are several steps an organization can take to adapt to the new normal of WFH, a new normal that will change IT forever: the way it is used, implemented, and valued. This is the frontline where remote working solutions, cloud platforms, and supporting technologies will be forged in the flames of necessity and demand.

Start with: Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 1

Start with the data

Data has gravity. It takes a long time to move and manage, so you need solutions that make getting data into the cloud easy by moving it there in stages. Consider a hybrid model that puts the data in the cloud but accesses it through on-prem technology. A hybrid model can also allow the business to move selected workflows, applications, or users into the cloud gradually. It doesn't have to be a petabyte of data dumped into the cloud in one fell swoop; it can be moved steadily, one byte at a time. But the first bytes must be dictated by a clearly defined strategy that allows the business to move everything else eventually, as in the sketch below.
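
As a minimal sketch of what staged migration can look like, the following Python script uploads data to cloud object storage one prioritized batch at a time rather than all at once. The bucket name and share paths are hypothetical, and it assumes the AWS boto3 SDK, but the same pattern applies to any object store.

```python
import os
import boto3  # AWS SDK for Python; any object-store SDK works the same way

# Hypothetical values: replace with your own bucket and share paths.
BUCKET = "example-corp-migration"
# Migrate in priority order: active project data first, archives last.
BATCHES = ["/shares/active-projects", "/shares/departments", "/shares/archive"]

s3 = boto3.client("s3")

def upload_batch(root: str) -> None:
    """Upload one directory tree to the cloud, preserving its layout as keys."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            key = os.path.relpath(local_path, "/shares")
            s3.upload_file(local_path, BUCKET, key)
            print(f"moved {local_path} -> s3://{BUCKET}/{key}")

# One batch per maintenance window, not a petabyte in one fell swoop.
for batch in BATCHES:
    upload_batch(batch)
```

The batch order is where the clearly defined strategy lives: which workflows move first dictates how everything else follows.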

Preserve your workflow

Don't change your security paradigm or how users access and authenticate to the data. This is absolutely critical. The data has to remain locked down and encrypted, and the architecture has to factor in the risks that remote access introduces.
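
To make "encrypted" concrete, here is a hedged sketch: server-side encryption can be requested on every object written to the cloud tier, protecting data at rest without changing how users authenticate. The bucket and key names are hypothetical; the ServerSideEncryption parameter is part of the standard S3 API.

```python
import boto3

s3 = boto3.client("s3")  # data is encrypted in flight via TLS by default

# Hypothetical bucket and key; SSE encrypts the object at rest without
# changing how users authenticate or how applications read it back.
with open("/shares/finance/q2-forecast.xlsx", "rb") as f:
    s3.put_object(
        Bucket="example-corp-migration",
        Key="finance/q2-forecast.xlsx",
        Body=f,
        ServerSideEncryption="AES256",
    )
```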

Ensure data availability

Business users work with unstructured data, and the challenge is to ensure that they have access to the same data at home as they did in the office. This requires getting the data into the cloud and using technology that pushes it closer to the end user, whether that is a site or a cloud region near where the data is consumed. Then, you need to extend those desktops to a home or remote location in a performant way. The data has to sit behind a ubiquitous access layer that makes it accessible across multiple geographies and time zones. This means that whether the demand is a simple document retrieval or a massive design file pulled into a high-performance collaborative workstation, the performance is the same.

First, consider cloud VDI to manage the high-performance requirements, as it allows you to extend a powerful workstation to a tablet in a coffee shop. The technology is there. To make the data a ubiquitous access layer, you can use a cloud file solution that makes the same data accessible in real time. It's a matter of combining these technologies to create the remote-work sweet spot. You have to make access to data fast, or you'll only solve one problem while creating others, such as data collisions and difficulties with search.
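
One hedged way to think about "pushing data closer to the user" is simple latency probing: measure the round trip to each candidate region hosting a replica and serve the user from the nearest one. The endpoints below are hypothetical placeholders, not real services.

```python
import socket
import time

# Hypothetical region endpoints hosting replicas of the same data.
REGIONS = {
    "us-east": "data-us-east.example.com",
    "eu-west": "data-eu-west.example.com",
    "ap-south": "data-ap-south.example.com",
}

def rtt_ms(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Measure one TCP connect round trip to a host, in milliseconds."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000
    except OSError:
        return float("inf")  # unreachable regions lose the race

# Route the user to whichever replica answers fastest.
nearest = min(REGIONS, key=lambda r: rtt_ms(REGIONS[r]))
print(f"serving data from region: {nearest}")
```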

Put a filer into any compute cloud

This allows users and applications to have access to the same data in any compute cloud. When you move applications and workflows into the cloud, that cloud is no different from a hybrid on-premises site. When the enterprise reads data between clouds, it's the same as reading it over the internet, which can incur high storage and usage costs. A filer caches everything locally, which removes the need for a remote cloud read, immediately saving money on egress charges and reducing latency.
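
A minimal sketch of that caching idea, assuming the S3 API and hypothetical bucket and paths: read each object from the cloud once, keep a local copy, and satisfy every repeat read from disk, eliminating the egress charge and latency of a second remote read.

```python
import os
import boto3

BUCKET = "example-corp-migration"   # hypothetical bucket
CACHE_DIR = "/var/cache/filer"      # local cache, as a filer would keep

s3 = boto3.client("s3")

def read(key: str) -> bytes:
    """Read-through cache: first access hits the cloud, repeats hit disk."""
    local = os.path.join(CACHE_DIR, key.replace("/", "_"))
    if os.path.exists(local):       # cache hit: no egress charge, no latency
        with open(local, "rb") as f:
            return f.read()
    obj = s3.get_object(Bucket=BUCKET, Key=key)   # cache miss: one remote read
    data = obj["Body"].read()
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(local, "wb") as f:    # keep a copy for every later read
        f.write(data)
    return data
```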

Reconsider your reluctance when it comes to a cloud-first strategy

If you didn't have this strategy to begin with, if you said the sun always shines at your company, you may have to reconsider and start implementing one post-haste.

WFH Status: It’s Complicated

Remote working isn't new. Traveling workers, full-time remote workers, part-time telecommuters: these roles have been steadily evolving and compounding year-on-year because organizations could see the advantages in terms of access to talent and employee productivity. However, until recently, most companies didn't have 100% of their workforce working from home, as they do today. Maybe 10-20% were granted that golden ticket. The infrastructure was in place for that fraction, and few corporate business continuity plans asked: what happens if we send everybody at every global location home at the same time?

Why would they? Disasters are typically localized. Today, this has fundamentally changed. The business has to look at its continuity plan and say, "I need a contingency for a global shutdown because this can happen again."

However, building that contingency to support 100% of the workforce changes the investment parameters. The business has had to ensure that its entire workforce can work from home and has invested in resources that allow for it. Now, what happens when the pandemic subsides? If the business drops back to 20% remote working, then it's a sunk investment.

Companies that prove the WFH model works are very likely to now adopt progressive remote working plans that leverage the architecture and the benefits that working from home brings. It may seem a dire and costly outlook in light of the economy and lost income, but these investments can help organizations save money. If integration is accessible, replication is designed for redundancy, and data is consolidated intelligently, then your business has invested in resiliency and technology that will pay for itself.

Rich Weber is President of Panzura