Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 2
May 12, 2020

Rich Weber
Panzura

There are multiple steps an organization can take to adapt to the new normal of WFH, a new normal that will change IT forever: the way it is used, implemented, and valued. This is the frontline where remote working solutions and cloud platforms will be forged in the flames of necessity and demand.

Start with: Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 1

Start with the data

Data has gravity. It takes a long time to move and manage. So, you need solutions that make getting the data into the cloud easy by moving there in steps. Consider a hybrid model that puts the data in the cloud but accesses it through on-prem technology. A hybrid model can also allow the business to move selective workflows, applications, or users into the cloud gradually. It doesn't have to be a petabyte of data dumped into the cloud in one fell swoop; it can be moved steadily, one byte at a time. But the first bytes must be dictated by a clearly defined strategy that allows the business to move everything else eventually.
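The staged-migration idea above can be sketched in a few lines. This is a toy illustration, not any vendor's implementation: `plan_migration`, `migrate_in_batches`, and the `upload` callable are all hypothetical names standing in for whatever cloud SDK and prioritization policy the business actually uses.

```python
def plan_migration(files, priority):
    """Order files by a business-defined priority function, so the
    first bytes moved are the ones the strategy dictates."""
    return sorted(files, key=priority)

def migrate_in_batches(files, upload, batch_size=100):
    """Move data to the cloud in steps rather than one bulk dump.
    `upload` stands in for a real cloud SDK call (e.g. an object PUT)."""
    moved = []
    for i in range(0, len(files), batch_size):
        for f in files[i:i + batch_size]:
            upload(f)
            moved.append(f)
        # Real code would verify checksums, throttle bandwidth,
        # and checkpoint progress between batches here.
    return moved
```

For example, prioritizing active project files over cold archives means the business can start working from the cloud copy long before the archive tier finishes moving.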

Preserve your workflow

Don't change your security paradigm or how users access and authenticate to the data. This is absolutely critical, as is ensuring the data itself is secure: it has to be locked down, encrypted, and managed with the risks factored in.

Ensure data availability

Business users work with unstructured data, and the challenge is to ensure that they have access to the same data at home as they did in the office. This requires getting the data into the cloud and using technology that pushes it closer to the end user; that could be a site or a cloud region positioned close to the users who need the data. Then, you need to extend those desktops to a home or remote location in a performant way. The data has to sit behind a ubiquitous access layer that makes it accessible across multiple geographies and time zones. This means that whether the demand is a simple document retrieval or a massive design file from a high-performance collaborative workstation, the performance is the same.

First, consider cloud VDI to manage the high-performance requirements, as it allows you to extend a powerful workstation to a tablet in a coffee shop. The technology is there. To make the data a ubiquitous access layer, use a cloud file solution that makes the same data accessible in real time. It's a matter of taking advantage of the technology available and leveraging it to create the sweet spot for work. You have to make access to data fast, or you'll solve one problem while creating others, such as data collisions and difficulties with search.

Put a filer into any compute cloud

This allows users and applications to have access to the same data in any compute cloud. When you move applications and workflows into the cloud, that cloud is no different from a hybrid on-premises site. When the enterprise reads data between clouds, it's the same as reading it over the internet, which can incur high costs across storage and usage. A filer caches everything locally, which removes the need to do a remote cloud read; that immediately saves money on charges and reduces latency.
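The economics described above are those of a read-through cache: the remote (billable, high-latency) read happens only on the first access, and every subsequent read is served locally. A minimal sketch of that idea, with `Filer` and `remote_read` as hypothetical names rather than any real filer's API:

```python
class Filer:
    """Read-through cache sketch: remote reads (and their egress
    charges) are incurred only on the first access to each object."""

    def __init__(self, remote_read):
        self.remote_read = remote_read  # stands in for an object-store GET
        self.cache = {}                 # local cache keyed by object name
        self.remote_reads = 0           # counts billable remote fetches

    def read(self, key):
        if key not in self.cache:
            # Cache miss: pay the egress/latency cost once, then keep a copy.
            self.cache[key] = self.remote_read(key)
            self.remote_reads += 1
        return self.cache[key]
```

A real filer also handles invalidation and write-back so that cached copies stay consistent across sites, which is the hard part this sketch deliberately omits.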

Reconsider your reluctance when it comes to a cloud-first strategy

If you didn't have this strategy to begin with, if you assumed the sun would always shine on your company, you may have to reconsider and start implementing one post-haste.

WFH Status: It’s Complicated

Remote working isn't new. Traveling workers, full-time remote workers, part-time telecommuters — these roles have been steadily evolving and compounding year-on-year because organizations could see the advantages in terms of access to talent and employee productivity. However, until recently, most companies didn't have 100% of their workforce working from home, as they do today. Maybe 10-20% were granted that golden ticket. The infrastructure was in place for that 20%, and few corporate business continuity plans asked: what happens if we send everybody at every global location home at the same time?

Why would they? Disasters are typically localized. Today, this has fundamentally changed. Today, the business has to look at its continuity plan and say, "I need a contingency for a global shutdown because this can happen again."

However, building that contingency to support 100% of the workforce changes the investment parameters. The business has had to ensure that its entire workforce can work from home and has invested in resources that allow for it. Now, what happens when the pandemic subsides? If the business drops back to 20% remote working, then it's a sunk investment.

Companies that prove the WFH model works are very likely to now adopt progressive remote working plans that leverage the architecture and the benefits that working from home brings. It may seem a dire and costly outlook in light of the economy and lost income, but these investments can help organizations save money. If integration is accessible, replication is designed for redundancy, and data is consolidated intelligently, then your business has invested in resiliency and technology that will pay for itself.

Rich Weber is President of Panzura
