Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 2
May 12, 2020

Rich Weber
Panzura

There are multiple steps an organization can take to adapt to the new normal of WFH. It is a new normal that will change IT forever: the way it is used, implemented, and valued. This is the frontline where remote working solutions, cloud platforms, and supporting technologies will be forged in the flames of necessity and demand.

Start with: Rebuilding the Post-Pandemic Architecture for Remote Workers - Part 1

Start with the data

Data has gravity. It takes a long time to move and manage. So, you need solutions that make getting the data into the cloud easy by moving there in steps. Consider a hybrid model that puts the data in the cloud but accesses it through on-prem technology. A hybrid model can also allow the business to move selective workflows, applications, or users into the cloud gradually. It doesn't have to be a petabyte of data dumped into the cloud in one fell swoop; it can be moved steadily, one step at a time. But the first bytes must be dictated by a clearly defined strategy that allows the business to move everything else eventually.
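The staged-migration idea above can be sketched in code. This is a hypothetical illustration, not any vendor's tool: files are grouped into named phases by a strategy (e.g. active workflows first), and each phase is moved independently so the business can pause, verify, and continue.

```python
# Hypothetical sketch of a staged cloud migration plan. The phase names,
# file paths, and the `upload` callable are all illustrative assumptions;
# in a real system `upload` would wrap a cloud SDK call.
from dataclasses import dataclass, field

@dataclass
class MigrationPlan:
    # phase name -> file paths scheduled for that phase
    phases: dict = field(default_factory=dict)
    # paths that have already been moved into the cloud
    migrated: list = field(default_factory=list)

    def add_phase(self, name, paths):
        self.phases[name] = list(paths)

    def run_phase(self, name, upload):
        """Move one phase into the cloud via the supplied upload callable."""
        moved = 0
        for path in self.phases.get(name, []):
            upload(path)                 # one step at a time, never a bulk dump
            self.migrated.append(path)
            moved += 1
        return moved

# Usage: the strategy dictates which bytes move first.
plan = MigrationPlan()
plan.add_phase("active-projects", ["proj/a.dwg", "proj/b.dwg"])
plan.add_phase("archive", ["old/2016.zip"])
uploaded = []
plan.run_phase("active-projects", uploaded.append)
print(uploaded)
```

The point of the phase boundary is that the "archive" data can wait until the high-value workflows are proven in the cloud.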

Preserve your workflow

Don't change your security paradigm or the way users access and authenticate to data. This is absolutely critical, as is ensuring the data is secure: it has to be locked down, encrypted, and managed with the risks factored in.

Ensure data availability

Business users work with unstructured data, and the challenge is to ensure that they have access to the same data at home as they did in the office. This requires getting the data into the cloud and using technology that pushes it closer to the end user. That can mean a site or a cloud region positioned close to where the user's data is needed. Then, you need to extend those desktops to a home or remote location in a performant way. The data has to sit behind a ubiquitous access layer that makes it accessible across multiple geographies and time zones. This means that whether the demand is a simple document retrieval or a massive design file from a high-performance collaborative workstation, the performance is the same.

First, consider cloud VDI to manage the high-performance requirements, as it allows you to extend a powerful workstation to a tablet in a coffee shop. The technology is there. To make the data a ubiquitous access layer, you can use a cloud file solution that makes the same data accessible in real time. Combining the two creates the remote-work sweet spot. You have to make access to data fast, or you'll only solve one problem while creating others, such as data collisions and difficulties with search.

Put a filer into any compute cloud

This allows users and applications to access the same data in any compute cloud. When you move applications and workflows into the cloud, that cloud is no different from a hybrid on-premises site. When the enterprise reads data between clouds, it's the same as reading it over the internet, which can incur high storage and usage costs. A filer caches everything locally, which removes the need for remote cloud reads, immediately saving money on egress charges and reducing latency.
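The caching behavior described above can be reduced to a small sketch. This is a minimal illustration of the general read-through-cache idea under assumed interfaces, not Panzura's implementation: reads are served from a local cache, and only a miss triggers a remote cloud read, so repeat reads incur no egress cost or WAN latency.

```python
# Minimal read-through cache sketch. `remote_read` stands in for a
# cloud object read (an assumption for illustration); the dict stands
# in for the filer's local cache.
class CachingFiler:
    def __init__(self, remote_read):
        self.remote_read = remote_read   # callable: path -> bytes
        self.cache = {}                  # local cache: path -> bytes
        self.remote_hits = 0             # how many reads actually went remote

    def read(self, path):
        if path not in self.cache:       # miss: one remote read, then cached
            self.cache[path] = self.remote_read(path)
            self.remote_hits += 1
        return self.cache[path]          # hit: served locally, no egress

# Usage: two reads of the same file cost only one remote round trip.
filer = CachingFiler(lambda p: b"contents of " + p.encode())
filer.read("designs/site.dwg")
filer.read("designs/site.dwg")
print(filer.remote_hits)  # 1
```

The savings compound with team size: every user at a site reading the same design file after the first pays the local-cache price, not the cloud-egress price.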

Reconsider your reluctance when it comes to a cloud-first strategy

If you didn't have this strategy to begin with, if you insisted the sun always shines at your company, you may have to reconsider and start implementing post-haste.

WFH Status: It’s Complicated

Remote working isn't new. Traveling workers, full-time remote workers, part-time telecommuters — these roles have been steadily evolving and compounding year-on-year because organizations could see the advantages in terms of access to talent and employee productivity. However, until recently, most companies didn't have 100% of their workforce working from home, as they do today. Maybe 10-20% were granted that golden ticket. The infrastructure was in place for that 20%, and few corporate business continuity plans asked: what happens if we send everybody at every global location home at the same time?

Why would they? Disasters are typically localized. Today, this has fundamentally changed. Today, the business has to look at its continuity plan and say, "I need a contingency for a global shutdown because this can happen again."

However, building that contingency to support 100% of the workforce changes the investment parameters. The business has had to ensure that its entire workforce can work from home and has invested in resources that allow for it. Now, what happens when the pandemic subsides? If the business drops back to 20% remote working, then it's a sunk investment.

Companies that prove the WFH model works are very likely to now adopt progressive remote working plans that leverage the architecture and the benefits that working from home brings. It may seem a dire and costly outlook in light of the economy and lost income, but these investments can help organizations save money. If integration is accessible, replication is designed for redundancy, and data is consolidated intelligently, then your business has invested in resiliency and technology that will pay for itself.

Rich Weber is President of Panzura