DNS Load Balancing is DevOps' Secret Weapon
October 04, 2017

Steven Job
Tiggee

Load balancing at the DNS (Domain Name System) level has been around for a few decades, but it didn't become crucial until recently, as technology moved to the cloud. DNS is the perfect solution for managing cloud systems because it operates independently of hosting providers: DNS records can be configured through a third-party provider to control how much traffic, and what kinds, reach certain endpoints.

With the growth of cloud-based services, infrastructure is increasingly managed as code rather than as hardware in a data center. That means altering a single DNS record can potentially knock your application or website offline, something that has happened more than once.

Conversely, you can leverage DNS records to optimize the traffic flowing to your domains or servers. GeoDNS and network monitoring can supercharge traditional DNS management, paving the way for automation.

Automated Load Balancing

The latest craze in both SaaS and DevOps is automation of everything from chatbots to routine tasks. The DNS industry has offered basic automation for roughly a decade now in the form of DNS failover, a service that automatically reroutes traffic away from unresponsive endpoints to healthy ones.
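
To make the idea concrete, here is a minimal sketch of the decision logic behind DNS failover: a health check probes the primary endpoint, and queries are answered with the secondary only when the primary stops responding. The IP addresses and the /health path are placeholders, and in practice your DNS provider runs these checks and swaps the record for you.

```python
import urllib.request

PRIMARY = "203.0.113.10"    # documentation IPs (RFC 5737), stand-ins for real servers
SECONDARY = "203.0.113.20"

def is_healthy(ip: str, timeout: float = 2.0) -> bool:
    """Probe the endpoint over plain HTTP; any response counts as healthy."""
    try:
        urllib.request.urlopen(f"http://{ip}/health", timeout=timeout)
        return True
    except Exception:
        return False

def answer_for_query() -> str:
    """Serve the primary while it responds, otherwise fail over to the secondary."""
    return PRIMARY if is_healthy(PRIMARY) else SECONDARY

print(answer_for_query())
```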

DNS load balancing uses similar techniques to test the availability and performance of endpoints. But load balancing also allows you to send traffic to more than one endpoint simultaneously. You can even set different weights for each endpoint. Load balancing is commonly used by organizations that want to use more than one vendor, say for a multi-CDN implementation.
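
Here is a rough sketch of what weighted answers look like, using made-up endpoints and weights: each incoming query is resolved to one of the pooled IPs in proportion to its weight, which is how traffic gets split across vendors.

```python
import random
from collections import Counter

# endpoint IP -> weight (hypothetical pool)
POOL = {
    "198.51.100.10": 3,   # e.g. primary CDN, ~60% of answers
    "198.51.100.20": 2,   # e.g. secondary CDN, ~40% of answers
}

def pick_endpoint() -> str:
    """Resolve one query: pick an endpoint in proportion to its weight."""
    ips, weights = zip(*POOL.items())
    return random.choices(ips, weights=weights, k=1)[0]

# Over many queries the split approaches the configured 60/40 ratio.
print(Counter(pick_endpoint() for _ in range(10_000)))
```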

This method offers the flexibility to use more than one provider and take advantage of different service offerings. For example, you may prefer a particular CDN for video streaming, but it may not perform well in some regions. DNS load balancing lets you serve each vendor only in the regions where it performs strongest.

You can even use load balancing to cut costs! Most vendors charge drastically different prices depending on the region, but you can work around this by creating location-specific rules that favor lower-cost providers. Using more than one vendor also reduces the risk of a single-provider outage.
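
As a sketch of how such location-specific rules might be expressed (the vendor hostnames, prices, and latency figures below are invented for illustration), each region picks the cheapest CDN that still meets a latency target, falling back to whatever is available if nothing qualifies.

```python
LATENCY_TARGET_MS = 80  # acceptable latency for end-users in a region

# region -> list of (CDN hostname, price per GB, measured latency in ms); all illustrative
VENDORS = {
    "us":   [("cdn-a.example.net", 0.02, 45), ("cdn-b.example.net", 0.05, 30)],
    "eu":   [("cdn-a.example.net", 0.04, 120), ("cdn-b.example.net", 0.06, 50)],
    "apac": [("cdn-a.example.net", 0.03, 140), ("cdn-b.example.net", 0.09, 70)],
}

def cname_for(region: str) -> str:
    """Return the cheapest vendor that meets the latency target for this region."""
    candidates = [v for v in VENDORS[region] if v[2] <= LATENCY_TARGET_MS]
    candidates = candidates or VENDORS[region]       # fall back if nobody qualifies
    return min(candidates, key=lambda v: v[1])[0]

for region in VENDORS:
    print(region, "->", cname_for(region))
```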

Cloud Migration

Load balancing is a valuable asset during migrations, whether you're moving to more cloud-based systems or rolling out something new.

A well-planned strategy can ensure you maintain availability and limit performance degradation during the migration. You can use record pools, which are groups of endpoints that are served to users, and slowly increase the traffic sent to your cloud endpoints. If something goes wrong, only a subset of your end-users will be affected, and you can easily roll back your changes to a previous version.
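
A minimal sketch of that ramp-up, with placeholder endpoints: the record pool starts by sending a small share of answers to the cloud endpoint and raises the share in steps, and rolling back is simply a matter of re-applying the earlier weights.

```python
import random

LEGACY = "192.0.2.10"   # on-premises endpoint (documentation IP)
CLOUD = "192.0.2.50"    # new cloud endpoint (documentation IP)

def pool(cloud_share: float) -> dict:
    """Record pool sending the given fraction of answers to the cloud endpoint."""
    return {CLOUD: cloud_share, LEGACY: 1.0 - cloud_share}

def resolve(p: dict) -> str:
    """Answer one query according to the pool's weights."""
    ips, weights = zip(*p.items())
    return random.choices(ips, weights=weights, k=1)[0]

# Ramp traffic in stages; rolling back is just re-applying an earlier pool, e.g. pool(0.1).
for share in (0.1, 0.5, 1.0):
    print(f"{int(share * 100)}% to cloud ->", resolve(pool(share)))
```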

Roll Out

You can combine the same strategy with GeoDNS features to slowly roll out an application or feature to new audiences. GeoDNS services like GeoProximity and IP Filters allow you to create unique rules that dictate how your end-users are answered based on their location, ASN, or IP address.

Let's say you have a new app you want to roll out to your US users first and then to your European users. You can create an IP Filter for US-based users that returns the server where the application is hosted. Just make sure you also have a rule for "world" applied to a record that sends everyone else to a different endpoint.
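
Here is a small sketch of that rule set with placeholder addresses: a country-scoped rule for US users is evaluated first, and the "world" record catches everyone else. A real GeoDNS service determines the querier's location from its IP or EDNS Client Subnet; here the country code is simply passed in.

```python
# Rules are evaluated top to bottom; "world" acts as the catch-all record.
RULES = [
    ("US", "203.0.113.80"),     # new application server, US users only (placeholder IP)
    ("world", "203.0.113.90"),  # existing endpoint for everyone else (placeholder IP)
]

def answer(client_country: str) -> str:
    """Return the IP a client in the given country would be answered with."""
    for scope, ip in RULES:
        if scope == "world" or scope == client_country:
            return ip
    raise RuntimeError("rule set must end with a 'world' catch-all")

print(answer("US"))  # 203.0.113.80 (new app)
print(answer("DE"))  # 203.0.113.90 (world rule)
```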

The Big Picture

As the internet grows, the world gets smaller and organizations need to maintain performance no matter where their end-users are. DNS load balancing offers easy scalability and unparalleled customization. Now is the best time for DevOps teams to begin implementation, before the demand catches up.

Steven Job is President and Founder of Tiggee, the parent company of DNS Made Easy and Constellix.