Leveraging Java Virtual Machines for Cloud-Native Enterprises
November 02, 2022

Simon Ritter

For at least the last ten years, every year has been proclaimed the year of Java's demise. Still, Java is here, and it is as strong as ever. Not only that, it sits comfortably alongside state-of-the-art technologies such as clouds, containers, and resource managers.

However, running Java, or any other JVM language, in the cloud comes with hurdles. Microservice architectures, rapid scaling operations (up or down) to balance load and cost, and environments such as Kubernetes (k8s) each bring their own complexity for Java.

Java and the Cloud

One of the biggest obstacles with Java-based microservices is the slow startup time of Java applications. Much of this can be mitigated by using microservice frameworks such as Quarkus, but the JVM still has heavy lifting to do with class loading and initial Just-in-Time (JIT) compilation.

When talking about startup time and Java, it would be more accurate to say warmup time. Warming up a JVM means feeding real-world workloads to the application so the JVM collects enough profiling data to optimize the application for the highest speed with that specific performance profile. This adds to the loading time of the JVM and the application itself, and it happens every single time you start the application, even if an instance of the same application is already running.
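The warmup effect is easy to observe in any JVM. Here is a minimal, self-contained sketch: the same method is timed over several rounds, and the early rounds typically run slower because the code is still being interpreted and profiled before the JIT compiler produces optimized machine code. The class and method names are illustrative, not from the article.

```java
// WarmupDemo: a minimal sketch of JIT warmup. The same workload gets
// faster as the JIT compiler gathers profiling data and compiles the
// hot method to optimized machine code.
public class WarmupDemo {

    // A hot method the JIT will eventually compile.
    static long sumOfSquares(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += (long) i * i;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Each round runs the identical workload; early rounds execute
        // mostly interpreted bytecode, later rounds run compiled code.
        for (int round = 1; round <= 5; round++) {
            long start = System.nanoTime();
            long result = sumOfSquares(10_000_000);
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("round " + round + ": " + elapsedMs
                    + " ms (result " + result + ")");
        }
        // Typically the first round is noticeably slower than the last,
        // and this warmup cost is paid again on every fresh JVM start.
    }
}
```

Running this twice in the same JVM would show the second run starting fast; running it in a fresh JVM repeats the slow first rounds, which is exactly the per-instance warmup cost described above.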

It also works against the now-popular "let it crash" mentality. For decades we, as engineers, learned to handle exception and failure scenarios, and everyone knows how hard it can be to recover from these issues and get the application back into a meaningful, consistent state. The opposite approach is simply to let the application crash and restart it. With the JVM, however, this brings us right back to the warmup-time issue.

Likewise, quickly and/or frequently scaling up and down is problematic in the same way. When running in the cloud, one of the major cost-efficiency factors is keeping provisioned resources consistent with the current load profile: scaling up when load rises and scaling down when it falls. Introducing an additional instance of our Java application causes a delay. The warmup delay may be just a minute, but some applications take much longer before they are ready to serve requests at full speed.
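A common workaround in Kubernetes is to keep a freshly scaled-up instance out of the load balancer until it has warmed itself up, by wiring the readiness probe to a warmup routine. The sketch below assumes a hypothetical hot path (`handleRequest`), port, and endpoint path; it uses only the JDK's built-in `com.sun.net.httpserver` API and is an illustration of the pattern, not Azul's or anyone's product code.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.util.concurrent.atomic.AtomicBoolean;

public class ReadinessAfterWarmup {
    static final AtomicBoolean warmedUp = new AtomicBoolean(false);

    // Hypothetical hot path we want the JIT to compile before real traffic arrives.
    static long handleRequest(int n) {
        long h = 17;
        for (int i = 0; i < n; i++) {
            h = h * 31 + i;
        }
        return h;
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        // The Kubernetes readiness probe polls /ready; we answer 503 until warm,
        // so the orchestrator does not route traffic to a cold instance.
        server.createContext("/ready", exchange -> {
            int status = warmedUp.get() ? 200 : 503;
            byte[] body = (status == 200 ? "ready" : "warming up").getBytes();
            exchange.sendResponseHeaders(status, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();

        // Drive synthetic load through the hot path so the JIT compiles it
        // before this instance reports itself ready.
        for (int i = 0; i < 50_000; i++) {
            handleRequest(1_000);
        }
        warmedUp.set(true);
        System.out.println("instance warmed up and ready");
    }
}
```

Note that this pattern only hides the warmup from callers; the instance still burns CPU time and delays its own availability on every start, which is the cost a shared cloud compiler aims to remove.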

The Cloud Native Compiler (CNC) Makes Compilation More Efficient

This issue can be mitigated by sharing code and JIT compilation results, moving the compiler outside of the JVM. Such compilers exist, including the Azul Cloud Native Compiler (CNC).

The JIT compiler runs as its own process, deployed as a separate pod in Kubernetes and available to all running JVM instances. Instead of each JVM performing the same profiling and compilation separately, a compiler in the cloud executes it once, caches the result, and makes it available to multiple JVMs at once, preventing recompilation.

Most importantly, a cloud compiler solves the issue of slow JVM warmups. The profile of the application is known from previous instances, so a newly started JVM has instant access to code already compiled according to that known profile. No JIT compiling. No warming up.

That doesn't just save precious startup time; it also avoids wasting CPU and RAM on the same compilation work every single time.

Less Anxiety with Java and the Cloud

What this all comes down to is less anxiety when running Java microservices in the cloud. There is no issue with fast scaling of services, and thanks to the pre-cached and pre-compiled Java classes, we save valuable resources for what our containers are meant to be used for — the service operation.

That in turn means easier maintainability (no warmup processes), easier deployment processes, and cost cutting on many levels (people hours, resources, time). The speed and volume of transactions demand the cloud's limitless resources, and cloud-native JVMs harness the cloud to great effect.

Simon Ritter is a Java Champion and Deputy CTO at Azul
