For at least the last decade, every year has been declared the year of Java's demise. Still, Java is here and as strong as ever. Not only that, it also sits comfortably alongside state-of-the-art technologies such as cloud platforms, containers, and resource managers.
However, running Java, or any other JVM language, in the cloud comes with hurdles. Microservice architectures, rapid scaling operations (up or down) to balance load requirements against cost, and environments such as Kubernetes (k8s) all bring their own complexity for Java.
Java and the Cloud
That said, one of the biggest obstacles with Java-based microservices is the slow startup time of Java applications. A lot of this can be mitigated by using microservice frameworks like Quarkus, but the JVM still has to do the heavy lifting of class loading and initial Just-in-Time (JIT) compilation.
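To get a feel for how much startup cost a given service actually pays, a rough measurement is often enough. The following is a minimal sketch using only the standard JDK (the class name is a placeholder, not part of any framework); it reports how long the JVM took to reach the application's entry point.

```java
import java.lang.management.ManagementFactory;

public class StartupTimer {
    public static void main(String[] args) {
        // Epoch millisecond timestamp of the moment the JVM process was started.
        long jvmStart = ManagementFactory.getRuntimeMXBean().getStartTime();

        // Everything between JVM start and this point is JVM bootstrap, class
        // loading, and early JIT work that happened before our code ran.
        long reachedMain = System.currentTimeMillis() - jvmStart;
        System.out.printf("Reached main() %d ms after JVM start%n", reachedMain);
    }
}
```

In a real microservice you would log the same delta at the point where the first request can actually be served, since that is the number scaling decisions care about.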
When talking about startup time and Java, it would be more accurate to say warmup time. Warming up a JVM means feeding real-world workloads to the application so that the JVM gathers enough profiling data to optimize the code for the highest speed under that specific performance profile. This adds to the load time of the JVM and the application itself, and it happens every single time you start the application, even if you already have an instance of the same application running.
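The effect is easy to observe without a full framework. Below is a minimal, illustrative sketch (naive timing, not a proper JMH benchmark, and the numbers will vary by machine and JVM): the same method is called repeatedly, and later rounds are typically much faster once the JIT has profiled and compiled the hot loop.

```java
public class WarmupDemo {
    // A small numeric workload that the JIT will eventually compile and optimize.
    static long work(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += (i % 3 == 0) ? i * 2L : i;
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int round = 1; round <= 10; round++) {
            long start = System.nanoTime();
            long result = work(5_000_000);
            long micros = (System.nanoTime() - start) / 1_000;
            // Early rounds tend to run interpreted or only partially compiled;
            // later rounds use the optimized machine code and finish faster.
            System.out.printf("round %2d: %6d us (result=%d)%n", round, micros, result);
        }
    }
}
```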
It also works against the newly acquired "let it crash" mentality. For decades we, as engineers, learned to handle exceptions and failure scenarios. Everyone knows how hard it can be to recover from these issues and get the application back into a meaningful and consistent state. The opposite of that sentiment is to simply let the application crash and restart it. With the JVM, however, this brings us right back to the warmup time issue.
We see a similar problem with quick and/or frequent scaling. When running in the cloud, one of the major cost-efficiency factors is keeping the provisioned service resources in line with the current load profile: scaling up when we see higher loads and scaling down when the load eases. Introducing an additional instance of our Java application causes a delay. The warmup delay can be just a minute, but some applications take much longer to warm up and be ready to serve requests at full speed.
The Cloud Native Compiler (CNC) Makes Compilation More Efficient
This issue can be mitigated by moving the JIT compiler outside of the JVM and sharing the code and compilation results across instances. Such compilers exist, including the Azul Cloud Native Compiler (CNC).
The JIT compiler is its own process, deployed as a separate pod in Kubernetes, available to all running JVM instances. Instead of having the same profiling operation and compilation executed in each JVM separately, a compiler in the cloud executes it once, caches the result, and makes it available to multiple JVMs at once to prevent recompilation.
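To make the "compile once, cache, serve many" idea concrete, here is a conceptual sketch. Every name in it is invented for illustration; this is not Azul CNC's actual API or wire protocol, just the caching pattern the paragraph describes, expressed in plain Java.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Conceptual illustration only: compile a hot method once, cache the result,
// and hand the cached optimized code to every client that asks for the same
// method under the same profile.
public class SharedCompileCache {

    // Key: method identity plus the profile it was optimized for (both hypothetical).
    record CompileRequest(String methodSignature, String profileFingerprint) {}

    private final Map<CompileRequest, byte[]> cache = new ConcurrentHashMap<>();

    public byte[] compiledCodeFor(CompileRequest request) {
        // computeIfAbsent guarantees the expensive compilation runs at most once
        // per (method, profile) pair, even when many clients ask concurrently.
        return cache.computeIfAbsent(request, this::compileOnce);
    }

    private byte[] compileOnce(CompileRequest request) {
        // Placeholder for the real optimizing compilation step.
        return ("optimized:" + request.methodSignature()).getBytes();
    }
}
```

In the actual product the cache sits behind a network service that the JVMs talk to, but the core guarantee is the same: the expensive optimization for a given method and profile is paid once, not once per JVM instance.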
Most importantly, a cloud compiler solves the issue of slow JVM warmups. The profile of the application is known from previous instances. A newly started JVM has instant access to native code already compiled according to that known profile. No JIT compiling. No warming up.
That doesn't just save precious startup time; it also avoids burning CPU and RAM on the same compilation work every single time.
Less Anxiety with Java and the Cloud
What this all comes down to is less anxiety when running Java microservices in the cloud. Fast scaling of services is no longer an issue, and thanks to the pre-cached, pre-compiled code, we save valuable resources for what our containers are actually meant to do: run the service.
That in turn means easier maintainability (no warmup processes), simpler deployment processes, and cost savings on many levels (people hours, resources, time). The speed and volume of transactions demand the cloud's limitless resources, and cloud native JVMs harness the cloud to great effect.
Industry News
Progress Software announced the Q2 2025 release of Progress® Telerik® and Progress® Kendo UI®, the .NET and JavaScript UI libraries for modern application development.
Voltage Park announced the launch of its managed Kubernetes service.
Cobalt announced a set of powerful product enhancements within the Cobalt Offensive Security Platform aimed at helping customers scale security testing with greater clarity, automation, and control.
LambdaTest announced its partnership with Assembla, a cloud-based platform for version control and project management.
Salt Security unveiled Salt Illuminate, a platform that redefines how organizations adopt API security.
Workday announced a new unified, AI developer toolset to bring the power of Workday Illuminate directly into the hands of customer and partner developers, enabling them to easily customize and connect AI apps and agents on the Workday platform.
Pegasystems introduced Pega Agentic Process Fabric™, a service that orchestrates all AI agents and systems across an open agentic network for more reliable and accurate automation.
Fivetran announced that its Connector SDK now supports custom connectors for any data source.
Copado announced that Copado Robotic Testing is available in AWS Marketplace, a digital catalog with thousands of software listings from independent software vendors that make it easy to find, test, buy, and deploy software that runs on Amazon Web Services (AWS).
Check Point® Software Technologies Ltd. announced major advancements to its family of Quantum Force Security Gateways.
Sauce Labs announced the general availability of iOS 18 testing on its Virtual Device Cloud (VDC).
Infragistics announced the launch of Infragistics Ultimate 25.1, the company's flagship UX and UI product.
CIQ announced the creation of its Open Source Program Office (OSPO).
Check Point® Software Technologies Ltd. announced the launch of its next generation Quantum Smart-1 Management Appliances, delivering a 2X increase in managed gateways and up to 70% higher log rate, with AI-powered security tools designed to meet the demands of hybrid enterprises.