Cloudflare Adds New Capabilities to Workers AI
September 26, 2024

Cloudflare announced new capabilities for Workers AI, its serverless AI platform, and its suite of AI application building blocks, to help developers build faster, more capable, and more performant AI applications.

Applications built on Workers AI can now benefit from faster inference, bigger models, improved performance analytics, and more. Workers AI is positioned as the easiest platform for building global AI applications and running AI inference close to the user, no matter where in the world they are.

Cloudflare’s globally distributed network helps minimize network latency, setting it apart from networks built on resources concentrated in a limited number of data centers. Workers AI, Cloudflare’s serverless inference platform, now has GPUs in more than 180 cities around the world, built for global accessibility and low latency for end users. With this network of GPUs, Workers AI has one of the largest global footprints of any AI platform, and it has been designed to run AI inference as close to the user as possible and help keep customer data closer to home.
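Back-of-the-envelope physics illustrates why proximity matters for inference latency. The sketch below (an illustration, not Cloudflare measurement data) computes the lower bound that signal propagation in optical fiber places on round-trip time, before any routing or processing overhead:

```python
# Light travels at roughly 2/3 the speed of light in optical fiber,
# about 200,000 km/s. This gives a hard floor on round-trip latency.
SPEED_IN_FIBER_KM_S = 200_000

def min_round_trip_ms(distance_km: float) -> float:
    """Lower bound on network round-trip time over a fiber path of the
    given one-way distance, ignoring routing and processing delays."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

# A GPU on another continent (~8,000 km away) vs. one in a nearby
# city (~100 km): the difference is tens of milliseconds per round trip.
print(f"8,000 km: {min_round_trip_ms(8_000):.0f} ms")  # → 80 ms
print(f"  100 km: {min_round_trip_ms(100):.0f} ms")    # → 1 ms
```

Real requests typically make several round trips (TCP and TLS handshakes, then the request itself), so distance-driven delay compounds; serving inference from a nearby city shrinks every one of those trips.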

“As AI took off last year, no one was thinking about network speeds as a reason for AI latency, because it was still a novel, experimental interaction. But as we get closer to AI becoming a part of our daily lives, the network, and milliseconds, will matter,” said Matthew Prince, co-founder and CEO, Cloudflare. “As AI workloads shift from training to inference, performance and regional availability are going to be critical to supporting the next phase of AI. Cloudflare is the most global AI platform on the market, and having GPUs in cities around the world is going to be what takes AI from a novel toy to a part of our everyday life, just like faster Internet did for smartphones.”

Cloudflare is also introducing new capabilities intended to make it the easiest platform for building AI applications:

- Upgraded performance and support for larger models: Cloudflare is enhancing its global network with more powerful GPUs for Workers AI to upgrade AI inference performance and run inference on significantly larger models, such as Llama 3.1 70B and the Llama 3.2 collection of models at 1B, 3B, and 11B parameters (with 90B coming soon). By supporting larger models, faster response times, and larger context windows, AI applications built on Workers AI can handle more complex tasks with greater efficiency, creating natural, seamless end-user experiences.

- Improved monitoring and optimization of AI usage with persistent logs: New persistent logs in AI Gateway, available in open beta, allow developers to store users’ prompts and model responses for extended periods to better analyze and understand how their application performs. With persistent logs, developers can gain more detailed insights from users’ experiences, including the cost and duration of requests, to help refine their application. Over two billion requests have traveled through AI Gateway since its launch last year.

- Faster and more affordable queries: Vector databases make it easier for models to remember previous inputs, allowing machine learning to power search, recommendations, and text generation use cases. Cloudflare’s vector database, Vectorize, is now generally available, and as of August 2024 it supports indexes of up to five million vectors each, up from 200,000 previously. Median query latency is now down to 31 milliseconds (ms), compared to 549 ms. These improvements allow AI applications to find relevant information quickly with less data processing, which also means more affordable AI applications.
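To make the vector-database bullet concrete: a query scores stored embedding vectors against a query vector and returns the closest matches. The toy sketch below illustrates the idea with cosine similarity over a tiny in-memory index; it is a conceptual illustration only, not Vectorize's actual API, and the vectors and IDs are made up:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def query(index: dict[str, list[float]], vector: list[float], top_k: int = 3):
    """Return the top_k most similar (id, score) pairs, best first."""
    scored = [(vid, cosine(vector, v)) for vid, v in index.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Toy "index": IDs mapped to embeddings. Real embeddings have hundreds
# of dimensions, and a production index can hold millions of vectors.
index = {
    "doc-a": [0.9, 0.1, 0.0],
    "doc-b": [0.1, 0.9, 0.0],
    "doc-c": [0.8, 0.2, 0.1],
}
print(query(index, [1.0, 0.0, 0.0], top_k=2))  # doc-a and doc-c rank highest
```

A production vector database replaces this brute-force scan with approximate nearest-neighbor indexing, which is how latencies in the tens of milliseconds are achievable at millions of vectors.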
