Firebase Studio rolled out a Gemini integration, introducing new Gemini-powered features that help developers build applications with React, Flutter, and more.
Cloudflare announced that developers can now deploy AI applications on Cloudflare’s global network in one simple click directly from Hugging Face, an open and collaborative platform for AI builders.
With Workers AI now generally available, Cloudflare is the first serverless inference partner integrated on the Hugging Face Hub for deploying models. Developers can deploy AI globally, quickly, easily, and affordably, without managing infrastructure or paying for unused compute capacity.
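As an illustration of what serverless inference looks like in practice (this code is not from the announcement), the sketch below shows a Cloudflare Worker calling a model through a Workers AI binding. The binding name `AI` and the model slug are assumptions chosen for the example.

```typescript
export interface Env {
  // Workers AI binding, assumed to be configured in wrangler.toml as:
  //   [ai]
  //   binding = "AI"
  AI: any; // typed loosely for this sketch
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Run serverless inference on Cloudflare's GPU network. The model slug
    // below is one example open-source model; swap in whichever model you
    // deployed from the Hugging Face Hub.
    const result = await env.AI.run("@cf/meta/llama-2-7b-chat-int8", {
      prompt: "Explain serverless inference in one sentence.",
    });
    return Response.json(result);
  },
};
```

The Worker itself never provisions a GPU; the platform routes the request to inference capacity on Cloudflare's network.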
Despite significant strides in AI innovation, there is still a disconnect between its potential and the value it delivers to businesses. Organizations and their developers need to be able to experiment and iterate quickly and affordably, without having to set up, manage, or maintain GPUs or infrastructure. Businesses need a straightforward platform that unlocks speed, security, performance, observability, and compliance so they can bring innovative, production-ready applications to their customers faster.
“The recent generative AI boom has companies across industries investing massive amounts of time and money into AI. Some of it will work, but the real challenge of AI is that the demo is easy, but putting it into production is incredibly hard,” said Matthew Prince, CEO and co-founder, Cloudflare. “We can solve this by abstracting away the cost and complexity of building AI-powered apps. Workers AI is one of the most affordable and accessible solutions to run inference. And with Hugging Face and Cloudflare both deeply aligned in our efforts to democratize AI in a simple, affordable way, we’re giving developers the freedom and agility to choose a model and scale their AI apps from zero to global in an instant.”
Workers AI is generally available with GPUs now deployed in more than 150 cities globally
Workers AI provides the end-to-end infrastructure needed to scale and deploy AI models efficiently and affordably for the next era of AI applications. Cloudflare now has GPUs deployed across more than 150 cities globally, most recently launching in Cape Town, Durban, Johannesburg, and Lagos as its first locations in Africa, as well as Amman, Buenos Aires, Mexico City, Mumbai, New Delhi, and Seoul, to provide low-latency inference around the world. Workers AI is also expanding to support fine-tuned model weights, enabling organizations to build and deploy more specialized, domain-specific applications, as sketched below.
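The announcement does not spell out how fine-tuned weights are selected at inference time; one plausible shape, sketched here under stated assumptions, is passing the name of a previously uploaded adapter alongside the prompt. The LoRA-capable model slug, the adapter name, and the `lora` option are all assumptions for illustration, not confirmed API details.

```typescript
export interface Env {
  AI: any; // Workers AI binding, as in the earlier sketch
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Hypothetical: run a LoRA-capable base model with a previously uploaded
    // fine-tuned adapter. "my-domain-adapter" is an invented name, and the
    // "lora" option is an assumption about how the adapter is referenced.
    const result = await env.AI.run("@cf/mistral/mistral-7b-instruct-v0.2-lora", {
      prompt: "Summarize our returns policy for a customer.",
      lora: "my-domain-adapter",
    });
    return Response.json(result);
  },
};
```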
In addition to Workers AI, Cloudflare’s AI Gateway offers a control plane for AI applications, allowing developers to dynamically evaluate and route requests to different models and providers, and eventually to use their data to create fine-tunes and run fine-tuning jobs directly on the Workers AI platform.
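As a rough sketch of the control-plane idea, requests can be sent to an AI Gateway endpoint rather than to a provider directly, so changing the provider segment of the URL is enough to route the same traffic to a different backend. The account ID, gateway ID, and token below are placeholders, and the exact URL layout should be treated as an assumption.

```typescript
// Placeholders: substitute your own account ID, gateway ID, and API token.
const ACCOUNT_ID = "<account-id>";
const GATEWAY_ID = "<gateway-id>";
const API_TOKEN = "<api-token>";

// Requests go through the gateway rather than to a provider directly, so the
// gateway can observe, cache, and rate limit them; swapping the provider
// segment of the path (here "workers-ai") reroutes the same traffic.
const gatewayUrl =
  `https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY_ID}` +
  `/workers-ai/@cf/meta/llama-2-7b-chat-int8`;

const response = await fetch(gatewayUrl, {
  method: "POST",
  headers: {
    Authorization: `Bearer ${API_TOKEN}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ prompt: "What does an AI gateway do?" }),
});

console.log(await response.json());
```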
With Workers AI, developers can now deploy AI models in one click directly from Hugging Face, providing the fastest way to access a variety of models and run inference requests on Cloudflare’s global network of GPUs. Developers can choose one of the popular open source models and then simply click “Deploy to Cloudflare Workers AI” to deploy a model instantly. There are 14 curated Hugging Face models now optimized for Cloudflare’s global serverless inference platform, spanning three task categories: text generation, embeddings, and sentence similarity.
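Once a model is deployed, inference can also be driven over HTTP. The sketch below calls an embeddings model (one of the three task categories mentioned above) through the Workers AI REST endpoint; the account ID, API token, and model slug are assumed example values rather than details from the announcement.

```typescript
// Placeholders: substitute your own account ID and API token.
const ACCOUNT_ID = "<account-id>";
const API_TOKEN = "<api-token>";

// The embeddings task takes text and returns vector representations that can
// back search or sentence-similarity features; the model slug is one example.
const response = await fetch(
  `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/ai/run/@cf/baai/bge-base-en-v1.5`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text: ["Deploy AI models in one click."] }),
  }
);

console.log(await response.json());
```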
Industry News
Veracode announced a partnership with cloud security provider, Wiz, joining the Wiz Integration (WIN) platform.
Parasoft is adding new autonomous testing capabilities to its flagship testing solutions, including SOAtest, Virtualize, and CTP.
OpenText announced the launch of its Cloud Editions (CE) 25.3, a major release that helps organizations harness the power of AI, cloud, and cybersecurity to drive business outcomes.
CircleCI launched its Platform Team Toolkit, a comprehensive solution that eliminates the need for organizations to choose between organizational control and developer velocity.
Harness announced new DevOps capabilities for Harness AI.
Cycode announced the launch of its AI Exploitability Agent.
GitLab announced the public beta launch of GitLab Duo Agent Platform, a DevSecOps orchestration platform designed to unlock asynchronous collaboration between developers and AI agents.
Check Point® Software Technologies Ltd. announced it has been recognized as a Leader in The Forrester Wave™: Zero Trust Platforms, Q3 2025.
Superblocks announced the availability of the Superblocks Platform in the new AI Agents and Tools category of AWS Marketplace.
Kong announced Kong AI Gateway 3.11 with several new features critical in building modern and reliable AI agents in production.
Legit Security announced enhanced capabilities for significant code change and workflow orchestration within its platform.
Couchbase announced the availability of Couchbase Capella™ in the new AI Agents and Tools category of AWS Marketplace.
CloudBees announced the availability of the CloudBees Model Context Protocol (MCP) Server, the latest innovation behind CloudBees Unify, in the new AI Agents and Tools category of AWS Marketplace.