Cloudflare, the connectivity cloud company, has officially launched its Workers AI platform, making it easier for developers worldwide to build and deploy AI applications. This platform allows developers to run machine learning models on Cloudflare’s global network, providing a serverless solution for GPU-accelerated AI inference. With GPUs operational in over 150 data center locations and plans to expand globally, Cloudflare is democratizing AI by offering low-latency inference capabilities.
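To illustrate what this looks like in practice, the snippet below sketches a call to a hosted text-generation model through the Workers AI REST API. The account ID, API token, and model slug are placeholders; the Workers AI model catalog should be consulted for currently available models, and the `result.response` field reflects the documented response shape at the time of writing.

```python
# Minimal sketch: invoking a Workers AI model over Cloudflare's REST API.
# ACCOUNT_ID, API_TOKEN, and the model slug are placeholders.
import os
import requests

ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]   # your Cloudflare account ID
API_TOKEN = os.environ["CF_API_TOKEN"]     # an API token with Workers AI access

MODEL = "@cf/meta/llama-2-7b-chat-int8"    # example model slug

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/run/{MODEL}",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "What is serverless inference?"},
    ]},
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["result"]["response"])
```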
Cloudflare has partnered with Hugging Face to provide a curated list of popular open-source models optimized for serverless GPU inference. Developers can choose a model and deploy it with a single click, instantly distributing it across Cloudflare’s global network. The platform supports tasks such as text generation, embeddings, and sentence similarity, allowing developers to create domain-specific chatbots that can access contextual data. The platform also lends itself to advanced techniques such as retrieval-augmented generation (RAG), which grounds a model’s answers in retrieved context.
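As a rough sketch of the RAG pattern, the example below embeds a toy document set with a Workers AI embedding model, retrieves the closest match by cosine similarity, and grounds the generation in it. The in-memory “index” is a stand-in for a real vector database (such as Cloudflare Vectorize), and the model slugs are illustrative.

```python
# Minimal RAG sketch over the Workers AI REST API.
import os
import requests

ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]
API_TOKEN = os.environ["CF_API_TOKEN"]
BASE = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/run"
HEADERS = {"Authorization": f"Bearer {API_TOKEN}"}

def embed(texts):
    """Embed a batch of strings with an example embedding model."""
    r = requests.post(f"{BASE}/@cf/baai/bge-base-en-v1.5",
                      headers=HEADERS, json={"text": texts}, timeout=30)
    r.raise_for_status()
    return r.json()["result"]["data"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

# Toy document set standing in for a real vector index.
docs = ["Refunds are processed within 5 business days.",
        "Support is available 24/7 via chat."]
doc_vectors = embed(docs)

question = "How long do refunds take?"
qv = embed([question])[0]

# Retrieve the most similar document, then ground the generation in it.
context = max(zip(docs, doc_vectors), key=lambda d: cosine(qv, d[1]))[0]
r = requests.post(f"{BASE}/@cf/meta/llama-2-7b-chat-int8", headers=HEADERS,
                  json={"messages": [
                      {"role": "system",
                       "content": f"Answer using this context: {context}"},
                      {"role": "user", "content": question}]},
                  timeout=30)
print(r.json()["result"]["response"])
```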
Workers AI’s serverless nature enables developers to pay only for the resources they consume, without managing GPUs or infrastructure, making AI inference more affordable and accessible. Performance and reliability enhancements have been made, including upgraded load-balancing capabilities and increased rate limits for language models, improving scalability and responsiveness. Cloudflare also introduced Bring Your Own LoRA (Low-Rank Adaptation) for fine-tuned inference, which lets developers apply small, task-specific adapter weights on top of a frozen base model instead of retraining the model itself.
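A hedged sketch of what fine-tuned inference with a custom adapter might look like follows; it assumes an adapter named `my-adapter` has already been uploaded to the account, and the `lora` and `raw` request fields follow the pattern documented for LoRA-capable base models.

```python
# Hedged sketch of fine-tuned inference with a custom LoRA adapter.
# "my-adapter" is a placeholder for an adapter previously uploaded
# to the account; the model slug is an example LoRA-capable base model.
import os
import requests

ACCOUNT_ID = os.environ["CF_ACCOUNT_ID"]
API_TOKEN = os.environ["CF_API_TOKEN"]

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}"
    "/ai/run/@cf/mistral/mistral-7b-instruct-v0.2-lora",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "prompt": "[INST] Summarize our Q3 results. [/INST]",
        "raw": True,           # skip default prompt templating; we supply our own
        "lora": "my-adapter",  # name of the uploaded LoRA adapter (placeholder)
    },
    timeout=30,
)
print(resp.json()["result"]["response"])
```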
In addition to Workers AI, Cloudflare introduced AI Gateway, a control plane for managing and governing the usage of AI models across an organization. The gateway simplifies integration with AI providers like OpenAI and Hugging Face, offering observability, analytics, and monitoring for enhanced security and governance. Support for custom LoRA weights and adapters also enables efficient multi-tenant model hosting, with expanded fine-tuning capabilities planned.
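Because AI Gateway sits in front of the provider’s API, adopting it can be as simple as swapping the base URL, as in this sketch with the OpenAI Python SDK; `{account_id}` and `{gateway_slug}` are placeholders for values from the Cloudflare dashboard.

```python
# Sketch: routing OpenAI traffic through Cloudflare's AI Gateway by
# swapping the base URL. Requests then appear in the gateway's
# analytics and logs.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",  # your OpenAI key, unchanged
    base_url="https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_slug}/openai",
)

chat = client.chat.completions.create(
    model="gpt-4o-mini",  # example model
    messages=[{"role": "user", "content": "Hello through the gateway!"}],
)
print(chat.choices[0].message.content)
```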
Cloudflare recently added Python support to Workers, its serverless platform for deploying web functions. By expanding its edge network capabilities, Cloudflare is challenging AWS with GPU-based model inference, updated load balancers, and an AI Gateway that streamlines integration with various providers. With GPU availability expanding across its points of presence, developers can access state-of-the-art AI models with low latency and at a competitive price/performance ratio, reshaping the landscape of AI deployment.
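On the Python side, a minimal Worker in the new Pyodide-based syntax looks roughly like the following; the `on_fetch` entry point and `js.Response` interop follow the syntax shown in Cloudflare’s Python Workers announcement.

```python
# Minimal Python Worker, per Cloudflare's Pyodide-based Workers support.
from js import Response

async def on_fetch(request, env):
    # Handle an incoming HTTP request at the edge.
    return Response.new("Hello from a Python Worker!")
```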