F5 BIG-IP v21.0 accelerates enterprise AI initiatives
F5 introduced BIG-IP v21.0, giving customers a unified approach to app delivery, security, and scale in the AI era.
This major release extends the F5 Application Delivery and Security Platform (ADSP) with a purpose-built delivery engine for application workloads, reducing operational friction, accelerating data movement, and improving performance and resiliency across hybrid and multicloud environments. BIG-IP v21.0 is now generally available, and F5 encourages customers to begin planning migrations to take advantage of its AI-ready platform improvements.
AI-driven applications have fundamentally different delivery requirements. Data must move with much higher throughput—across storage, compute, clusters, and clouds—while maintaining security and low latency. However, most enterprises are still managing fragmented delivery stacks and sprawling infrastructures that were never designed for AI data volumes. According to F5’s State of Application Strategy Report 2025, 94% of organizations deploy apps across multiple environments, which means complexity isn’t an edge case—it’s the norm.
“BIG-IP is now built for the AI era,” said Kunal Anand, CPO at F5. “We deliver the hard part of AI at scale: moving data to and from models with speed, integrity, and control. And we do it with the full BIG-IP stack, from delivery to security, while boosting performance and throughput. Customers can run their most critical AI workloads with confidence.”
BIG-IP v21.0 introduces specific product-level capabilities that directly advance AI data delivery, security, and scale:
AI data delivery enhancements: Optimize performance and simplify configuration with new S3 data storage integrations. Use cases include secure ingestion for fine-tuning and batch inference, high-throughput retrieval for RAG and embeddings generation, policy-driven model artifact distribution with observability, and controlled egress with consistent security and compliance.
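One protocol-level detail behind S3 integrations of this kind (general to S3, and not drawn from F5 documentation) is that S3 traffic uses two standard addressing styles, and any delivery layer sitting in front of object storage must recognize both when routing or rewriting requests. A minimal sketch, with hypothetical bucket and key names:

```python
# Sketch: the two standard S3 addressing styles a delivery proxy in front
# of object storage typically has to recognize. Illustrative only; the
# bucket, region, and key names are hypothetical examples.

def virtual_hosted_url(bucket: str, region: str, key: str) -> str:
    # Virtual-hosted style: the bucket name is part of the hostname,
    # so an L7 proxy can route on the Host header alone.
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

def path_style_url(bucket: str, region: str, key: str) -> str:
    # Path style: the bucket is the first path segment, so routing
    # requires inspecting the request path.
    return f"https://s3.{region}.amazonaws.com/{bucket}/{key}"

print(virtual_hosted_url("model-artifacts", "us-east-1", "llama/weights.bin"))
print(path_style_url("model-artifacts", "us-east-1", "llama/weights.bin"))
```

Routing on the Host header (virtual-hosted style) is generally cheaper for a proxy than parsing paths, which is one reason addressing style matters for the high-throughput retrieval use cases listed above.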
Model Context Protocol (MCP) support for AI traffic: Accelerate and scale AI workloads with support for MCP that enables seamless communication between AI models, applications, and data sources. This enhances performance, secures connections, and streamlines deployment for AI workloads.
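For readers unfamiliar with MCP: it is an open protocol layered on JSON-RPC 2.0, in which clients and servers exchange structured messages beginning with an initialize handshake. The sketch below follows the public MCP specification rather than any BIG-IP implementation; the client name and protocol version string are illustrative.

```python
import json

def mcp_initialize_request(request_id: int) -> str:
    """Build an MCP initialize request as a JSON-RPC 2.0 message.

    Field names follow the public MCP specification; the client name,
    version, and protocol revision shown here are example values.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # example spec revision date
            "capabilities": {},               # client advertises features here
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(message)

print(mcp_initialize_request(1))
```

Because these messages are plain JSON-RPC over standard transports, they are the kind of AI traffic an application delivery tier can inspect, secure, and load-balance like any other structured API traffic.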
Modernized control plane performance: Boost reliability and accelerate configuration changes across large hybrid portfolios. Control plane improvements in BIG-IP v21.0 enable higher throughput and faster operations, which is critical for high-scale AI deployments where configuration velocity matters.
“AI-driven workloads are evolving and accelerating demand for secure, high-performance application delivery across hybrid and multicloud environments,” said Paul Nicholson, Research Vice President, Cloud and Datacenter Networks, IDC. “F5’s BIG-IP v21.0 accelerates enterprise AI initiatives by optimizing data delivery with S3 profiles, adding Model Context Protocol support, and introducing system and security enhancements for more efficient operations.”
BIG-IP v21.0 enables Platform Ops, NetOps, and cloud teams to standardize on one delivery layer across environments, while giving enterprises the performance, control, and observability needed to scale AI with confidence.
Additionally, F5 has invested in security upgrades to make BIG-IP v21.0 the company’s most secure release yet. Maintaining exceptional software quality with a focus on security will continue as F5’s highest priority in this and subsequent BIG-IP releases. BIG-IP v21.0 will also add support for the recently announced CrowdStrike Falcon integration by the end of the quarter.