Netskope NewEdge AI Fast Path reduces latency for enterprise AI workloads

Netskope has announced NewEdge AI Fast Path, a set of capabilities designed to optimize network paths to critical AI destinations, including applications hosted in public, private, or neo-cloud environments. The offering reduces latency and costs, improves performance and resilience, and delivers a secure experience for teams using AI applications or enterprises adopting agentic AI.

Eliminating the “security vs. speed” dilemma

The gap between AI expectations and reality is widening: a recent survey revealed that only 18% of infrastructure and operations (I&O) leaders feel confident that their current team and budget can meet the intensifying performance, resilience, and security demands of AI. Enterprises want to increase their use of AI but are often forced into trade-offs between security and user experience due to outdated security tools or an over-reliance on inadequate network infrastructure.

These trade-offs may have significant consequences: enterprises stall their AI adoption over heightened security concerns, exempt AI traffic from inspection, or find their users working around critical security controls to avoid performance degradation.

By leveraging Netskope NewEdge, the private cloud that underpins the Netskope One platform for security, networking, analytics, and AI services, customers can avoid these trade-offs. AI Fast Path, a set of capabilities within NewEdge, improves performance and efficiency for AI applications.

Specific benefits include:

  • Faster inference results for enterprise users from prompt to response, minimizing “time-to-first-token” (TTFT) for conversational AI.
  • Agentic AI optimization by accelerating complex, multi-prompt agentic workflows with the high-speed processing required for rapid, iterative AI subtasks.
  • Optimization of LLM performance when accessing large volumes of distributed data (for example, via Model Context Protocol gateways).
  • Support for retrieval-augmented generation (RAG) by accelerating the connectivity between LLMs and external data sources for higher quality, real-time outputs.
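To make the “time-to-first-token” metric from the list above concrete, here is a minimal, generic sketch of how TTFT can be measured against any streaming LLM response. The `measure_ttft` helper and `fake_stream` simulator are illustrative names invented for this example, not part of any Netskope product or API; the fake stream simply stands in for a real streaming client.

```python
import time
from typing import Iterable, Tuple

def measure_ttft(stream: Iterable[str]) -> Tuple[float, str]:
    """Return (seconds until the first token arrives, full response text).

    `stream` is any iterable yielding response chunks, e.g. a streaming
    LLM client. TTFT is the gap between issuing the request (calling
    this function) and receiving the first chunk.
    """
    start = time.perf_counter()
    ttft = None
    chunks = []
    for chunk in stream:
        if ttft is None:
            # First chunk observed: record time-to-first-token.
            ttft = time.perf_counter() - start
        chunks.append(chunk)
    if ttft is None:
        raise ValueError("stream produced no tokens")
    return ttft, "".join(chunks)

def fake_stream(first_token_delay: float):
    """Simulated streaming response whose first token is delayed
    by a stand-in for network + inference latency."""
    time.sleep(first_token_delay)
    yield "Hello"
    yield ", world"

ttft, text = measure_ttft(fake_stream(0.05))
print(f"TTFT: {ttft * 1000:.0f} ms, response: {text!r}")
```

Lowering the first-token delay on the network path, which is what path optimization targets, directly shrinks the measured TTFT for conversational use.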

“With organizations moving at AI speed, any trade-off between security and performance is unacceptable, and also unnecessary,” said Joe DePalo, Chief Platform Officer, Netskope. “Netskope is recognized as a market leader for how we combine security strengths with network performance at the level of, and in some cases even improved over, direct-to-net for virtually any user on the path. That includes optimizing the user experience for business-critical AI. We’re pleased to help customers meet their need for AI speed without adding unnecessary risks.”
