Kong, a cloud connectivity company, has announced the general availability of Kong AI Gateway, an AI-native API gateway designed to govern and secure generative AI workloads across any cloud environment. The product offers infrastructure capabilities tailored for AI, including support for multiple large language models, semantic caching, semantic routing, semantic firewalling, and model lifecycle management. The gateway provides a unified API for managing and securing multiple AI technologies across applications, enabling enterprises to deploy and scale their generative AI initiatives. According to Marco Palladino, CTO and co-founder of Kong, the Kong AI Gateway is “probably the most capable AI infrastructure technology in the world,” owing to its ability to introspect AI traffic and provide a unified API for consuming one or more AI providers.
The Kong AI Gateway serves as a central hub offering AI-specific capabilities spanning governance, observability, and security. These include prompt security, compliance, templating, and lifecycle management for AI prompts, as well as “L7 AI observability metrics” that give visibility into provider performance, token usage, and costs across AI traffic. Through Kong Konnect, the company’s unified control plane, organizations can also monetize their fine-tuned AI models alongside traditional APIs. The launch of Kong AI Gateway comes amid skyrocketing interest in generative AI, and the company’s prescient bet on AI-native infrastructure positions it strongly to enable the next wave of generative AI adoption in the enterprise.
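The kind of L7 metric described here can be pictured as simple per-provider aggregation over request logs that the gateway observes. The sketch below is a minimal illustration, not Kong's telemetry pipeline; the pricing table, field names, and provider labels are all hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical per-provider pricing (USD per 1K tokens); real rates vary.
PRICE_PER_1K = {"openai": 0.002, "anthropic": 0.003}

@dataclass
class AIRequestLog:
    """One AI request as seen by the gateway at layer 7."""
    provider: str
    prompt_tokens: int
    completion_tokens: int

def summarize(logs: list[AIRequestLog]) -> dict:
    """Aggregate token usage and estimated cost per provider — the kind
    of visibility a gateway can offer by introspecting AI traffic."""
    totals: dict = defaultdict(lambda: {"tokens": 0, "cost": 0.0})
    for log in logs:
        tokens = log.prompt_tokens + log.completion_tokens
        totals[log.provider]["tokens"] += tokens
        totals[log.provider]["cost"] += tokens / 1000 * PRICE_PER_1K[log.provider]
    return dict(totals)
```

Because the gateway already terminates every AI request, it can emit these aggregates without any instrumentation inside the applications themselves.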