Kong, a leading API company, has introduced an open source AI Gateway, an extension of its existing API gateway. The new tool lets developers and operations teams integrate applications with multiple large language models (LLMs) through a single, unified API. Alongside it, Kong is rolling out several AI-specific features, including prompt engineering, credential management, and more.
"We view AI as a valuable use case for APIs," explained Kong co-founder and CTO Marco Palladino. "APIs evolve with diverse use cases—mobile apps, microservices, and now AI. As we explore AI, we're analyzing APIs across the spectrum, from AI consumption to fine-tuning and even the AI systems themselves. Increased AI utilization will inevitably drive greater API consumption globally."
Palladino shared that while organizations are eager to leverage AI, many harbor concerns about data security. He anticipates that companies will eventually prefer running their models locally, utilizing cloud options as backups. For now, they must also address challenges such as credential management for accessing cloud-based models, controlling and logging traffic, and managing usage quotas.
"Our API gateway was designed to enhance developer productivity in building AI applications, allowing them to utilize multiple LLM providers without altering their existing code," he noted. Currently, the gateway supports major models from Anthropic, Azure, Cohere, Meta’s LLaMA, Mistral, and OpenAI.
Kong’s team argues that most API providers treat AI APIs like any other API. By adding AI-specific features, Kong aims to enable new use cases and simplify existing ones. With its AI request and response transformers, developers can modify prompts and results on the fly, enabling functionality such as automatic translation or the removal of personally identifiable information.
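As a rough illustration of the idea behind a request transformer, the sketch below scrubs obvious personally identifiable information from a prompt before it would be forwarded to a model. This is a conceptual example only; Kong's actual transformers are configured in the gateway itself, and the patterns here are assumptions.

```python
import re

# Naive PII patterns, purely for illustration; a real deployment would use
# a far more robust detection step configured at the gateway.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def scrub_pii(prompt: str) -> str:
    """Replace obvious personally identifiable information with placeholders
    before the prompt leaves the organization's boundary."""
    prompt = EMAIL_RE.sub("[EMAIL]", prompt)
    prompt = PHONE_RE.sub("[PHONE]", prompt)
    return prompt

assert scrub_pii("Contact jane@example.com or 555-123-4567") == "Contact [EMAIL] or [PHONE]"
```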
The gateway also includes prompt engineering capabilities that let businesses enforce their usage guidelines centrally, giving teams a single place to manage prompts and the rules that govern them.
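The sketch below shows what central prompt governance might amount to in practice: every request gets the organization's system prompt prepended, and prompts that hit a blocklist are rejected before reaching any model. The prompt text, blocklist, and function names are assumptions for the example, not Kong's own implementation, which lives in gateway configuration rather than application code.

```python
# Hypothetical, centrally managed guidelines (illustrative values only).
BLOCKED_TOPICS = ("credit card number", "password dump")

SYSTEM_PROMPT = (
    "You are an internal assistant. Do not reveal customer data "
    "and always answer in English."
)

def build_messages(user_prompt: str) -> list[dict]:
    """Apply the organization's prompt policy: reject disallowed requests
    and prepend the governed system prompt to everything else."""
    lowered = user_prompt.lower()
    if any(topic in lowered for topic in BLOCKED_TOPICS):
        raise ValueError("Prompt rejected by usage guidelines")
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]
```

Because this logic sits in one place, updating a guideline changes behavior for every application behind the gateway at once.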
Nearly nine years have passed since the launch of the Kong API Management Platform. At its inception, the organization operated under the name Mashape. In a recent interview, co-founder and CEO Augusto Marietti reflected on their journey, stating, “Mashape was struggling, but Kong has now become the leading API product on GitHub.” Marietti revealed that Kong was cash-flow positive in the last quarter and is not currently pursuing funding, marking a successful transformation.
Currently, the Kong Gateway serves as the backbone of the company’s platform and powers the new AI Gateway. Existing Kong users can easily upgrade their installations to access the latest AI features.
For now, these new AI capabilities are available at no cost. In the future, Kong plans to introduce premium paid features, although the focus for this release is not on monetization.