News

Kong announces new open source AI gateway with multi-LLM support

Kong, a developer of cloud API technologies, has announced a suite of open-source AI plugins for Kong Gateway 3.6 that can turn any Kong Gateway deployment into an AI Gateway with support for integrating multiple Large Language Models (LLMs).

By upgrading to Kong Gateway 3.6, available today, users gain access to a suite of six new plugins focused entirely on AI and LLM usage. This enables developers who want to integrate one or more LLMs into their products to be more productive and ship AI capabilities faster, while offering architects and platform teams a secure solution that ensures visibility, control and compliance for every AI request their teams send.

“By open-sourcing this suite of innovative AI capabilities, including no-code AI plugins, we’re removing the barriers to AI adoption and making it possible for developers to leverage multiple LLMs effortlessly and ship AI powered applications faster. At the same time, we’re providing governance and visibility to all the AI traffic that is being generated by an organisation,” said Marco Palladino, Chief Technology Officer and Co-Founder, Kong Inc.

This new suite of open source plugins delivers a range of new capabilities, including:

  • Multi-LLM Integration: Kong Inc.’s “ai-proxy” plugin enables seamless integration of multiple LLM implementations, offering native support for industry leaders including OpenAI, Azure AI, Cohere, Anthropic, Mistral, and Llama. The standardised interface allows developers to switch between LLMs without modifying application code, facilitating the use of diverse models and rapid prototyping (see the sketch after this list).
  • Central AI Credential Management: The “ai-proxy” plugin helps ensure secure, centralised storage of AI credentials within Kong Gateway. This design removes the need to store credentials in applications, streamlining credential rotation and updates directly from the gateway.
  • Layer 7 AI Metrics Collection: Leveraging the “ai-proxy” plugin, users can now capture detailed Layer 7 AI analytics, including metrics such as request and response token counts, along with usage data for LLM providers and models. Integration with third-party platforms such as Datadog and New Relic, as well as existing Kong Gateway logging plugins such as TCP, Syslog, and Prometheus, is supported, enriching observability and offering insights into developer preferences.
  • No-Code AI Integrations: With the “ai-request-transformer” and “ai-response-transformer” plugins, AI capabilities are injected into API requests and responses without a single line of code. This allows for on-the-fly transformations like real-time API response translations for internationalisation, enriching and converting API traffic effortlessly.
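
For illustration, below is a minimal, hypothetical sketch of how an application might send a chat request through a Kong Gateway route that has the “ai-proxy” plugin attached. The route path (“/ai-chat”), the local proxy address, and the OpenAI-style request body are assumptions rather than details from the announcement; the exact payload format and the upstream LLM provider are determined by the plugin’s configuration, and the provider API key lives in the gateway rather than in the client.

```python
# Hypothetical example: an application calls an LLM through Kong Gateway.
# Assumptions (not from the announcement): a route at /ai-chat with the
# "ai-proxy" plugin attached, Kong's proxy listening on localhost:8000,
# and an OpenAI-style chat payload. The provider API key is configured in
# the gateway, so the client sends no credentials.
import requests

KONG_PROXY = "http://localhost:8000"  # assumed local Kong proxy listener

resp = requests.post(
    f"{KONG_PROXY}/ai-chat",  # hypothetical route with ai-proxy enabled
    json={
        "messages": [
            {"role": "user", "content": "Summarise today's release notes in one sentence."}
        ]
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # response shape depends on the configured LLM provider
```

Because the gateway holds the provider credentials and presents a standardised interface, swapping the upstream model (for example, from OpenAI to Mistral) would be a change to the plugin configuration rather than to client code like this.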