LLM Proxy Release Notes

These release notes reflect enhancements, changes, and bug fixes for LLM Proxy.

May 9, 2026

What’s New

  • LLM Proxy now supports Advanced Scale semantic services that use vector databases to store up to 2000 utterance vectors per prompt topic.

  • Agent Network now supports LLM Proxies as Agent Broker LLM providers.

  • LLM Proxy can now dynamically extract API keys (such as OAuth 2.0 tokens) from incoming requests to support dynamic LLM provider authentication.

  • LLM Proxies now support NVIDIA Nemotron models.

  • Anypoint Monitoring now provides new dashboards for cost management.

March 30, 2026

What’s New

LLM Proxy provides a unified access layer for multiple Large Language Model (LLM) providers. LLM Proxies are deployed to Omni Gateway to enable governance, intelligent routing, and cost management for AI applications.
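A unified access layer with intelligent routing can be sketched conceptually as a mapping from the requested model to a backing provider. The provider names and the model-to-provider table below are assumptions for illustration only, not LLM Proxy's actual routing rules.

```python
# Conceptual sketch: how a unified access layer might route one
# OpenAI-style chat request to different LLM providers based on the
# model named in the request body. The table below is hypothetical.

PROVIDER_BY_PREFIX = {
    "gpt": "openai",
    "claude": "anthropic",
    "nemotron": "nvidia",
}

def route_request(request: dict) -> str:
    """Pick a provider for a chat request based on its model field."""
    model = request.get("model", "")
    for prefix, provider in PROVIDER_BY_PREFIX.items():
        if model.startswith(prefix):
            return provider
    raise ValueError(f"No provider configured for model {model!r}")
```

Because callers send one request shape to one endpoint, governance and cost-management policies can be enforced in a single place regardless of which provider ultimately serves the request.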

To learn more, see LLM Proxy Overview.