LLM Proxy Policies
By default, LLM Proxy applies these policies:
- Client ID Enforcement
- LLM Proxy Core Policy
- Model Based Routing Policy or Semantic Routing Policy (the policy name depends on the embedded service provider)
You don’t need to modify these policies.
LLM Proxy supports most included policies, but doesn’t support outbound policies.
Note: LLM Proxy doesn’t support the Rate Limiting: SLA-Based Policy.
These policies are specific to and useful for LLM Proxies:
Apply Policies to LLM Proxies
1. From API Manager, click LLM Proxies.
2. Click the name of the LLM Proxy you want to apply a policy to.
3. Click AI Policies.
4. Click + Add inbound policy.
5. Select the policy to apply.
6. Configure the required parameters.
   For policy configuration parameters, see Inbound Policies Directory.
7. If necessary, configure Advanced options.
8. Click Apply.
LLM Proxy Authentication Policy
By default, the LLM Proxy has the Client ID Enforcement policy applied.
This is required because the Client ID Enforcement policy populates the Authentication.clientName variable in the Authentication object, which LLM Metrics uses as a unique identifier.
To remove the Client ID Enforcement policy, ensure that you either:
- Apply a policy that populates Authentication.clientName:
  - OAuth 2.0 Token Introspection Policy (if Client ID enforcement is configured, skipClientIdValidation=false)
  - OpenID Connect OAuth 2.0 Token Enforcement Policy (if Client ID enforcement is configured, skipClientIdValidation=false)
  - JWT Validation Policy (if Client ID enforcement is configured, skipClientIdValidation=false)
  - A custom policy that populates Authentication.clientName
- Edit the DataWeave variable in LLM Proxy Core to extract a different unique identifier, such as clientid, userid, or departmentid. You can’t filter by this unique identifier in Usage Reports.
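To illustrate the second option, a DataWeave expression that selects the identifier might be edited along these lines. This is a hypothetical sketch: the actual variable name, the structure of the LLM Proxy Core configuration, and the `authentication.properties.userid` path are assumptions, and the available properties depend on the authentication policy you apply.

```dataweave
%dw 2.0
output application/java
---
// Default behavior: use the client name populated by the
// Client ID Enforcement (or equivalent) policy
// authentication.clientName

// Hypothetical alternative: extract a different unique identifier,
// such as a userid property, falling back to a literal when absent
authentication.properties.userid default "unknown"
```

Keep in mind that any identifier extracted this way still appears in LLM Metrics, but, as noted above, you can’t filter by it in Usage Reports.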



