
feat(ai-proxy): add per-protocol request_body override and rename max_tokens mapping to llm_options #13269

Merged

nic-6443 merged 2 commits into apache:master from nic-6443:feat/ai-proxy-llm-options-and-request-body on Apr 22, 2026

Conversation

@nic-6443
Member

Description

This PR brings back the per-protocol override.request_body deep-merge capability (from the predecessor of #13251) while keeping the provider-aware max_tokens mapping — renamed to override.llm_options.

Why

Users want both override mechanisms:

  • llm_options: Simple, provider-aware max_tokens mapping (set one value, APISIX maps it to the right field per provider). Always force-overwrites.
  • request_body: Fine-grained, per-protocol deep-merge for arbitrary fields like temperature, stop_sequences, stream_options, etc.
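The provider-aware mapping can be pictured as a small lookup table. The sketch below is illustrative Python, not the plugin's Lua code, and the target field names (e.g. max_output_tokens for openai-responses) are assumptions for illustration only:

```python
# Illustrative sketch: how a single llm_options.max_tokens value could be
# mapped to a provider-specific field name. The table entries below are
# assumptions, not the plugin's actual mapping.
PROVIDER_MAX_TOKENS_FIELD = {
    "openai-chat": "max_tokens",
    "openai-responses": "max_output_tokens",
    "anthropic-messages": "max_tokens",
}

def apply_llm_options(body: dict, protocol: str, llm_options: dict) -> dict:
    """Force-overwrite the provider-specific max-tokens field."""
    if "max_tokens" in llm_options:
        field = PROVIDER_MAX_TOKENS_FIELD.get(protocol, "max_tokens")
        body[field] = llm_options["max_tokens"]  # always overwrites the client value
    return body
```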

Changes

  • Rename override.request_body → override.llm_options (always force-overwrites, no force flag)
  • Restore override.request_body as per-protocol deep-merge keyed by target protocol (openai-chat, openai-responses, etc.)
  • Add override.request_body_force_override (boolean, controls request_body deep-merge only)
  • Precedence: model_options → llm_options → request_body
  • Recreate merge.lua with force-aware deep_merge
  • Restore protocols.names() for dynamic schema enum
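The three-stage precedence can be sketched as follows. This is illustrative Python with assumed helper names (the real logic lives in the plugin's Lua code), and the deep merge is shown shallow here for brevity:

```python
# Illustrative sketch of the override precedence: model_options, then
# llm_options, then per-protocol request_body. Each later stage takes
# precedence over the earlier ones.
def apply_overrides(client_body, model_options, llm_options,
                    request_body_map, protocol, force_override):
    body = dict(client_body)
    body.update(model_options or {})   # 1) model_options, applied first
    body.update(llm_options or {})     # 2) llm_options: always force-overwrites
    # 3) request_body: per-protocol patch (a deep merge in the plugin; shown
    #    shallow here). force_override decides whether existing values are
    #    replaced or only missing keys are filled in.
    for key, val in (request_body_map or {}).get(protocol, {}).items():
        if force_override or key not in body:
            body[key] = val
    return body
```

With request_body_force_override left false, a client-supplied temperature survives while missing keys such as stop are filled in from the patch.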

Config example

{
  "override": {
    "llm_options": { "max_tokens": 500 },
    "request_body": {
      "openai-chat": { "temperature": 0.2, "stop": ["Human:"] },
      "anthropic-messages": { "max_tokens": 500 }
    },
    "request_body_force_override": false
  }
}
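The force-aware merge described above can be sketched in Python as follows. This mirrors the stated behavior of merge.lua (it is not the Lua implementation itself): recursion into nested objects always happens, and the force flag only decides whether scalar leaves overwrite existing values:

```python
# Illustrative force-aware deep merge: always recurses into nested dicts
# regardless of the force flag; force only controls leaf overwrites.
def deep_merge(base: dict, patch: dict, force: bool) -> dict:
    out = dict(base)
    for key, val in patch.items():
        if isinstance(val, dict) and isinstance(out.get(key), dict):
            out[key] = deep_merge(out[key], val, force)  # recurse either way
        elif force or key not in out:
            out[key] = val
    return out
```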

feat(ai-proxy): add per-protocol request_body override and rename max_tokens mapping to llm_options

- Rename override.request_body (max_tokens only) to override.llm_options
  which always force-overwrites the client value
- Restore per-protocol override.request_body deep-merge from PR apache#13251's
  predecessor, keyed by target protocol name (openai-chat, openai-responses,
  anthropic-messages, openai-embeddings)
- Add override.request_body_force_override for request_body deep-merge
- Precedence: model_options -> llm_options (always force) -> request_body
  (per-protocol deep merge)
- Recreate merge.lua with force-aware deep_merge that always recurses into
  nested objects regardless of force flag
- Restore protocols.names() for dynamic schema enum
- Update en/zh docs for ai-proxy and ai-proxy-multi
Copilot AI review requested due to automatic review settings April 21, 2026 08:50
@dosubot added labels size:XL (This PR changes 500-999 lines, ignoring generated files) and enhancement (New feature or request) on Apr 21, 2026

Copilot AI left a comment


Pull request overview

Reintroduces per-target-protocol override.request_body deep-merge for ai-proxy/ai-proxy-multi while renaming the provider-aware max_tokens override mechanism to override.llm_options and documenting the new precedence/semantics.

Changes:

  • Add override.llm_options (provider-aware mapping; always force-overwrites) and restore override.request_body as a per-target-protocol deep-merge map with request_body_force_override.
  • Introduce a shared deep_merge helper and apply overrides in the order: model_options → llm_options → request_body.
  • Update schemas, docs (EN/ZH), and add/adjust tests to cover the new configuration and behaviors.

Reviewed changes

Copilot reviewed 10 out of 10 changed files in this pull request and generated 3 comments.

| File | Description |
| --- | --- |
| t/plugin/ai-proxy-request-body-override.t | Expands test coverage for llm_options mapping + per-protocol request_body deep-merge semantics and precedence. |
| docs/zh/latest/plugins/ai-proxy.md | Updates ZH docs for llm_options + per-protocol request_body override semantics and precedence. |
| docs/zh/latest/plugins/ai-proxy-multi.md | Updates ZH multi-instance docs to reflect llm_options + per-protocol request_body. |
| docs/en/latest/plugins/ai-proxy.md | Updates EN docs for renamed llm_options and restored per-protocol request_body override behavior. |
| docs/en/latest/plugins/ai-proxy-multi.md | Updates EN multi-instance docs to reflect llm_options + per-protocol request_body. |
| apisix/plugins/ai-proxy/schema.lua | Extends schema with override.llm_options and protocol-keyed override.request_body validation. |
| apisix/plugins/ai-proxy/merge.lua | Adds deep_merge helper used to apply per-protocol request body patches. |
| apisix/plugins/ai-proxy/base.lua | Passes new override fields (llm_options, request_body map, force flag) into provider request building. |
| apisix/plugins/ai-providers/base.lua | Applies llm_options via provider hook (forced) and applies per-protocol request_body via deep merge. |
| apisix/plugins/ai-protocols/init.lua | Restores protocols.names() to enumerate registered protocols for schema building. |


Review comment threads: apisix/plugins/ai-proxy/merge.lua (×2), apisix/plugins/ai-protocols/init.lua
@nic-6443 nic-6443 requested a review from moonming April 22, 2026 01:48
@nic-6443 nic-6443 merged commit 70d86ce into apache:master Apr 22, 2026
33 checks passed
@nic-6443 nic-6443 deleted the feat/ai-proxy-llm-options-and-request-body branch April 22, 2026 06:16

5 participants