feat: add Ling support with system prompt and inference params#9139

Merged
5 commits merged into Kilo-Org:main from tangxinyao:feat/ling
Apr 21, 2026
Conversation

@tangxinyao
Contributor

Context

Add support for the "Ling" model. Similar to existing Chinese LLMs like Kimi, Qwen, and MiniMax, the Ling model requires a dedicated system prompt and tuned inference parameters for better code generation performance.

Implementation

Changes span four areas:

  1. **New dedicated system prompt**
     Added `packages/opencode/src/session/prompt/ling.txt`, based on the standard Kilo system prompt, with full tool-usage guidelines, code style requirements, and task execution policies.

  2. **System prompt routing**
     In `session/system.ts`:
     • Added a `case "ling"` branch so models with `prompt: "ling"` use the Ling-specific prompt.
     • Added auto-detection logic: models whose `model.api.id` contains `"ling"` are automatically routed to `PROMPT_LING`.
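In rough outline, the two routing rules above might look like the sketch below. The constant names, the `Model` shape, and the prompt contents are illustrative assumptions, not code taken from this PR:

```typescript
// Sketch of the prompt routing described above (simplified; PROMPT_LING,
// PROMPT_DEFAULT, and the Model shape are assumptions for illustration).
const PROMPT_LING = "<contents of ling.txt>"
const PROMPT_DEFAULT = "<standard Kilo system prompt>"

interface Model {
  prompt?: string // explicit prompt selector, e.g. "ling"
  api: { id: string } // provider-side model id, e.g. "ling-flash"
}

function systemPrompt(model: Model): string {
  // 1. Explicit routing: models configured with prompt: "ling"
  switch (model.prompt) {
    case "ling":
      return PROMPT_LING
  }
  // 2. Auto-detection: any model whose api id contains "ling"
  if (model.api.id.includes("ling")) return PROMPT_LING
  return PROMPT_DEFAULT
}
```

Explicit configuration takes precedence, with the substring check as a fallback, so a model can opt in by name or by id.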
  3. **Inference parameter tuning**
     In `provider/transform.ts`, three parameters are configured for Ling models:
     • `temperature: 0.3` (low temperature for more deterministic code output)
     • `topP: 0.95`
     • `topK: 20`
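The tuning step could be sketched as follows. The function name and the treatment of the values as overridable defaults are assumptions; the PR only states the three values:

```typescript
// Sketch of the Ling parameter tuning in provider/transform.ts (simplified;
// applyLingTuning and the defaults-vs-overrides semantics are assumptions).
interface InferenceParams {
  temperature?: number
  topP?: number
  topK?: number
}

function applyLingTuning(modelId: string, params: InferenceParams): InferenceParams {
  if (!modelId.includes("ling")) return params
  return {
    temperature: params.temperature ?? 0.3, // low temperature for deterministic code
    topP: params.topP ?? 0.95,
    topK: params.topK ?? 20,
  }
}
```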
  4. **Type system sync**
  • `kilo-gateway/src/api/constants.ts`: added `"ling"` to the `PROMPTS` constant.
  • `sdk/js/src/v2/gen/types.gen.ts`: added `"ling"` to the `Model.prompt` union type to keep the SDK in sync with the backend.
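At the type level, the sync amounts to extending a string union. The exact shapes of `PROMPTS` and `Model` below are assumed for illustration; only the addition of `"ling"` comes from the PR:

```typescript
// Sketch of the type-level sync: "ling" joins the prompt union in both the
// gateway constants and the generated SDK types (shapes assumed here).
const PROMPTS = ["kimi", "qwen", "minimax", "ling"] as const
type Prompt = (typeof PROMPTS)[number] // "kimi" | "qwen" | "minimax" | "ling"

interface Model {
  prompt?: Prompt
}

// Compiles only because "ling" is now a member of the union:
const model: Model = { prompt: "ling" }
```

Keeping the generated SDK union in step with the gateway constant means a stale SDK would reject `prompt: "ling"` at compile time rather than failing silently at runtime.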

Screenshots

| Before | After |
| --- | --- |
| Ling models fall back to the default prompt with no dedicated inference parameters | Ling models automatically use the dedicated system prompt with tuned temperature/topP/topK |

How to Test

  1. Add a model whose id contains "ling" (e.g. `ling-flash`) to the model config.
  2. Start a conversation and verify the system prompt matches the content of `ling.txt` (via logs or debug mode).
  3. Alternatively, set `prompt: "ling"` on a model in the gateway config and confirm it routes to the Ling prompt.
  4. Add a log in `transform.ts` to verify `temperature` returns 0.3, `topP` returns 0.95, and `topK` returns 20 for Ling models.

Get in Touch

email: sha7tang@gmail.com

Comment thread packages/opencode/src/provider/transform.ts Outdated
@kilo-code-bot
Contributor

kilo-code-bot Bot commented Apr 17, 2026

Code Review Summary

Status: 1 Issues Found | Recommendation: Address before merge

Overview

| Severity | Count |
| --- | --- |
| CRITICAL | 0 |
| WARNING | 1 |
| SUGGESTION | 0 |

Issue Details

WARNING

| File | Line | Issue |
| --- | --- | --- |
| `packages/opencode/src/provider/transform.ts` | 334 | Ling tuning checks `model.id`, so aliased config models route to the Ling prompt but skip the Ling temperature/topP/topK overrides. |
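To make the flagged inconsistency concrete, the sketch below shows how an aliased model can pass one check and fail the other. The field names and the alias example are assumptions for illustration:

```typescript
// Sketch of the warning above: the prompt router keys off model.api.id while
// the parameter tuning keys off model.id (field names assumed here).
interface Model {
  id: string // user-facing config id (may be an alias)
  api: { id: string } // provider-side model id
}

// A user aliases the provider model "ling-flash" under a custom config id:
const aliased: Model = { id: "fast-coder", api: { id: "ling-flash" } }

const getsLingPrompt = aliased.api.id.includes("ling") // system.ts check: true
const getsLingParams = aliased.id.includes("ling") // transform.ts check: false
```

Such a model would receive the Ling system prompt but keep default sampling parameters; checking `model.api.id` in both places would close the gap.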
Other Observations (not in diff)

Issues found in unchanged code that cannot receive inline comments:

None: no additional issues found outside the diff.
Files Reviewed (5 files)
  • packages/kilo-gateway/src/api/constants.ts - 0 issues
  • packages/opencode/src/provider/transform.ts - 1 issue
  • packages/opencode/src/session/prompt/ling.txt - 0 issues
  • packages/opencode/src/session/system.ts - 0 issues
  • packages/sdk/js/src/v2/gen/types.gen.ts - 0 issues

Reviewed by gpt-5.4-2026-03-05 · 752,768 tokens

@kilo-code-bot
Contributor

kilo-code-bot Bot commented Apr 21, 2026

The kilocode_change annotations required by CI have been added in #9296, which supersedes this PR (since pushing to a fork branch was not possible with the available token). Please close this PR in favour of #9296.

@chrarnoldus chrarnoldus self-assigned this Apr 21, 2026
@chrarnoldus chrarnoldus closed this pull request by merging all changes into Kilo-Org:main in 8ac2231 Apr 21, 2026
