
feat(llm): add Doubao (Volcengine Ark) as an LLM provider#12219

Open
MackDing wants to merge 1 commit into continuedev:main from MackDing:feat-llm-provider-doubao

Conversation


@MackDing MackDing commented Apr 24, 2026

What

Adds Doubao (豆包 / ByteDance) via Volcengine Ark as a first-class LLM provider in Continue, following the existing pattern used for Moonshot, Deepseek, MiniMax, zAI, and SiliconFlow.

Doubao is one of the most-used LLM families in China; Volcengine Ark (火山方舟) is its official hosted platform and exposes an OpenAI-compatible /chat/completions surface at https://ark.cn-beijing.volces.com/api/v3/. Today users have to work around the gap by using the generic OpenAI provider with apiBase and model overrides — this PR makes provider: doubao just work.

Why

  • Demand: Doubao is ByteDance's flagship LLM and is widely deployed in mainland-China-facing projects.
  • No new surface area: Ark is OpenAI-compatible, so the adapter is a ~30-line OpenAI subclass. Registration is the same shape used by Moonshot / Deepseek / zAI.
  • Closes an implicit feature gap that existing users of China-region models have been asking about (visible in several closed Qwen/Deepseek provider-error issues where users cite Doubao as the alternative).
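For context, the "thin OpenAI subclass" shape described above can be sketched as follows. `OpenAIApi` here is a minimal stand-in for the real base class in `packages/openai-adapters`, and the field names are assumptions, not the PR's exact code:

```typescript
// Minimal stand-in for the OpenAI-compatible base class; the real one
// lives in packages/openai-adapters and does much more.
interface LlmConfig {
  apiKey?: string;
  apiBase?: string;
}

class OpenAIApi {
  apiBase: string;
  apiKey?: string;
  constructor(config: LlmConfig) {
    this.apiKey = config.apiKey;
    // The base class falls back to the OpenAI endpoint when no apiBase is given.
    this.apiBase = config.apiBase ?? "https://api.openai.com/v1/";
  }
}

// The adapter swaps in the Ark default while letting an explicit
// user-supplied apiBase take precedence.
class DoubaoApi extends OpenAIApi {
  constructor(config: LlmConfig) {
    super({
      ...config,
      apiBase: config.apiBase ?? "https://ark.cn-beijing.volces.com/api/v3/",
    });
  }
}
```

Routing the default through `super()` rather than a class field initializer keeps a user-supplied `apiBase` authoritative, since subclass field initializers run after the base constructor and would overwrite it.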

Changes

| File | Purpose |
| --- | --- |
| `core/llm/llms/Doubao.ts` | New `Doubao extends OpenAI` with Ark base URL |
| `core/llm/llms/index.ts` | Register in `LLMClasses` |
| `core/llm/autodetect.ts` | Add to `PROVIDER_HANDLES_TEMPLATING` (Ark handles chat templating server-side, same as Moonshot/Deepseek) |
| `packages/openai-adapters/src/apis/Doubao.ts` | Adapter wrapping `OpenAIApi` with `ark.cn-beijing.volces.com/api/v3/` |
| `packages/openai-adapters/src/types.ts` | `DoubaoConfigSchema` in the discriminated union |
| `packages/openai-adapters/src/index.ts` | `constructLlmApi` case returning `DoubaoApi` |
| `packages/openai-adapters/src/apis/Doubao.test.ts` | 3 new tests (routing / default base URL / preserved OpenAI-chat surface) |
| `core/llm/llms/OpenAI-compatible.vitest.ts` | Doubao row in the shared subclass test matrix |
| `docs/customize/model-providers/more/doubao.mdx` | User-facing config docs |
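The registration in `core/llm/llms/index.ts` plus the `constructLlmApi` case boil down to a name-to-class map and a lookup. A hedged sketch with stand-in classes (the real registry holds every provider, not two):

```typescript
// Stand-ins for the provider classes; the real ones are the OpenAI
// subclass hierarchy in core/llm/llms.
class MockOpenAI {
  constructor(public config: { apiKey?: string; apiBase?: string }) {}
}
class MockDoubao extends MockOpenAI {}

type ProviderCtor = new (config: { apiKey?: string; apiBase?: string }) => MockOpenAI;

// The PR's registration is essentially one new entry in a map like this.
const LLMClasses: Record<string, ProviderCtor> = {
  openai: MockOpenAI,
  doubao: MockDoubao,
};

function constructLlm(provider: string, config: { apiKey?: string }): MockOpenAI {
  const Ctor = LLMClasses[provider];
  if (!Ctor) throw new Error(`Unknown provider: ${provider}`);
  return new Ctor(config);
}
```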

Diff: 9 files, +222 / −0.

Design notes

  • No default model is set. Ark addresses models by either a date-stamped model ID (e.g. doubao-seed-1-6-251015, doubao-1-5-pro-32k-250115) or a user-provisioned endpoint ID (ep-20240xxx-xxxxx). A bare alias like doubao-1-5-pro-32k would silently 404, so the adapter intentionally requires the user to pick a valid ID. The docs link to the official Ark model list so users can copy a currently valid one.
  • maxStopWords = 4 matches Ark's documented limit for OpenAI-compatible chat completions.
  • No FIM override. Ark does not expose a public beta/completions FIM protocol today. If that changes, we can override fimStream the way Moonshot and Deepseek do.
  • Not registered in packages/llm-info — consistent with Moonshot and Deepseek, which also rely solely on the LLMClasses + openai-adapters registration.
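The `maxStopWords = 4` setting amounts to clamping the stop list before a request is built; sketched below under the assumption that the adapter truncates rather than errors (the real mechanism may differ):

```typescript
// Ark's documented limit for stop words on the OpenAI-compatible
// chat-completions surface; the clamp below is an illustrative sketch.
const MAX_STOP_WORDS = 4;

function clampStopWords(stop?: string[]): string[] | undefined {
  if (!stop || stop.length <= MAX_STOP_WORDS) return stop;
  return stop.slice(0, MAX_STOP_WORDS);
}
```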

Testing

```
packages/openai-adapters$ npm test

Test Files  15 passed | 2 skipped (17)
     Tests  148 passed | 5 skipped (150)
```

3 of those 148 are new Doubao tests covering:

  1. constructLlmApi({ provider: "doubao" }) returns a DoubaoApi instance.
  2. Default apiBase is the Ark cn-beijing v3 URL.
  3. Inherits the OpenAI-chat surface (chatCompletionStream / chatCompletionNonStream).

TypeScript check: tsc --noEmit on packages/openai-adapters → 0 errors.


Summary by cubic

Adds Doubao (ByteDance) via Volcengine Ark as a first-class LLM provider using the OpenAI-compatible /chat/completions API. Users can now set provider: doubao to use China-region Doubao models without workarounds.

  • New Features

    • New doubao provider with default apiBase: https://ark.cn-beijing.volces.com/api/v3/.
    • Registered in LLMClasses and constructLlmApi; added DoubaoConfigSchema.
    • Autodetect updated so Doubao handles chat templating on the server.
    • Docs added for configuration; tests cover routing and API surface.
  • Migration

    • No default model. Set a valid Ark model ID (e.g. doubao-seed-1-6-251015) or endpoint ID (e.g. ep-...) in model.
    • If you previously used provider: openai with Ark apiBase, switch to provider: doubao and keep the same model and apiKey.
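That migration — keep `model` and `apiKey`, switch `provider`, drop the now-redundant `apiBase` — can be expressed as a small transform. The config shape below mirrors the fields named in this PR but is illustrative:

```typescript
interface ModelConfig {
  provider: string;
  model: string;
  apiKey?: string;
  apiBase?: string;
}

const ARK_BASE = "https://ark.cn-beijing.volces.com/api/v3/";

// Sketch: rewrite a generic-OpenAI-pointed-at-Ark config into the new
// doubao provider form; anything else passes through untouched.
function migrateToDoubao(cfg: ModelConfig): ModelConfig {
  if (cfg.provider === "openai" && cfg.apiBase === ARK_BASE) {
    const { apiBase: _dropped, ...rest } = cfg; // apiBase becomes the provider default
    return { ...rest, provider: "doubao" };
  }
  return cfg;
}
```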

Written for commit 27dd211.

Doubao is ByteDance's widely used LLM family, served via Volcengine Ark
(火山方舟). Ark exposes an OpenAI-compatible `/chat/completions`
surface at https://ark.cn-beijing.volces.com/api/v3/, so the adapter is
a thin OpenAI subclass that mirrors the existing Moonshot/Deepseek/zAI
pattern.

Registration:
- core/llm/llms/Doubao.ts — new provider subclass of OpenAI.
- core/llm/llms/index.ts — added to the LLMClasses registry.
- core/llm/autodetect.ts — listed in PROVIDER_HANDLES_TEMPLATING so
  Ark's server-side chat template is trusted (consistent with Moonshot
  and Deepseek).
- packages/openai-adapters/src/apis/Doubao.ts — OpenAI-adapter wrapper
  with the Ark base URL.
- packages/openai-adapters/src/types.ts + index.ts — new
  DoubaoConfigSchema and constructLlmApi case.

Notes:
- No default model is set. Ark requires either a date-stamped model ID
  (e.g. `doubao-seed-1-6-251015`, `doubao-1-5-pro-32k-250115`) or an
  Ark-provisioned endpoint ID (`ep-20240xxx-xxxxx`); hard-coding a
  bare alias would silently 404.
- Docs cover YAML + JSON config, endpoint-ID usage, and link to the
  Ark model list so users pick an ID that resolves today.
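Since a bare alias 404s silently, a client-side shape check could fail fast. The patterns below are inferred from the example IDs in this PR, not from Ark's official grammar:

```typescript
// Heuristic check for the two Ark ID shapes mentioned above:
// date-stamped model IDs (…-251015) and provisioned endpoint IDs (ep-…).
// Patterns are assumptions inferred from the examples, not Ark's spec.
function looksLikeArkId(model: string): boolean {
  const endpointId = /^ep-\w+/;   // e.g. ep-20240xxx-xxxxx
  const dateStamped = /-\d{6}$/;  // e.g. doubao-seed-1-6-251015
  return endpointId.test(model) || dateStamped.test(model);
}
```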

Tests:
- New packages/openai-adapters/src/apis/Doubao.test.ts covering
  constructLlmApi(doubao) routing, default apiBase, and preserved
  OpenAI-chat surface (3 tests).
- OpenAI-compatible.vitest.ts subclass matrix entry for parity with
  Moonshot/Deepseek.
- `npm test` in packages/openai-adapters: 148 passed / 5 skipped.
@MackDing MackDing requested a review from a team as a code owner April 24, 2026 11:12
@MackDing MackDing requested review from sestinj and removed request for a team April 24, 2026 11:12
@dosubot dosubot Bot added the size:L This PR changes 100-499 lines, ignoring generated files. label Apr 24, 2026

github-actions Bot commented Apr 24, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.


@cubic-dev-ai cubic-dev-ai Bot left a comment


2 issues found across 9 files



<file name="docs/customize/model-providers/more/doubao.mdx">

<violation number="1" location="docs/customize/model-providers/more/doubao.mdx:15">
P2: Configuration instructions are internally contradictory about using model aliases versus date-stamped model IDs/endpoint IDs, which can lead to invalid setup.</violation>
</file>

<file name="packages/openai-adapters/src/apis/Doubao.ts">

<violation number="1" location="packages/openai-adapters/src/apis/Doubao.ts:19">
P2: Subclass field initialization overwrites `config.apiBase` set by `OpenAIApi`, breaking custom base URL configuration.</violation>
</file>


From `docs/customize/model-providers/more/doubao.mdx`, the passage under review:

> To use Doubao models:
>
> 1. Create an API key on the [Volcengine Ark console](https://console.volcengine.com/ark/).
> 2. Either deploy the model you want as an endpoint and copy its endpoint ID, or use a public model alias.

@cubic-dev-ai cubic-dev-ai Bot Apr 24, 2026


P2: Configuration instructions are internally contradictory about using model aliases versus date-stamped model IDs/endpoint IDs, which can lead to invalid setup.


From `packages/openai-adapters/src/apis/Doubao.ts`, the lines under review:

```ts
 * Reference: https://www.volcengine.com/docs/82379
 */
export class DoubaoApi extends OpenAIApi {
  apiBase: string = "https://ark.cn-beijing.volces.com/api/v3/";
```

@cubic-dev-ai cubic-dev-ai Bot Apr 24, 2026


P2: Subclass field initialization overwrites config.apiBase set by OpenAIApi, breaking custom base URL configuration.

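This second finding reflects a genuine TypeScript pitfall: subclass field initializers run after `super()` returns, so a field default silently overwrites whatever the base constructor assigned from config. A minimal reproduction with stand-in classes (not the PR's actual code):

```typescript
class Base {
  apiBase: string;
  constructor(config: { apiBase?: string }) {
    this.apiBase = config.apiBase ?? "https://api.openai.com/v1/";
  }
}

// Buggy shape from the review: the field initializer executes after
// super(), so a custom apiBase passed via config is lost.
class BuggySub extends Base {
  apiBase: string = "https://ark.cn-beijing.volces.com/api/v3/";
}

// Fixed shape: feed the default through super() so config wins.
class FixedSub extends Base {
  constructor(config: { apiBase?: string }) {
    super({
      apiBase: config.apiBase ?? "https://ark.cn-beijing.volces.com/api/v3/",
    });
  }
}
```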

@MackDing

I have read the CLA document and I hereby sign the CLA

