
fix(validation): fall back to completions probe for custom providers returning 401/403 on /models #894

Open
octo-patch wants to merge 1 commit into ValueCell-ai:main from octo-patch:fix/custom-provider-auth-probe-fallback-882

Conversation

@octo-patch
Contributor

Fixes #882

Problem

Some custom AI provider implementations (e.g. opencode-go) return 401 or 403 on the GET /models endpoint even when the API key is valid, because they don't expose the standard OpenAI model listing endpoint. Previously, ClawX would immediately interpret these responses as "Invalid API key" and block the user from saving the custom provider — even though the key itself is correct.

Solution

For the custom provider type, when GET /models returns 401 or 403, ClawX now runs a secondary validation probe against POST /chat/completions (or /responses for openai-responses protocol) using the same API key:

  • If the completions probe returns a non-auth error (e.g. 400 "unknown model: validation-probe"), it means the server accepted the auth token — the key is valid and the user's provider is saved.
  • If the probe also returns an auth failure, the original error is returned unchanged — the key is genuinely wrong.

This fallback only applies to the custom provider type. Well-known providers (openai, anthropic, etc.) use dedicated validation profiles and are unaffected.

Note: 400 responses with explicit auth-error messages from /models (e.g. "Invalid API key provided", or the Chinese "无效密钥", i.e. "invalid key") are not retried; those already contain a clear rejection from the server and should be surfaced directly.
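
For concreteness, here is a minimal sketch of the control flow described above. The two probe helpers are named after the diff excerpt quoted in the review below; the wrapper function, its signature, and the URL construction are illustrative assumptions, not the exact source:

  // Sketch only: the wrapper name/signature is hypothetical; the probe helper
  // signatures are inferred from the PR diff excerpt quoted below.
  declare function performChatCompletionsProbe(
    providerType: string, url: string, headers: Record<string, string>,
  ): Promise<{ valid: boolean; error?: string }>;
  declare function performResponsesProbe(
    providerType: string, url: string, headers: Record<string, string>,
  ): Promise<{ valid: boolean; error?: string }>;

  async function validateCustomProviderKey(
    baseUrl: string,
    apiProtocol: string,
    providerType: string,
    headers: Record<string, string>,
  ): Promise<{ valid: boolean; error?: string }> {
    const modelsRes = await fetch(`${baseUrl}/models`, { headers });
    if (modelsRes.ok) return { valid: true };

    // Fall back only for the custom provider type, and only when /models
    // rejected with an auth status.
    const authRejected = modelsRes.status === 401 || modelsRes.status === 403;
    if (providerType !== 'custom' || !authRejected) {
      return { valid: false, error: `HTTP ${modelsRes.status} from /models` };
    }

    // Secondary probe with the same key: a non-auth error (e.g. 400
    // "unknown model: validation-probe") proves the server accepted the token.
    const probeUrl = apiProtocol === 'openai-responses'
      ? `${baseUrl}/responses`
      : `${baseUrl}/chat/completions`;
    const probeResult = apiProtocol === 'openai-responses'
      ? await performResponsesProbe(providerType, probeUrl, headers)
      : await performChatCompletionsProbe(providerType, probeUrl, headers);
    if (probeResult.valid) return probeResult;

    // The probe also failed auth: surface the original /models error unchanged.
    return { valid: false, error: `HTTP ${modelsRes.status} from /models` };
  }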

Testing

Added 4 new unit tests covering:

  1. Custom provider where /models returns 401 → fallback probe returns 400 (valid model error) → key accepted
  2. Custom provider where /models returns 403 → fallback to /responses → key accepted
  3. Custom provider where /models returns 401 → probe also returns 401 → key correctly rejected
  4. Non-custom provider (e.g. openai) where /models returns 401 → no secondary probe (single call, rejected immediately)

All 19 tests in provider-validation.test.ts pass.

pnpm exec vitest run tests/unit/provider-validation.test.ts
# Tests  19 passed (19)
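
For illustration, here is a sketch of how the first case could be exercised with vitest and a stubbed global fetch. It assumes the hypothetical validateCustomProviderKey wrapper from the sketch above; the real tests presumably drive the project's actual validation entry point:

  import { afterEach, expect, test, vi } from 'vitest';
  // import { validateCustomProviderKey } from '...'; // hypothetical wrapper from the sketch above

  afterEach(() => vi.unstubAllGlobals());

  test('custom provider: 401 on /models, probe returns 400 -> key accepted', async () => {
    // /models rejects with 401; the completions probe answers with a
    // non-auth 400, which means the server accepted the bearer token.
    vi.stubGlobal('fetch', vi.fn(async (url: RequestInfo | URL) =>
      String(url).endsWith('/models')
        ? new Response('Unauthorized', { status: 401 })
        : new Response(
            JSON.stringify({ error: { message: 'unknown model: validation-probe' } }),
            { status: 400 },
          ),
    ));

    const result = await validateCustomProviderKey(
      'https://provider.example/v1',
      'openai',            // illustrative protocol value: anything but 'openai-responses'
      'custom',
      { Authorization: 'Bearer test-key' },
    );
    expect(result.valid).toBe(true);
  });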

…that return 401/403 on /models (fixes ValueCell-ai#882)

Some custom provider implementations (e.g. opencode-go) return 401 or 403
on the OpenAI-compatible GET /models endpoint even when the API key is
valid, because they simply don't implement that endpoint. Previously, ClawX
would immediately report 'Invalid API key' without retrying, blocking users
from saving their custom provider.

This change adds a secondary validation probe for the custom provider
type: when GET /models returns 401 or 403, ClawX now also tries
POST /chat/completions (or /responses for openai-responses protocol) with
the same key. If the server accepts the token and returns a non-auth error
(e.g. 400 'unknown model: validation-probe'), the key is considered valid.
If the fallback probe also returns an auth failure, the original error is
returned unchanged.

Note: 400 responses with explicit auth error messages from /models are not
retried - those indicate the server explicitly rejected the credentials.

@chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f75b18a385


  const probeResult = apiProtocol === 'openai-responses'
    ? await performResponsesProbe(providerType, probeUrl, headers)
    : await performChatCompletionsProbe(providerType, probeUrl, headers);
  if (probeResult.valid) return probeResult;

P2: Exclude ambiguous 429 probes when overriding auth failures

In the new custom-provider fallback, any probeResult.valid immediately overrides the original /models 401/403. Because classifyProbeResponse treats HTTP 429 as valid, a provider that rate-limits before credential checks can cause invalid keys to be accepted and saved after a 401/403 from /models. This regression is specific to the new auth-fallback path for custom; consider not accepting 429 as proof of valid auth in this branch.
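
One possible shape for that guard, sketched under the assumption that the classified probe result carries the HTTP status (the real classifyProbeResponse return type may differ):

  // Hypothetical helper; assumes the classified probe result exposes `status`.
  function probeProvesAuth(probe: { valid: boolean; status?: number }): boolean {
    // Rate limiters can fire before credentials are checked, so in the
    // auth-fallback branch a 429 should not override a 401/403 from /models.
    return probe.valid && probe.status !== 429;
  }

  // In the custom-provider fallback branch:
  //   if (probeProvesAuth(probeResult)) return probeResult;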




Development

Successfully merging this pull request may close these issues.

[Bug]: Unable to add an api_key when opencode-go is used as the provider; "invalid apikey" is reported
