This document explains how OpenCode configuration flows from user files through the plugin system to the Codex API.
- Config Loading Order
- Provider Options Flow
- Model Selection & Persistence
- Plugin Configuration
- Examples
- Best Practices
## Config Loading Order

OpenCode loads and merges configuration from multiple sources in this order (last wins):
1. `~/.opencode/config.json`
2. `~/.opencode/opencode.json`
3. `~/.opencode/opencode.jsonc`
4. `<project>/.opencode/opencode.json`
5. `<parent>/.opencode/opencode.json`
6. ... (up to worktree root)
Configuration can also be supplied explicitly via environment variables:

```bash
OPENCODE_CONFIG=/path/to/config.json opencode
# or
OPENCODE_CONFIG_CONTENT='{"model":"openai/gpt-5"}' opencode
```

...or fetched from `.well-known/opencode` endpoints (for OAuth providers):

```
https://auth.example.com/.well-known/opencode
```
Source: tmp/opencode/packages/opencode/src/config/config.ts:26-51
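As a rough illustration of the "last wins" rule, here is a shallow merge sketch. OpenCode's actual loader deep-merges nested keys, and the file contents below are hypothetical:

```typescript
// A simplified, shallow sketch of "last wins" config merging.
// OpenCode's real loader deep-merges nested objects; treat this
// as an illustration of precedence only.
type Config = Record<string, unknown>;

function mergeConfigs(sources: Config[]): Config {
  // Later sources override earlier ones (last wins).
  return sources.reduce<Config>((acc, src) => ({ ...acc, ...src }), {});
}

// Hypothetical file contents, in load order:
const globalConfig: Config = { model: "openai/gpt-4o", theme: "dark" };
const projectConfig: Config = { model: "openai/gpt-5" };
const envOverride: Config = { model: "openai/gpt-5-codex" }; // OPENCODE_CONFIG_CONTENT

const merged = mergeConfigs([globalConfig, projectConfig, envOverride]);
console.log(merged); // { model: "openai/gpt-5-codex", theme: "dark" }
```

Keys set only in earlier sources (like `theme` here) survive; keys set again later are overridden.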
## Provider Options Flow

Options are merged at multiple stages before reaching the plugin:

1. Models.dev baseline: Models.dev provides baseline capabilities for each provider/model.
2. Environment variables, for example:

   ```bash
   export OPENAI_API_KEY="sk-..."
   ```

3. Plugin loader: plugins can inject options via the `loader()` function.
4. User config: per-provider options from your config file:
```json
{
  "provider": {
    "openai": {
      "options": {
        "reasoningEffort": "medium",
        "textVerbosity": "low"
      }
    }
  }
}
```

Result: user config overrides everything else.
Source: tmp/opencode/packages/opencode/src/provider/provider.ts:236-339
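Put together, the staged merge can be sketched as a toy pipeline. The stage contents below are hypothetical; only the precedence order mirrors the description above:

```typescript
// Toy sketch of the staged option merge: each later stage overrides
// the stages before it, so user config wins. Values are hypothetical.
type Options = Record<string, unknown>;

const stages: { name: string; options: Options }[] = [
  { name: "models.dev baseline", options: { reasoningEffort: "low" } },
  { name: "environment", options: {} },
  { name: "plugin loader", options: { include: ["reasoning.encrypted_content"] } },
  { name: "user config", options: { reasoningEffort: "medium" } },
];

const effective = stages.reduce<Options>(
  (acc, stage) => ({ ...acc, ...stage.options }),
  {},
);

console.log(effective);
// { reasoningEffort: "medium", include: ["reasoning.encrypted_content"] }
```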
## Model Selection & Persistence

Your config (`config/opencode-legacy.json`):
```json
{
  "provider": {
    "openai": {
      "models": {
        "gpt-5-codex-medium": {
          "name": "GPT 5 Codex Medium (OAuth)",
          "limit": {
            "context": 272000,
            "output": 128000
          },
          "options": {
            "reasoningEffort": "medium",
            "reasoningSummary": "auto",
            "textVerbosity": "medium",
            "include": [
              "reasoning.encrypted_content"
            ],
            "store": false
          }
        }
      }
    }
  }
}
```

What OpenCode uses:
- UI display: "GPT 5 Codex Medium (OAuth)" ✅
- Persistence: `provider_id: "openai"` + `model_id: "gpt-5-codex-medium"` ✅
- Plugin lookup: `models["gpt-5-codex-medium"]` → used to build the Codex request ✅
The TUI stores recently used models in `~/.opencode/tui`:

```toml
[[recently_used_models]]
provider_id = "openai"
model_id = "gpt-5-codex"
last_used = 2025-10-12T10:30:00Z
```

Key point: custom display names are UI-only. The underlying `id` field is what gets persisted and sent to APIs.
Source: tmp/opencode/packages/tui/internal/app/state.go:54-79
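The persisted pair mirrors the `provider/model` string used in the `model` config field. A small sketch of splitting such a reference into the two persisted ids (a hypothetical helper, not OpenCode's own code):

```typescript
// Split a "provider/model" reference into the two ids the TUI persists.
// Hypothetical helper for illustration; OpenCode's own parsing may differ.
function splitModelRef(ref: string): { provider_id: string; model_id: string } {
  const slash = ref.indexOf("/");
  if (slash === -1) throw new Error(`not a provider/model reference: ${ref}`);
  return {
    provider_id: ref.slice(0, slash),
    model_id: ref.slice(slash + 1), // keep any later slashes in the model id
  };
}

const ref = splitModelRef("openai/gpt-5-codex-medium");
console.log(ref); // { provider_id: "openai", model_id: "gpt-5-codex-medium" }
```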
## Plugin Configuration

Plugin entry point (`index.ts:64-86`):
```typescript
async loader(getAuth: () => Promise<Auth>, provider: unknown) {
  const providerConfig = provider as {
    options?: Record<string, unknown>;
    models?: UserConfig["models"];
  };
  const userConfig: UserConfig = {
    global: providerConfig?.options || {},  // Global options
    models: providerConfig?.models || {},   // Per-model options
  };
  // ... use userConfig in custom fetch()
}
```

The `UserConfig` type:

```typescript
type UserConfig = {
  global: {
    // Applied to ALL models
    reasoningEffort?: "minimal" | "low" | "medium" | "high";
    textVerbosity?: "low" | "medium" | "high";
    include?: string[];
  };
  models: {
    [modelName: string]: {
      options?: {
        // Override global for a specific model
        reasoningEffort?: "minimal" | "low" | "medium" | "high";
        textVerbosity?: "low" | "medium" | "high";
      };
    };
  };
};
```

For a given model, options are merged:
1. Global options (`provider.openai.options`)
2. Model-specific options (`provider.openai.models[modelName].options`) ← wins
Implementation: `lib/request/request-transformer.ts:getModelConfig()`
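This merge can be sketched as follows. It is a simplified stand-in for `getModelConfig()`, not its actual implementation; the function and type names here are illustrative:

```typescript
// Simplified stand-in for the plugin's per-model option merge:
// model-specific options win over global ones.
type ModelOptions = {
  reasoningEffort?: "minimal" | "low" | "medium" | "high";
  textVerbosity?: "low" | "medium" | "high";
  include?: string[];
};

type MergeConfig = {
  global: ModelOptions;
  models: { [modelName: string]: { options?: ModelOptions } };
};

function getEffectiveOptions(config: MergeConfig, modelName: string): ModelOptions {
  const perModel = config.models[modelName]?.options ?? {};
  // Spread order implements the precedence: global first, model-specific last.
  return { ...config.global, ...perModel };
}

const cfg: MergeConfig = {
  global: { reasoningEffort: "medium", textVerbosity: "medium" },
  models: { "gpt-5-codex-high": { options: { reasoningEffort: "high" } } },
};

const opts = getEffectiveOptions(cfg, "gpt-5-codex-high");
console.log(opts); // { reasoningEffort: "high", textVerbosity: "medium" }
```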
## Examples

Global options for all models:

```json
{
  "plugin": ["oc-chatgpt-multi-auth"],
  "provider": {
    "openai": {
      "options": {
        "reasoningEffort": "medium",
        "textVerbosity": "medium",
        "include": ["reasoning.encrypted_content"]
      }
    }
  }
}
```

Result: all OpenAI models use these options.
Per-model overrides on top of global options:

```json
{
  "plugin": ["oc-chatgpt-multi-auth"],
  "provider": {
    "openai": {
      "options": {
        "reasoningEffort": "medium",
        "textVerbosity": "medium"
      },
      "models": {
        "gpt-5-codex-high": {
          "name": "GPT 5 Codex High (OAuth)",
          "options": {
            "reasoningEffort": "high",
            "reasoningSummary": "detailed"
          }
        },
        "gpt-5-nano": {
          "name": "GPT 5 Nano (OAuth)",
          "options": {
            "reasoningEffort": "minimal",
            "textVerbosity": "low"
          }
        }
      }
    }
  }
}
```

Result:

- `gpt-5-codex-high` uses `reasoningEffort: "high"` (overridden) + `textVerbosity: "medium"` (from global)
- `gpt-5-nano` uses `reasoningEffort: "minimal"` + `textVerbosity: "low"` (both overridden)
A complete configuration:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["oc-chatgpt-multi-auth"],
  "model": "openai/gpt-5-codex-medium",
  "provider": {
    "openai": {
      "options": {
        "reasoningEffort": "medium",
        "reasoningSummary": "auto",
        "textVerbosity": "medium",
        "include": ["reasoning.encrypted_content"]
      },
      "models": {
        "gpt-5-codex-low": {
          "name": "GPT 5 Codex Low (OAuth)",
          "options": {
            "reasoningEffort": "low"
          }
        },
        "gpt-5-codex-high": {
          "name": "GPT 5 Codex High (OAuth)",
          "options": {
            "reasoningEffort": "high",
            "reasoningSummary": "detailed"
          }
        }
      }
    }
  }
}
```

## Best Practices

Instead of duplicating global options, override only what's different:
❌ Bad:

```json
{
  "models": {
    "gpt-5-low": {
      "id": "gpt-5",
      "options": {
        "reasoningEffort": "low",
        "textVerbosity": "low",
        "include": ["reasoning.encrypted_content"]
      }
    },
    "gpt-5-high": {
      "id": "gpt-5",
      "options": {
        "reasoningEffort": "high",
        "textVerbosity": "high",
        "include": ["reasoning.encrypted_content"]
      }
    }
  }
}
```

✅ Good:
```json
{
  "options": {
    "include": ["reasoning.encrypted_content"]
  },
  "models": {
    "gpt-5-low": {
      "id": "gpt-5",
      "options": {
        "reasoningEffort": "low",
        "textVerbosity": "low"
      }
    },
    "gpt-5-high": {
      "id": "gpt-5",
      "options": {
        "reasoningEffort": "high",
        "textVerbosity": "high"
      }
    }
  }
}
```

Custom model names help you remember what each variant does:
```json
{
  "models": {
    "GPT 5 Codex - Fast & Cheap": {
      "id": "gpt-5-codex",
      "options": { "reasoningEffort": "low" }
    },
    "GPT 5 Codex - Balanced": {
      "id": "gpt-5-codex",
      "options": { "reasoningEffort": "medium" }
    },
    "GPT 5 Codex - Max Quality": {
      "id": "gpt-5-codex",
      "options": { "reasoningEffort": "high" }
    }
  }
}
```

The most common settings should be global:
```json
{
  "options": {
    "reasoningEffort": "medium",
    "reasoningSummary": "auto",
    "textVerbosity": "medium",
    "include": ["reasoning.encrypted_content"]
  }
}
```

While you can set `CODEX_MODE=0` to disable the bridge prompt, it's better to document such settings in config files:
❌ Bad: `CODEX_MODE=0 opencode`

✅ Good: create `~/.opencode/openai-codex-auth-config.json`:
```json
{
  "codexMode": false
}
```

**If config isn't loading:**

- Check config file syntax with `jq . < config.json`
- Verify the config file location (use absolute paths)
- Check OpenCode logs for config load errors
- Use `OPENCODE_CONFIG_CONTENT` to test minimal configs
**If the wrong model is selected:**

- The TUI remembers the `id` field, not the display name
- Check `~/.opencode/tui` for recently used models
- Verify your config has the correct `id` field
**If options aren't applied:**

- Model-specific options override global options
- The plugin receives the merged config from OpenCode
- Add debug logging to verify what the plugin receives
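As a hedged sketch of that last tip, logging inside the plugin's `loader()` could look something like this. The types and names are assumptions modeled on the entry-point snippet earlier:

```typescript
// Illustrative debug helper: summarize what the plugin's loader()
// received so it can be compared against the config files on disk.
type LoaderProvider = {
  options?: Record<string, unknown>;
  models?: Record<string, { options?: Record<string, unknown> }>;
};

function debugDumpProvider(provider: unknown): string {
  const p = (provider ?? {}) as LoaderProvider;
  const summary = {
    globalOptions: p.options ?? {},
    modelNames: Object.keys(p.models ?? {}),
  };
  // A real plugin might gate this behind an env var and write to stderr.
  return JSON.stringify(summary, null, 2);
}

console.log(debugDumpProvider({
  options: { reasoningEffort: "medium" },
  models: { "gpt-5-codex-high": { options: { reasoningEffort: "high" } } },
}));
```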
See also:

- ARCHITECTURE.md: plugin architecture and design decisions
- OpenCode Config Schema: the official schema
- Models.dev: model capability database