
Commit d0b9d11

Document OpenAI compatible API provider usage (#2777)
Added instructions for using an OpenAI-compatible API provider, including the required parameters and a JSON example. It took me a while to notice that, unlike the common base-URL convention, waveterm needs the full endpoint.
1 parent 264d0b0 commit d0b9d11

File tree

1 file changed: +27 −0 lines


docs/docs/waveai-modes.mdx

Lines changed: 27 additions & 0 deletions
@@ -197,6 +197,33 @@ For newer models like GPT-4.1 or GPT-5, the API type is automatically determined
 }
 ```
 
+### OpenAI Compatible
+
+To use an OpenAI-compatible API provider, set the `ai:endpoint`, `ai:apitoken`, and `ai:model` parameters,
+and use `"openai-chat"` as the `ai:apitype`.
+
+:::note
+The `ai:endpoint` is *NOT* a base URL; it must be the full endpoint URL, including the path.
+For example: https://api.x.ai/v1/chat/completions
+
+If you provide only the base URL, you will likely get a 404 error.
+:::
+
+```json
+{
+  "xai-grokfast": {
+    "display:name": "xAI Grok Fast",
+    "display:order": 2,
+    "display:icon": "server",
+    "ai:apitype": "openai-chat",
+    "ai:model": "x-ai/grok-4-fast",
+    "ai:endpoint": "https://api.x.ai/v1/chat/completions",
+    "ai:apitoken": "<your-api-key>"
+  }
+}
+```
+
+
 ### OpenRouter
 
 [OpenRouter](https://openrouter.ai) provides access to multiple AI models. Using the `openrouter` provider simplifies configuration:
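The full-endpoint pitfall the added note describes can be sketched in a few lines (illustrative only, not part of the commit; the `full_endpoint` helper is hypothetical):

```python
# OpenAI-style SDKs usually take a base URL and append the chat route
# themselves; waveterm's ai:endpoint must already include that route.
def full_endpoint(base_url: str) -> str:
    """Hypothetical helper: build the value to put in ai:endpoint."""
    return base_url.rstrip("/") + "/chat/completions"

print(full_endpoint("https://api.x.ai/v1"))
# → https://api.x.ai/v1/chat/completions
```

Passing only `https://api.x.ai/v1` as `ai:endpoint` would make waveterm POST to a path that does not exist, hence the 404 mentioned in the note.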

0 commit comments