Commit d47f464

AI Bedrock support
1 parent 224c2c0 commit d47f464

2 files changed

Lines changed: 26 additions & 17 deletions

docs/self-hosting/govern/environment-variables.md

Lines changed: 8 additions & 7 deletions
````diff
@@ -195,13 +195,14 @@ Plane AI supports multiple LLM providers. Configure one or more by adding their
 | **OPENAI_API_KEY** | API key for OpenAI models | Optional |
 | **CLAUDE_API_KEY** | API key for Anthropic models | Optional |
 | **GROQ_API_KEY** | API key for speech-to-text features | Optional |
-| **CUSTOM_LLM_ENABLED** | Set to `true` to use a custom LLM with an OpenAI-compatible API | Optional |
-| **CUSTOM_LLM_MODEL_KEY** | Identifier key for the custom model | Optional |
-| **CUSTOM_LLM_BASE_URL** | Base URL of the custom model's OpenAI-compatible endpoint | Optional |
-| **CUSTOM_LLM_API_KEY** | API key for the custom endpoint | Optional |
-| **CUSTOM_LLM_NAME** | Display name for the custom model | Optional |
-| **CUSTOM_LLM_DESCRIPTION** | Description of the custom model | Optional |
-| **CUSTOM_LLM_MAX_TOKENS** | Maximum token limit for the custom model | Optional |
+| **CUSTOM_LLM_ENABLED** | Set to `true` to enable a custom LLM. Supports OpenAI-compatible endpoints and AWS Bedrock. | Optional |
+| **CUSTOM_LLM_PROVIDER** | Backend provider for the custom model. Accepted values: `openai` (default), `bedrock`. | Optional |
+| **CUSTOM_LLM_MODEL_KEY** | Identifier key for the custom model (e.g. a model ID or name). | Optional |
+| **CUSTOM_LLM_BASE_URL** | Base URL of the custom model's OpenAI-compatible endpoint. Required when `CUSTOM_LLM_PROVIDER=openai`. | Optional |
+| **CUSTOM_LLM_API_KEY** | API key for authenticating with the custom endpoint. Required for the `openai` provider; used as the AWS access key ID when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
+| **CUSTOM_LLM_AWS_REGION** | AWS region for the Bedrock model (e.g. `us-east-1`). Required when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
+| **CUSTOM_LLM_NAME** | Display name for the custom model shown in the UI. Defaults to `Custom LLM`. | Optional |
+| **CUSTOM_LLM_MAX_TOKENS** | Maximum token limit for the custom model. Defaults to `128000`. | Optional |
 
 #### Provider base URLs
 
````
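To make the Bedrock path concrete, here is a minimal environment sketch using only the variables documented in the table above. All values are illustrative placeholders, not real credentials or a required model choice:

```shell
# Illustrative Bedrock configuration using the variables documented above.
# The model ID, access key, and region are placeholders; substitute your own.
CUSTOM_LLM_ENABLED=true
CUSTOM_LLM_PROVIDER=bedrock
CUSTOM_LLM_MODEL_KEY=anthropic.claude-3-5-sonnet-20240620-v1:0
CUSTOM_LLM_API_KEY=AKIAXXXXXXXXXXXXXXXX    # used as the AWS access key ID
CUSTOM_LLM_AWS_REGION=us-east-1
CUSTOM_LLM_NAME="Claude via Bedrock"
CUSTOM_LLM_MAX_TOKENS=128000
```

Note that `CUSTOM_LLM_BASE_URL` is omitted here, since per the table it applies only when `CUSTOM_LLM_PROVIDER=openai`.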

docs/self-hosting/govern/plane-ai.md

Lines changed: 18 additions & 10 deletions
````diff
@@ -42,12 +42,18 @@ You can provide API keys for both OpenAI and Anthropic, making all models availa
 
 #### Custom models (self-hosted or third-party)
 
-Plane AI works with any model exposed through an OpenAI-compatible API, including models served by Ollama, Groq, Cerebras, and similar runtimes. You can configure one custom model alongside your public provider keys.
+Plane AI supports custom models through two backends:
 
-:::warning
-For reliable performance across all Plane AI features, use a custom model with at least 100 billion parameters. Larger models produce better results.
+- **OpenAI-compatible endpoint** — any model exposed via an OpenAI-compatible API, including models served by Ollama, Groq, Cerebras, and similar runtimes.
+- **AWS Bedrock** — models accessed directly through Amazon Bedrock using your AWS credentials.
+
+One custom model can be configured alongside your public provider keys.
+
+::: warning
+The custom model should have at least 100 billion parameters for all Plane AI features to work reliably. Larger, more capable models yield better results.
 :::
 
+
 ### Embedding models
 
 Embedding models power semantic search. Plane AI supports:
@@ -109,21 +115,23 @@ CLAUDE_API_KEY=xxxxxxxxxxxxxxxx
 
 ### Custom model
 
-Use this for self-hosted models or third-party OpenAI-compatible endpoints.
-
 ```bash
 CUSTOM_LLM_ENABLED=true
+CUSTOM_LLM_PROVIDER=openai # or 'bedrock'
 CUSTOM_LLM_MODEL_KEY=your-model-key
-CUSTOM_LLM_BASE_URL=http://your-endpoint/v1
 CUSTOM_LLM_API_KEY=your-api-key
 CUSTOM_LLM_NAME=Your Model Name
-CUSTOM_LLM_DESCRIPTION="Optional description"
 CUSTOM_LLM_MAX_TOKENS=128000
 ```
 
-:::info
-The custom endpoint must expose an OpenAI-compatible API matching OpenAI's request and response format.
-:::
+**Additional required variables by provider:**
+
+- **OpenAI-compatible** (`openai`): `CUSTOM_LLM_BASE_URL`
+- **AWS Bedrock** (`bedrock`): `CUSTOM_LLM_AWS_REGION`
+
+::: warning
+For Bedrock, the IAM user must have `bedrock:InvokeModel` permission on the target model.
+:::
 
 ### Speech-to-text (optional)
 
````
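The IAM requirement in the warning above corresponds to a policy along these lines. This is an illustrative sketch, not an official policy document; the region and the wildcard `Resource` are placeholders, and you should scope `Resource` to the ARN of the specific model you configured:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/*"
    }
  ]
}
```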
