| Variable | Description | Required |
|----|----|----|
|**OPENAI_API_KEY**| API key for OpenAI models | Optional |
|**CLAUDE_API_KEY**| API key for Anthropic models | Optional |
|**GROQ_API_KEY**| API key for speech-to-text features | Optional |
|**CUSTOM_LLM_ENABLED**| Set to `true` to enable a custom LLM. Supports OpenAI-compatible endpoints and AWS Bedrock. | Optional |
|**CUSTOM_LLM_PROVIDER**| Backend provider for the custom model. Accepted values: `openai` (default), `bedrock`. | Optional |
|**CUSTOM_LLM_MODEL_KEY**| Identifier key for the custom model (e.g. a model ID or name). | Optional |
|**CUSTOM_LLM_BASE_URL**| Base URL of the custom model's OpenAI-compatible endpoint. Required when `CUSTOM_LLM_PROVIDER=openai`. | Optional |
|**CUSTOM_LLM_API_KEY**| API key for authenticating with the custom endpoint. Required for the `openai` provider; used as the AWS access key ID when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
|**CUSTOM_LLM_AWS_REGION**| AWS region for the Bedrock model (e.g. `us-east-1`). Required when `CUSTOM_LLM_PROVIDER=bedrock`. | Optional |
|**CUSTOM_LLM_NAME**| Display name for the custom model shown in the UI. Defaults to `Custom LLM`. | Optional |
|**CUSTOM_LLM_MAX_TOKENS**| Maximum token limit for the custom model. Defaults to `128000`. | Optional |
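As an illustration, the variables above could be combined like this for an OpenAI-compatible endpoint. This is a sketch only: the model key, base URL, and API key shown are placeholders, not values from the documentation.

```shell
# Hypothetical example: enable a custom OpenAI-compatible model
CUSTOM_LLM_ENABLED=true
CUSTOM_LLM_PROVIDER=openai
CUSTOM_LLM_MODEL_KEY=my-model-id               # placeholder model identifier
CUSTOM_LLM_BASE_URL=https://llm.example.com/v1 # placeholder endpoint base URL
CUSTOM_LLM_API_KEY=your-endpoint-api-key       # placeholder key
CUSTOM_LLM_NAME="My Custom LLM"
CUSTOM_LLM_MAX_TOKENS=128000

# Or, for AWS Bedrock, swap the provider-specific variables:
# CUSTOM_LLM_PROVIDER=bedrock
# CUSTOM_LLM_API_KEY=your-aws-access-key-id    # used as the AWS access key ID
# CUSTOM_LLM_AWS_REGION=us-east-1
```

Note that `CUSTOM_LLM_BASE_URL` applies only to the `openai` provider, while `CUSTOM_LLM_AWS_REGION` applies only to `bedrock`.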
One custom model can be configured alongside your public provider keys.
The custom model should have at least 100 billion parameters for all Plane AI features to work reliably. Larger, more capable models yield better results.
:::

### Embedding models

Embedding models power semantic search. Plane AI supports:
CUSTOM_LLM_MAX_TOKENS=128000

::: warning
For Bedrock, the IAM user must have `bedrock:InvokeModel` permission on the target model.
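As a sketch, a minimal IAM policy granting that permission might look like the following. The resource ARN is an example only; scope it to the specific foundation model you configured rather than using a wildcard.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "bedrock:InvokeModel",
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/*"
    }
  ]
}
```

If your setup uses streaming responses, the IAM user may additionally need the `bedrock:InvokeModelWithResponseStream` action, though the documentation above only names `bedrock:InvokeModel`.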