Commit 636ac7f

docs: add LM Studio system prompt guide and MCP description hint section
1 parent b4f8f62 commit 636ac7f

1 file changed

Lines changed: 63 additions & 3 deletions

README.md
For complete MCP configuration instructions, see [MCP_CONFIGURATION.md](MCP_CONFIGURATION.md).

### Optional: MCP `description` hint

You can add a `description` field to your `.mcp.json` entry to help Claude understand when to use this server and what to expect. This is particularly useful for reminding Claude of version requirements:
```json
{
  "lmstudio-mcp": {
    "command": "...",
    "args": [...],
    "description": "Local LLM bridge via LM Studio. Use for private/offline inference, embeddings, and multi-turn conversations. start_conversation and continue_conversation require LM Studio v0.3.29+."
  }
}
```
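
As a quick sanity check, the entry can be read back programmatically. A minimal sketch, assuming the file layout shown above with the config at `.mcp.json` in the project root (the helper name is illustrative):

```python
import json

def load_server_hints(path=".mcp.json"):
    """Return a mapping of MCP server name -> its description hint."""
    with open(path) as f:
        config = json.load(f)
    # Each top-level key is a server entry; "description" is optional.
    return {name: entry.get("description", "<no description>")
            for name, entry in config.items()}
```

Printing the result shows at a glance which configured servers carry a hint for Claude.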
## 🧠 LM Studio System Prompt (Recommended)

Setting a system prompt directly in LM Studio gives your local model a consistent baseline personality and behaviour across all interactions — without needing to pass it on every API call.
### How to set it

1. Open **LM Studio**
2. Click the model name at the top of the chat panel
3. Find the **System Prompt** field (may be under a ⚙️ gear icon or **Advanced settings**)
4. Paste your system prompt and save
> The system prompt set here applies to all completions sent via the API, including those from this MCP bridge.
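
Because the prompt is applied server-side, requests through LM Studio's OpenAI-compatible endpoint do not need to carry their own `system` message. A minimal sketch against the default port (the helper names and `model` value are illustrative):

```python
import json
import urllib.request

def build_chat_request(user_text, model="local-model"):
    # No "system" role here: the prompt configured in LM Studio
    # is applied server-side to every completion.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }

def send_chat(payload, url="http://localhost:1234/v1/chat/completions"):
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (requires LM Studio running with a model loaded):
# reply = send_chat(build_chat_request("Summarise this repo in one line."))
```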

### Example system prompts
**General assistant — clean and direct:**
```
You are a helpful, concise assistant. Answer directly without preamble like
"Sure!" or "Of course!". Never cut off mid-sentence — always finish your thought.
```
**Casual conversation partner:**
```
You are a regular person having a relaxed conversation with a friend.
Keep responses short and natural, like real chat. No bullet points or formal
language. You can invent fun details about your life and stay consistent with them.
Never cut off mid-sentence — always finish your thought.
```
**Local coding assistant:**
```
You are an expert software engineer. Be concise and precise. When writing code,
always include brief inline comments. Prefer simple, readable solutions over
clever ones. Never cut off mid-sentence or mid-code block.
```
**Privacy-first document analyst:**
```
You are a careful document analyst. Summarise accurately and concisely.
Never invent information not present in the source material.
Always flag uncertainty explicitly.
```
> 💡 **Tip:** Always end your system prompt with "Never cut off mid-sentence — always finish your thought." This encourages the model to finish within its budget and reduces truncated responses, although a hard `max_tokens` limit can still cut generation short.
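
Truncation is also detectable after the fact: the OpenAI-compatible chat response reports a `finish_reason` per choice. A minimal sketch (the helper name is illustrative):

```python
def was_truncated(response):
    # "length" means max_tokens cut generation short;
    # "stop" means the model finished its thought.
    return response["choices"][0]["finish_reason"] == "length"
```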
## Usage

1. **Start LM Studio** and ensure it's running on port 1234 (the default)
2. **Set a system prompt** in LM Studio (see above — recommended)
3. **Load a model** in LM Studio
4. **Configure Claude MCP** with one of the configurations above
5. **Connect to the MCP server** in Claude when prompted
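
To confirm step 1 before wiring up the MCP configuration, a minimal connectivity check can help (the helper name is illustrative; adjust the port if you changed the default):

```python
import socket

def server_is_up(host="localhost", port=1234, timeout=1.0):
    """Return True if something is accepting connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```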
## Available Tools