Commit bfa5a57

BrainSlugs83 and Copilot committed

v1.5.2: conditional instructions, fix stale port references

- MCP instructions "Bottom line" nudge only included on warm starts (server deps already installed), omitted during cold first-run
- Updated hardcoded port 31337 references to reflect per-user-alias port hashing (BASE_PORT + FNV-1a hash of username)
- Bumped version to 1.5.2

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
1 parent 9016b9c commit bfa5a57
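The per-user-alias port hashing mentioned in the commit message can be sketched as follows. This is a hypothetical reconstruction, not the repo's actual `userPort()`: the `BASE_PORT` value, the port range, and the exact folding into a range are assumptions; only "BASE_PORT + FNV-1a hash of username" comes from the commit message.

```javascript
// Hypothetical sketch of per-user-alias port derivation: BASE_PORT plus an
// FNV-1a (32-bit) hash of the username, folded into a small range so each
// user alias gets a stable, distinct port. BASE_PORT and RANGE are
// assumptions, not values taken from this repository.
const BASE_PORT = 31337;
const RANGE = 1000;

function fnv1a32(str) {
  let hash = 0x811c9dc5; // FNV-1a 32-bit offset basis
  for (const ch of str) {
    hash ^= ch.codePointAt(0);      // assumes ASCII-ish usernames
    hash = Math.imul(hash, 0x01000193); // FNV prime, 32-bit multiply
  }
  return hash >>> 0; // force unsigned 32-bit result
}

function userPort(user) {
  return BASE_PORT + (fnv1a32(user) % RANGE);
}

console.log(userPort("alice")); // stable port for this username
```

The point of hashing the username rather than using a fixed port is that two users on the same machine each get their own singleton server without colliding, while the same user always lands on the same port across restarts.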

File tree

3 files changed (+40, -26 lines)

index.js

Lines changed: 38 additions & 24 deletions
@@ -282,36 +282,50 @@ keepaliveTimer.unref(); // Don't block process exit for keepalive
 // Tools will wait for the server when called.
 ensureServer().catch(() => {}); // fire-and-forget; tools handle readiness
 
+const warmStart = serverDepsInstalled();
+
+const instructionLines = [
+  "## Vector Memory — Usage Guide",
+  "",
+  "You have a `vector_search` tool that provides **semantic search across all past session history.**",
+  "Use it proactively and aggressively — don't wait to be asked.",
+  "",
+  "### When to search (default to searching — it's local, free, and instant):",
+  "- Any topic where prior sessions might have context (a project, tool, concept, problem)",
+  "- User mentions something that implies shared history: \"remember when...\", \"didn't we...\", \"have we ever...\"",
+  "- User annotates a word with `(r)` or `(recall)` — e.g. \"my RTX 3090(r) machine\" — treat it like a hyperlink to memory",
+  "- Beginning of a new session — search for recent context on the current repo/directory",
+  "- Before making assumptions about prior decisions or conventions — check memory first",
+  "- When the user starts a task similar to something done before",
+  "",
+  "### How it works:",
+  "- It's **semantic** search — query by concept, not just keywords. \"How did we handle auth\" finds results even if \"auth\" was never literally used.",
+  "- It's stochastic — results vary slightly each call. If a search doesn't surface what you need, rephrase and try again.",
+  "- Better to search and find nothing than to miss context that existed.",
+  "- Use `vector_reindex` only if results seem stale — auto-indexing handles most cases.",
+  "",
+  "### Architecture (for troubleshooting):",
+  "- Singleton HTTP server (per-user-alias port, one ONNX model in memory shared across all copilot instances)",
+  "- Thin STDIO proxy per copilot instance auto-launches the server if needed",
+];
+
+if (warmStart) {
+  instructionLines.push(
+    "",
+    "### Bottom line:",
+    "**Search early, search often.** You have virtually unlimited long-term memory at zero cost.",
+    "If there's even a chance a prior session touched on the current topic, search before responding.",
+    "The user expects you to leverage this — not searching when you should is a missed opportunity.",
+  );
+}
+
 const server = new McpServer(
   {
     name: "vector-memory",
     version: PKG.version,
   },
   {
-    instructions: [
-      "## Vector Memory — Usage Guide",
-      "",
-      "You have a `vector_search` tool that provides **semantic search across all past session history.**",
-      "Use it proactively and aggressively — don't wait to be asked.",
-      "",
-      "### When to search (default to searching — it's local, free, and instant):",
-      "- Any topic where prior sessions might have context (a project, tool, concept, problem)",
-      "- User mentions something that implies shared history: \"remember when...\", \"didn't we...\", \"have we ever...\"",
-      "- User annotates a word with `(r)` or `(recall)` — e.g. \"my RTX 3090(r) machine\" — treat it like a hyperlink to memory",
-      "- Beginning of a new session — search for recent context on the current repo/directory",
-      "- Before making assumptions about prior decisions or conventions — check memory first",
-      "- When the user starts a task similar to something done before",
-      "",
-      "### How it works:",
-      "- It's **semantic** search — query by concept, not just keywords. \"How did we handle auth\" finds results even if \"auth\" was never literally used.",
-      "- It's stochastic — results vary slightly each call. If a search doesn't surface what you need, rephrase and try again.",
-      "- Better to search and find nothing than to miss context that existed.",
-      "- Use `vector_reindex` only if results seem stale — auto-indexing handles most cases.",
-      "",
-      "### Architecture (for troubleshooting):",
-      "- Singleton HTTP server (one ONNX model in memory shared across all copilot instances)",
-      "- Thin STDIO proxy per copilot instance auto-launches the server if needed",
-    ].join("\n"),
+    instructions: instructionLines.join("\n"),
   },
 );

package.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 {
   "name": "ghcp-cli-vector-memory-mcp",
-  "version": "1.5.1",
+  "version": "1.5.2",
   "description": "MCP server that gives GitHub Copilot CLI persistent long-term memory via local semantic vector search. Install: npx -y ghcp-cli-vector-memory-mcp",
   "main": "index.js",
   "bin": {

vector-memory-server.js

Lines changed: 1 addition & 1 deletion
@@ -207,7 +207,7 @@ async function search(vecDb, query, limit = 10) {
 
 // --- Startup (heavy init deferred until after singleton check) ---
 
-// --- HTTP Server (singleton, port 31337) ---
+// --- HTTP Server (singleton, per-user-alias port via userPort()) ---
 
 const PORT = parseInt(process.env.VECTOR_MEMORY_PORT || String(userPort(SERVER_USER)), 10);
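The `PORT` line in this hunk gives an environment variable precedence over the derived per-user port. A minimal sketch of that precedence (the variable name and `parseInt` pattern come from the diff; the `userPort` body here is a fixed-value stand-in for the real hash-based function):

```javascript
// Precedence sketch for the PORT line above: an explicit VECTOR_MEMORY_PORT
// env var wins; otherwise fall back to the per-user derived port.
function userPort(_user) {
  return 31337; // stand-in; the real function hashes the username
}

function resolvePort(env, user) {
  return parseInt(env.VECTOR_MEMORY_PORT || String(userPort(user)), 10);
}

console.log(resolvePort({ VECTOR_MEMORY_PORT: "40000" }, "alice")); // 40000
console.log(resolvePort({}, "alice")); // 31337
```

Keeping the env override means a user can still pin a known port (e.g. for firewall rules or debugging) even though the default is now hashed per alias.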

0 commit comments