Spin up an MCP documentation server from your Fumadocs site (or any site with llms.txt) in minutes. AI agents in Claude Code, Cursor, or any MCP client get tools to search, browse, and retrieve your documentation — enabling them to write correct integration code on the first try.
Built on top of mcp-framework.
```ts
import { DocsServer, FumadocsRemoteSource } from "@mcpframework/docs";

const source = new FumadocsRemoteSource({
  baseUrl: "https://docs.myapi.com",
});

const server = new DocsServer({
  source,
  name: "my-api-docs",
  version: "1.0.0",
});

server.start();
```

Or scaffold a project instantly:

```sh
npx create-docs-mcp my-api-docs
```

Purpose-built for Fumadocs sites. Uses the native Orama search API for high-quality results, with automatic fallback to local text search.
```ts
import { FumadocsRemoteSource } from "@mcpframework/docs";

const source = new FumadocsRemoteSource({
  baseUrl: "https://docs.myapi.com",   // Required
  searchEndpoint: "/api/search",       // Default: "/api/search"
  llmsTxtPath: "/llms.txt",            // Default: "/llms.txt"
  llmsFullTxtPath: "/llms-full.txt",   // Default: "/llms-full.txt"
  refreshInterval: 300_000,            // Cache TTL in ms (default: 5 min)
  headers: {                           // Optional custom headers
    Authorization: "Bearer ...",
  },
});
```

| Option | Type | Default | Description |
|---|---|---|---|
| `baseUrl` | `string` | required | Base URL of your Fumadocs site |
| `searchEndpoint` | `string` | `"/api/search"` | Fumadocs Orama search endpoint |
| `llmsTxtPath` | `string` | `"/llms.txt"` | Path to llms.txt index |
| `llmsFullTxtPath` | `string` | `"/llms-full.txt"` | Path to full content |
| `refreshInterval` | `number` | `300000` | Cache TTL in milliseconds |
| `headers` | `Record<string, string>` | `undefined` | Custom HTTP headers |
Works with any documentation site that publishes llms.txt and llms-full.txt files (Fumadocs, Docusaurus with a plugin, etc.). Search is performed locally via text matching.

```ts
import { LlmsTxtSource } from "@mcpframework/docs";

const source = new LlmsTxtSource({
  baseUrl: "https://docs.myapi.com",
  mdxPathPrefix: "/docs/", // Default: "/"
  refreshInterval: 300_000,
});
```

| Option | Type | Default | Description |
|---|---|---|---|
| `baseUrl` | `string` | required | Base URL of your docs site |
| `llmsTxtPath` | `string` | `"/llms.txt"` | Path to llms.txt index |
| `llmsFullTxtPath` | `string` | `"/llms-full.txt"` | Path to full content |
| `mdxPathPrefix` | `string` | `"/"` | Prefix for fetching individual page `.mdx` files |
| `refreshInterval` | `number` | `300000` | Cache TTL in milliseconds |
| `headers` | `Record<string, string>` | `undefined` | Custom HTTP headers |
| `cache` | `Cache` | `MemoryCache` | Custom cache implementation |
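Since individual pages are fetched as `.mdx` files under `mdxPathPrefix`, the URL composition might look like the sketch below. This is a hypothetical helper for illustration, not the package's actual fetch logic:

```ts
// Compose a page URL from base URL, mdx prefix, and a page slug.
// Normalizes trailing/leading slashes so the pieces join cleanly.
function pageUrl(baseUrl: string, mdxPathPrefix: string, slug: string): string {
  const base = baseUrl.replace(/\/$/, "");
  const prefix = mdxPathPrefix.endsWith("/") ? mdxPathPrefix : `${mdxPathPrefix}/`;
  const cleanSlug = slug.replace(/^\//, "");
  return `${base}${prefix}${cleanSlug}.mdx`;
}
```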
The server exposes three MCP tools:

Search documentation by keyword or phrase. Returns ranked results with excerpts.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `query` | `string` | yes | Search keywords or phrase |
| `section` | `string` | no | Filter to a specific section |
| `limit` | `number` | no | Max results (default 10, max 25) |
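When the Orama endpoint is unavailable, search falls back to local text matching. A minimal sketch of how such keyword ranking with excerpts can work — the `Page` and `SearchResult` shapes and the scoring weights here are illustrative stand-ins, not the package's exported types:

```ts
interface Page {
  slug: string;
  title: string;
  content: string;
}

interface SearchResult {
  slug: string;
  title: string;
  score: number;
  excerpt: string;
}

function searchLocal(pages: Page[], query: string, limit = 10): SearchResult[] {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const results: SearchResult[] = [];
  for (const page of pages) {
    const haystack = `${page.title} ${page.content}`.toLowerCase();
    // Count how many query terms appear; weight title matches higher.
    let score = 0;
    for (const term of terms) {
      if (page.title.toLowerCase().includes(term)) score += 2;
      else if (haystack.includes(term)) score += 1;
    }
    if (score === 0) continue;
    // Build a short excerpt around the first matching term.
    const idx = page.content.toLowerCase().indexOf(terms[0]);
    const start = Math.max(0, idx - 40);
    const excerpt = page.content.slice(start, start + 120).trim();
    results.push({ slug: page.slug, title: page.title, score, excerpt });
  }
  return results.sort((a, b) => b.score - a.score).slice(0, limit);
}
```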
Retrieve the full markdown content of a documentation page.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `slug` | `string` | yes | Page slug or URL path |
Browse the documentation tree structure to discover available content.

| Parameter | Type | Required | Description |
|---|---|---|---|
| `section` | `string` | no | Filter to a section's children |
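Assuming the returned tree nests child sections (the `Section` shape below is illustrative; the package's actual `DocSection` type may differ), an agent might flatten it for display before picking a slug to fetch:

```ts
interface Section {
  title: string;
  slug: string;
  children?: Section[];
}

// Render the tree as indented bullet lines, one per section.
function renderTree(sections: Section[], depth = 0): string[] {
  const lines: string[] = [];
  for (const s of sections) {
    lines.push(`${"  ".repeat(depth)}- ${s.title} (${s.slug})`);
    if (s.children) lines.push(...renderTree(s.children, depth + 1));
  }
  return lines;
}
```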
Your Fumadocs site needs to serve these endpoints:

- `/llms.txt` (required) — Generated by `fumadocs-core`'s `source.llms().index()` utility
- `/llms-full.txt` (required for search) — Generated by `source.llms().full()`
- `/api/search` (optional) — Fumadocs' built-in Orama search API for better results
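For reference, an llms.txt index is a plain markdown file: an H1 title, a summary blockquote, and links grouped under section headings. A sketch of building one from a page list — the exact markdown that `source.llms().index()` emits may differ:

```ts
interface IndexEntry {
  section: string;
  title: string;
  url: string;
}

// Assemble an llms.txt-style index: H1 title, "> summary" line, then
// markdown link lists grouped under H2 section headings.
function buildLlmsTxt(siteTitle: string, summary: string, entries: IndexEntry[]): string {
  const bySection = new Map<string, IndexEntry[]>();
  for (const e of entries) {
    const list = bySection.get(e.section) ?? [];
    list.push(e);
    bySection.set(e.section, list);
  }
  const parts = [`# ${siteTitle}`, "", `> ${summary}`, ""];
  for (const [section, list] of bySection) {
    parts.push(`## ${section}`, "");
    for (const e of list) parts.push(`- [${e.title}](${e.url})`);
    parts.push("");
  }
  return parts.join("\n");
}
```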
See the Fumadocs LLMs.txt documentation for setup instructions.
```sh
claude mcp add my-api-docs -- node /path/to/server/dist/index.js
```

Add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "my-api-docs": {
      "command": "node",
      "args": ["/path/to/server/dist/index.js"],
      "env": {
        "DOCS_BASE_URL": "https://docs.myapi.com"
      }
    }
  }
}
```

Add to your MCP settings:
```json
{
  "my-api-docs": {
    "command": "node",
    "args": ["/path/to/server/dist/index.js"],
    "env": {
      "DOCS_BASE_URL": "https://docs.myapi.com"
    }
  }
}
```

Implement the `DocSource` interface to create adapters for any documentation backend:
```ts
import { DocSource, DocPage, DocSearchResult, DocSection } from "@mcpframework/docs";

class MyCustomSource implements DocSource {
  name = "my-source";

  async search(query: string, options?: { section?: string; limit?: number }) {
    // Your search implementation
    return [];
  }

  async getPage(slug: string) {
    // Fetch and return a page, or null
    return null;
  }

  async listSections() {
    // Return your documentation tree
    return [];
  }

  async getIndex() {
    return ""; // llms.txt content
  }

  async getFullContent() {
    return ""; // llms-full.txt content
  }

  async healthCheck() {
    return { ok: true };
  }
}
```

The built-in `MemoryCache` provides LRU eviction with TTL expiry. You can pass a custom cache:
```ts
import { MemoryCache, LlmsTxtSource } from "@mcpframework/docs";

const cache = new MemoryCache({
  maxEntries: 200, // Default: 100
  ttlMs: 600_000,  // Default: 300_000 (5 min)
});

const source = new LlmsTxtSource({
  baseUrl: "https://docs.myapi.com",
  cache,
});
```

Or implement the `Cache` interface for Redis, SQLite, etc.
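As a starting point for a custom backend, a Map with per-entry expiry covers the basics. This sketch assumes the `Cache` interface exposes `get`/`set`/`delete`; check the package's actual `Cache` type before relying on this shape:

```ts
// Hypothetical TTL cache; the real Cache interface may differ
// (e.g. it may be async or take per-call TTLs).
class TtlCache<V = unknown> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.store.delete(key); // lazily evict expired entries on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }

  delete(key: string): void {
    this.store.delete(key);
  }
}
```

A Redis- or SQLite-backed variant would follow the same shape, swapping the Map for the external store.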
```ts
// Main exports
import {
  DocsServer,
  LlmsTxtSource,
  FumadocsRemoteSource,
  MemoryCache,
  SearchDocsTool,
  GetPageTool,
  ListSectionsTool,
} from "@mcpframework/docs";

// Subpath imports
import { LlmsTxtSource, FumadocsRemoteSource } from "@mcpframework/docs/sources";
import { SearchDocsTool, GetPageTool, ListSectionsTool } from "@mcpframework/docs/tools";
import { MemoryCache } from "@mcpframework/docs/cache";
```

MIT