Demonstrates how to wrap any MCP server with `codeMcpServer` so an LLM gets a single `code` tool instead of a long list of individual tools.

A normal MCP server with N tools floods the LLM's context and requires a separate round-trip per tool call. `codeMcpServer` collapses the whole server into one `code` tool: every upstream tool becomes a typed method on `codemode.*`, and the LLM can chain calls, branch on results, and do logic, all in a single code execution.
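The collapse works because every registered tool can be exposed as an awaitable method on one object handed to the generated code. A minimal standalone sketch of that idea (the `makeCodemodeProxy` helper and the stub registry are illustrative, not the library's actual internals):

```typescript
// Illustrative sketch of the codemode idea: a registry of upstream
// tool handlers is exposed as one object whose methods the generated
// code can call and chain locally, with no per-tool LLM round-trips.
type ToolResult = { content: { type: "text"; text: string }[] };
type ToolHandler = (args: Record<string, unknown>) => Promise<ToolResult>;

// Stub registry standing in for the upstream MCP server's tools.
const registry: Record<string, ToolHandler> = {
  add: async (args) => ({
    content: [{ type: "text", text: String(Number(args.a) + Number(args.b)) }],
  }),
  greet: async (args) => ({
    content: [{ type: "text", text: `Hello, ${String(args.name)}!` }],
  }),
};

// Every tool name becomes a method: codemode.add(...), codemode.greet(...).
function makeCodemodeProxy(tools: Record<string, ToolHandler>) {
  return new Proxy({} as Record<string, ToolHandler>, {
    get: (_target, name) => {
      const tool = tools[String(name)];
      if (!tool) throw new Error(`Unknown tool: ${String(name)}`);
      return tool;
    },
  });
}

// Two chained tool calls inside a single "code execution".
async function demo(): Promise<string> {
  const codemode = makeCodemodeProxy(registry);
  const sum = await codemode.add({ a: 5, b: 3 });
  const greeting = await codemode.greet({
    name: `Result is ${sum.content[0].text}`,
  });
  return greeting.content[0].text;
}

demo().then(console.log); // → "Hello, Result is 8!"
```

The real wrapper also generates TypeScript types from each tool's input schema, which is what makes the `codemode.*` methods typed rather than stringly-typed dispatch.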
This example exposes two endpoints:
- `/mcp`: the raw upstream server with three tools (`add`, `greet`, `list_items`)
- `/codemode`: the same server wrapped with `codeMcpServer`; one `code` tool, full typed SDK
```shell
npm install
npm start
```

Connect an MCP client (e.g. Claude Desktop, MCP Inspector) to `http://localhost:8787/codemode`.
```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";
import { DynamicWorkerExecutor } from "@cloudflare/codemode";
import { codeMcpServer } from "@cloudflare/codemode/mcp";

const upstream = new McpServer({ name: "my-tools", version: "1.0.0" });

upstream.registerTool(
  "add",
  { inputSchema: { a: z.number(), b: z.number() } },
  async ({ a, b }) => ({ content: [{ type: "text", text: String(a + b) }] })
);

export default {
  async fetch(request, env, ctx) {
    const executor = new DynamicWorkerExecutor({ loader: env.LOADER });
    const server = await codeMcpServer({ server: upstream, executor });
    return createMcpHandler(server)(request, env, ctx);
  }
};
```

The LLM writes code like:
```ts
async () => {
  const sum = await codemode.add({ a: 5, b: 3 });
  const greeting = await codemode.greet({
    name: "Result is " + sum.content[0].text
  });
  return greeting;
};
```

`wrangler.jsonc` needs a `worker_loaders` binding for the executor:
```jsonc
{ "worker_loaders": [{ "binding": "LOADER" }] }
```

See also:

- codemode-mcp-openapi: same pattern but driven from an OpenAPI spec
- `@cloudflare/codemode` docs
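The payoff of the single `code` tool is programs that use all three upstream tools with branching in one execution. A standalone sketch of such a program (the `codemode` object is stubbed here so it runs by itself; in the real setup it is injected by the executor, and `list_items`' output shape is an assumption):

```typescript
// Stub codemode object mimicking the three upstream tools (add, greet,
// list_items). In the real wrapper these methods are generated from the
// MCP server's tool schemas.
type Text = { content: { type: "text"; text: string }[] };
const text = (t: string): Text => ({ content: [{ type: "text", text: t }] });

const codemode = {
  add: async ({ a, b }: { a: number; b: number }) => text(String(a + b)),
  greet: async ({ name }: { name: string }) => text(`Hello, ${name}!`),
  list_items: async () => text(JSON.stringify(["alpha", "beta", "gamma"])),
};

// One code execution: list, branch on the result, then chain two calls.
async function run(): Promise<string> {
  const items: string[] = JSON.parse(
    (await codemode.list_items()).content[0].text
  );
  if (items.length === 0) {
    return (await codemode.greet({ name: "nobody" })).content[0].text;
  }
  const count = await codemode.add({ a: items.length, b: 0 });
  const greeting = await codemode.greet({
    name: `${count.content[0].text} items`,
  });
  return greeting.content[0].text;
}

run().then(console.log); // → "Hello, 3 items!"
```

Done with individual MCP tools, the same logic would cost four LLM round-trips; here it is one `code` call.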