Commit 47a0666

Refactor Workers prompting page to highlight MCP servers (#27475)
* refactor page
* Fix style guide issues in Workers prompting page

Co-authored-by: opencode-agent[bot] <opencode-agent[bot]@users.noreply.github.com>
1 parent db4c329 commit 47a0666

1 file changed

Lines changed: 57 additions & 94 deletions

@@ -1,138 +1,101 @@
---
title: Prompting
pcx_content_type: concept
+description: Build Workers apps with AI prompts and MCP servers.
tags:
  - AI
  - LLM
sidebar:
  order: 3
---

-import { Tabs, TabItem, GlossaryTooltip, Type, Badge, TypeScriptExample } from "~/components";
import { Code } from "@astrojs/starlight/components";
-import BasePrompt from '~/content/partials/prompts/base-prompt.txt?raw';
+import BasePrompt from "~/content/partials/prompts/base-prompt.txt?raw";

-One of the fastest ways to build an application is by using AI to assist with writing the boiler plate code. When building, iterating on or debugging applications using AI tools and Large Language Models (LLMs), a well-structured and extensive prompt helps provide the model with clearer guidelines & examples that can dramatically improve output.
+You can create Workers applications from simple prompts in your favorite agent or editor, including Cursor, Windsurf, VS Code, Claude Code, Codex, and OpenCode.

-Below is an extensive example prompt that can help you build applications using Cloudflare Workers and your preferred AI model.
+## Teach your agent about Workers

-### Build Workers using a prompt
+Connect the [`cloudflare-docs`](https://github.com/cloudflare/mcp-server-cloudflare/tree/main/apps/docs-vectorize) MCP (Model Context Protocol) server to teach your agent about Workers. Add the server URL `https://docs.mcp.cloudflare.com/mcp` to your agent configuration ([learn more](/agents/model-context-protocol/mcp-servers-for-cloudflare/)).

-To use the prompt:
+You can also connect the [`cloudflare-observability`](https://github.com/cloudflare/mcp-server-cloudflare/tree/main/apps/workers-observability) MCP server (`https://observability.mcp.cloudflare.com/mcp`). This helps your agent check logs, look for exceptions, and automatically fix issues.

-1. Use the click-to-copy button at the top right of the code block below to copy the full prompt to your clipboard
-2. Paste into your AI tool of choice (for example OpenAI's ChatGPT or Anthropic's Claude)
-3. Make sure to enter your part of the prompt at the end between the `<user_prompt>` and `</user_prompt>` tags.
+## Example prompts
+
+```txt
+Create a Cloudflare Workers application that serves as a backend API server.
+```
+
+```txt
+Show me how to use Hyperdrive to connect my Worker to an existing Postgres database.
+```
+
+```txt
+Create an AI chat Agent using the Cloudflare Agents SDK that responds to user messages and maintains conversation history.
+```
+
+```txt
+Build a WebSocket-based pub/sub application using Durable Objects Hibernation APIs, where the server allows me to POST to /send-message with {topic: "foo", message: "bar"} and delivers that message to any connected client listening to that topic.
+```
+
+```txt
+Build an image upload application using R2 pre-signed URLs that allows users to securely upload images directly to object storage without exposing bucket credentials.
+```
+
+## Use a prompt
+
+You can use the base prompt below to provide your AI tool with context about Workers APIs and best practices.
+
+1. Use the click-to-copy button at the top right of the code block below to copy the full prompt to your clipboard.
+2. Paste into your AI tool of choice (for example OpenAI's ChatGPT or Anthropic's Claude).
+3. Enter your part of the prompt at the end between the `<user_prompt>` and `</user_prompt>` tags.

Base prompt:
+
<Code code={BasePrompt} collapse={"30-10000"} lang="md" />

The prompt above adopts several best practices, including:

-* Using `<xml>` tags to structure the prompt
-* API and usage examples for products and use-cases
-* Guidance on how to generate configuration (e.g. `wrangler.jsonc`) as part of the models response.
-* Recommendations on Cloudflare products to use for specific storage or state needs
+- Using `<xml>` tags to structure the prompt
+- API and usage examples for products and use cases
+- Guidance on how to generate configuration (for example, `wrangler.jsonc`) as part of the model's response
+- Recommendations on Cloudflare products to use for specific storage or state needs

### Additional uses

You can use the prompt in several ways:

-* Within the user context window, with your own user prompt inserted between the `<user_prompt>` tags (**easiest**)
-* As the `system` prompt for models that support system prompts
-* Adding it to the prompt library and/or file context within your preferred IDE:
-  * Cursor: add the prompt to [your Project Rules](https://docs.cursor.com/context/rules-for-ai)
-  * Zed: use [the `/file` command](https://zed.dev/docs/assistant/assistant-panel) to add the prompt to the Assistant context.
-  * Windsurf: use [the `@-mention` command](https://docs.codeium.com/chat/overview) to include a file containing the prompt to your Chat.
-  * Claude Code: add the prompt to your CLAUDE.md configuration after running `/init` to include best practices to a Workers project.
-  * GitHub Copilot: create the [`.github/copilot-instructions.md`](https://docs.github.com/en/copilot/customizing-copilot/adding-repository-custom-instructions-for-github-copilot) file at the root of your project and add the prompt.
+- Within the user context window, with your own user prompt inserted between the `<user_prompt>` tags (**easiest**)
+- As the `system` prompt for models that support system prompts
+- Adding it to the prompt library or file context in your preferred IDE:
+  - Cursor: add the prompt to [your Project Rules](https://docs.cursor.com/context/rules-for-ai)
+  - Zed: use [the `/file` command](https://zed.dev/docs/assistant/assistant-panel) to add the prompt to the Assistant context
+  - Windsurf: use [the `@-mention` command](https://docs.codeium.com/chat/overview) to include a file containing the prompt in your Chat
+  - Claude Code: add the prompt to your `CLAUDE.md` configuration after running `/init` to include best practices in a Workers project
+  - GitHub Copilot: create the [`.github/copilot-instructions.md`](https://docs.github.com/en/copilot/customizing-copilot/adding-repository-custom-instructions-for-github-copilot) file at the root of your project and add the prompt

:::note

-The prompt(s) here are examples and should be adapted to your specific use case. We'll continue to build out the prompts available here, including additional prompts for specific products.
+The prompts here are examples and should be adapted to your specific use case.

-Depending on the model and user prompt, it may generate invalid code, configuration or other errors, and we recommend reviewing and testing the generated code before deploying it.
+Depending on the model and user prompt, it may generate invalid code, configuration, or other errors. Review and test the generated code before deploying it.

:::

-### Passing a system prompt
-
-If you are building an AI application that will itself generate code, you can additionally use the prompt above as a "system prompt", which will give the LLM additional information on how to structure the output code. For example:
-
-<TypeScriptExample filename="index.ts">
-
-```ts
-import workersPrompt from "./workersPrompt.md"
-
-// Llama 3.3 from Workers AI
-const PREFERRED_MODEL = "@cf/meta/llama-3.3-70b-instruct-fp8-fast"
-
-export default {
-  async fetch(req: Request, env: Env, ctx: ExecutionContext) {
-    const openai = new OpenAI({
-      apiKey: env.WORKERS_AI_API_KEY
-    });
-
-    const stream = await openai.chat.completions.create({
-      messages: [
-        {
-          role: "system",
-          content: workersPrompt,
-        },
-        {
-          role: "user",
-          // Imagine something big!
-          content: "Build an AI Agent using Workflows. The Workflow should be triggered by a GitHub webhook on a pull request, and ..."
-        }
-      ],
-      model: PREFERRED_MODEL,
-      stream: true,
-    });
-
-    // Stream the response so we're not buffering the entire response in memory,
-    // since it could be very large.
-    const transformStream = new TransformStream();
-    const writer = transformStream.writable.getWriter();
-    const encoder = new TextEncoder();
-
-    (async () => {
-      try {
-        for await (const chunk of stream) {
-          const content = chunk.choices[0]?.delta?.content || '';
-          await writer.write(encoder.encode(content));
-        }
-      } finally {
-        await writer.close();
-      }
-    })();
-
-    return new Response(transformStream.readable, {
-      headers: {
-        'Content-Type': 'text/plain; charset=utf-8',
-        'Transfer-Encoding': 'chunked'
-      }
-    });
-  }
-}
-```
-
-</TypeScriptExample>
-
## Use docs in your editor

AI-enabled editors, including Cursor and Windsurf, can index documentation. Cursor includes the Cloudflare Developer Docs by default: you can use the [`@Docs`](https://cursor.com/docs/context/mentions#docs) command.

-In other editors, such as Zed or Windsurf, you can paste in URLs to add to your context. Use the _Copy Page_ button to paste in Cloudflare docs directly, or fetch docs for each product by appending `llms-full.txt` to the root URL - for example, `https://developers.cloudflare.com/agents/llms-full.txt` or `https://developers.cloudflare.com/workflows/llms-full.txt`.
+In other editors, such as Zed or Windsurf, you can paste in URLs to add to your context. Use the _Copy Page_ button to paste in Cloudflare docs directly, or fetch docs for each product by appending `llms-full.txt` to the root URL. For example, `https://developers.cloudflare.com/agents/llms-full.txt` or `https://developers.cloudflare.com/workflows/llms-full.txt`.

You can combine these with the Workers system prompt on this page to improve your editor or agent's understanding of the Workers APIs.

## Additional resources

-To get the most out of AI models and tools, we recommend reading the following guides on prompt engineering and structure:
+To get the most out of AI models and tools, review the following guides on prompt engineering and structure:

-* OpenAI's [prompt engineering](https://platform.openai.com/docs/guides/prompt-engineering) guide and [best practices](https://platform.openai.com/docs/guides/reasoning-best-practices) for using reasoning models.
-* The [prompt engineering](https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview) guide from Anthropic
-* Google's [quick start guide](https://services.google.com/fh/files/misc/gemini-for-google-workspace-prompting-guide-101.pdf) for writing effective prompts
-* Meta's [prompting documentation](https://www.llama.com/docs/how-to-guides/prompting/) for their Llama model family.
-* GitHub's guide for [prompt engineering](https://docs.github.com/en/copilot/using-github-copilot/copilot-chat/prompt-engineering-for-copilot-chat) when using Copilot Chat.
+- OpenAI's [prompt engineering](https://platform.openai.com/docs/guides/prompt-engineering) guide and [best practices](https://platform.openai.com/docs/guides/reasoning-best-practices) for using reasoning models.
+- The [prompt engineering](https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview) guide from Anthropic.
+- Google's [quick start guide](https://services.google.com/fh/files/misc/gemini-for-google-workspace-prompting-guide-101.pdf) for writing effective prompts.
+- Meta's [prompting documentation](https://www.llama.com/docs/how-to-guides/prompting/) for their Llama model family.
+- GitHub's guide for [prompt engineering](https://docs.github.com/en/copilot/using-github-copilot/copilot-chat/prompt-engineering-for-copilot-chat) when using Copilot Chat.
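The new page tells readers to add the MCP server URLs to their agent configuration but does not show a snippet. As a hedged illustration only: many MCP-capable agents read a JSON `mcpServers` map shaped roughly like the following, using the two URLs the page gives. The exact schema, transport keys, and config file location vary by agent, so treat this as an assumption and check your agent's own documentation.

```json
{
  "mcpServers": {
    "cloudflare-docs": {
      "url": "https://docs.mcp.cloudflare.com/mcp"
    },
    "cloudflare-observability": {
      "url": "https://observability.mcp.cloudflare.com/mcp"
    }
  }
}
```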
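The "Use a prompt" steps added by this commit say to place your request between `<user_prompt>` and `</user_prompt>` tags at the end of the base prompt. Concretely, the tail of the pasted prompt would look like this (the request shown is one of the page's own example prompts; the leading ellipsis stands for the rest of the base prompt):

```txt
... (end of the base prompt) ...

<user_prompt>
Create a Cloudflare Workers application that serves as a backend API server.
</user_prompt>
```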
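The "Use docs in your editor" section describes fetching per-product docs by appending `llms-full.txt` to a product's root URL. That pattern can be sketched in a couple of lines of shell; the product slugs below are the two examples the page itself uses, and any others are assumptions to verify against the docs site.

```shell
# Build each product's llms-full.txt URL by appending llms-full.txt
# to the product's root docs URL, as described in the page.
for product in agents workflows; do
  echo "https://developers.cloudflare.com/${product}/llms-full.txt"
done
```

Pipe each URL to `curl` (or paste it into your editor's context) to pull the full-text docs for that product.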
