
Commit edf7231

MCP docs (#1118)
* mcp mvp
* precommit fix
* docker added, thanks dwij
* first gloss docs
* added docker docs
* deleted mcp (diff repo)
* docs
1 parent 6e97240 commit edf7231

File tree

3 files changed: +93 additions, −4 deletions


docs/mint.json

Lines changed: 1 addition & 0 deletions
```diff
@@ -178,6 +178,7 @@
 "v2/usage/dashboard-info",
 "v2/usage/sdk-reference",
 "v2/usage/typescript-sdk",
+"v2/usage/mcp-server",
 "v2/usage/advanced-configuration",
 "v2/usage/context-managers",
 "v2/usage/tracking-llm-calls",
```

docs/v2/examples/openai.mdx

Lines changed: 4 additions & 4 deletions
````diff
@@ -1,6 +1,6 @@
 ---
 title: 'OpenAI'
-description: 'Load the dataset (ensure you're logged in with huggingface-cli if needed)'
+description: "Load the dataset (ensure you're logged in with huggingface-cli if needed)"
 ---
 {/* SOURCE_FILE: examples/openai/multi_tool_orchestration.ipynb */}

@@ -322,7 +322,7 @@ Finally, the tool call and its output are appended to the conversation, and the
 ### Multi-tool orchestration flow

-Now let us try to modify the input query and the system instructions to the responses API in order to follow a tool calling sequence and generate the output.
+Now let us try to modify the input query and the system instructions to the responses API in order to follow a tool calling sequence and generate the output.

 ```python
@@ -427,10 +427,10 @@ agentops.end_trace(tracer, end_state="Success")
 ```

-Here, we have seen how to utilize OpenAI's Responses API to implement a Retrieval-Augmented Generation (RAG) approach with multi-tool calling capabilities. It showcases an example where the model selects the appropriate tool based on the input query: general questions may be handled by built-in tools such as web-search, while specific medical inquiries related to internal knowledge are addressed by retrieving context from a vector database (such as Pinecone) via function calls. Additionally, we have showcased how multiple tool calls can be sequentially combined to generate a final response based on our instructions provided to responses API. Happy coding!
+Here, we have seen how to utilize OpenAI's Responses API to implement a Retrieval-Augmented Generation (RAG) approach with multi-tool calling capabilities. It showcases an example where the model selects the appropriate tool based on the input query: general questions may be handled by built-in tools such as web-search, while specific medical inquiries related to internal knowledge are addressed by retrieving context from a vector database (such as Pinecone) via function calls. Additionally, we have showcased how multiple tool calls can be sequentially combined to generate a final response based on our instructions provided to responses API. Happy coding!

 <script type="module" src="/scripts/github_stars.js"></script>
 <script type="module" src="/scripts/scroll-img-fadein-animation.js"></script>
 <script type="module" src="/scripts/button_heartbeat_animation.js"></script>
-<script type="module" src="/scripts/adjust_api_dynamically.js"></script>
+<script type="module" src="/scripts/adjust_api_dynamically.js"></script>
````

docs/v2/usage/mcp-server.mdx

Lines changed: 88 additions & 0 deletions
New file contents:

---
title: "MCP Server"
description: "MCP server for accessing AgentOps trace and span data"
---

<iframe
  width="100%"
  height="300"
  src="https://www.youtube.com/embed/lTa3Sk8C4f0?si=3r7GO8N1Csh0P9C5RR"
  title="AgentOps MCP Server"
  allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture"
  allowFullScreen
></iframe>

# MCP Server

AgentOps provides a [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) server that exposes the Public API as a set of tools for AI assistants. This allows AI models to directly query your AgentOps data during conversations and debug AI agents with greater context.
### Configuration & Installation

Add the AgentOps MCP server to your MCP client's configuration file.

**npx configuration:**

```json
{
  "mcpServers": {
    "agentops": {
      "command": "npx",
      "args": ["agentops-mcp"],
      "env": {
        "AGENTOPS_API_KEY": ""
      }
    }
  }
}
```

**Cursor deeplink:**

Add the AgentOps MCP server to Cursor with the deeplink below.

[![Install MCP Server](https://cursor.com/deeplink/mcp-install-dark.svg)](https://cursor.com/install-mcp?name=agentops&config=eyJjb21tYW5kIjoibnB4IGFnZW50b3BzLW1jcCIsImVudiI6eyJBR0VOVE9QU19BUElfS0VZIjoiIn19)

**Smithery:**

To install agentops-mcp for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@AgentOps-AI/agentops-mcp):

```bash
npx -y @smithery/cli install @AgentOps-AI/agentops-mcp --client claude
```
### Available Tools

The MCP server exposes the following tools, which mirror the Public API endpoints:

#### `auth`
Authorize using an AgentOps project API key.
- **Parameters**: `api_key` (string) - Your AgentOps project API key
- **Usage**: The server automatically prompts for authentication when needed

#### `get_project`
Get details about the current project.
- **Parameters**: None
- **Returns**: Project information including ID, name, and environment

#### `get_trace`
Get trace information by ID.
- **Parameters**: `trace_id` (string) - The trace identifier
- **Returns**: Trace details and metrics

#### `get_span`
Get span information by ID.
- **Parameters**: `span_id` (string) - The span identifier
- **Returns**: Span attributes and metrics

#### `get_complete_trace`
Get complete trace information by ID.
- **Parameters**: `trace_id` (string) - The trace identifier
- **Returns**: Complete trace and associated span details
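Under the hood, an MCP client invokes these tools with the protocol's standard `tools/call` JSON-RPC request. As an illustrative sketch (the trace ID below is a placeholder, not a real identifier), a call to `get_trace` would look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_trace",
    "arguments": {
      "trace_id": "your-trace-id"
    }
  }
}
```

Your MCP client constructs these requests for you; this shape is only useful to know when debugging the server's stdio traffic directly.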
### Environment Variables

The MCP server supports the following environment variables:

- `AGENTOPS_API_KEY`: Your AgentOps project API key
- `HOST`: API endpoint (defaults to `https://api.agentops.ai`)
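To try the server outside of an MCP client, you can launch it manually with these variables set. This is a sketch assuming the published `agentops-mcp` npm package; the API key value is a placeholder:

```shell
# Placeholder key; the server reads AGENTOPS_API_KEY from the environment.
export AGENTOPS_API_KEY="your-agentops-api-key"
# Optional: override the API endpoint (defaults to https://api.agentops.ai).
export HOST="https://api.agentops.ai"
# Start the MCP server over stdio; it runs until the client disconnects.
npx agentops-mcp
```

MCP clients such as Claude Desktop or Cursor pass these variables via the `env` block of the configuration shown above, so manual export is only needed for ad-hoc testing.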

0 commit comments
