248 changes: 248 additions & 0 deletions docs/mcp/sse.mdx
@@ -0,0 +1,248 @@
---
title: "MCP SSE Integration"
sidebarTitle: "MCP SSE"
description: "Guide for integrating Server-Sent Events (SSE) with PraisonAI agents using MCP"
icon: "server"
---

## Add SSE Tool to AI Agent

```mermaid
flowchart LR
In[In] --> Agent[AI Agent]
Agent --> Tool[SSE MCP]
Tool --> Agent
Agent --> Out[Out]

style In fill:#8B0000,color:#fff
style Agent fill:#2E8B57,color:#fff
style Tool fill:#4169E1,color:#fff
style Out fill:#8B0000,color:#fff
```

## Quick Start

<Steps>
<Step title="Create a client file">
Save the following as `weather_agent.py`:

```python
from praisonaiagents import Agent, MCP

search_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="gpt-4o-mini",
    tools=MCP("http://localhost:8080/sse")
)

search_agent.start("What is the weather in London?")
```

</Step>
<Step title="Set Up SSE MCP Server">

```python
# python mcp-sse-direct-server.py --host 127.0.0.1 --port 8080
from mcp.server.fastmcp import FastMCP
from starlette.applications import Starlette
from mcp.server.sse import SseServerTransport
from starlette.requests import Request
from starlette.routing import Mount, Route
from mcp.server import Server
import uvicorn
import argparse
import logging
import os

# Set up logging based on environment variable
log_level = os.environ.get("LOGLEVEL", "info").upper()
logging.basicConfig(level=getattr(logging, log_level))
logger = logging.getLogger("mcp-server")

# Initialize FastMCP server for simple tools (SSE)
mcp = FastMCP("simple-tools")

@mcp.tool()
async def get_greeting(name: str) -> str:
    """Get a personalized greeting.

    Args:
        name: Name of the person to greet
    """
    logger.debug(f"get_greeting called with name: {name}")
    return f"Hello, {name}! Welcome to our MCP SSE server."

@mcp.tool()
async def get_weather(city: str) -> str:
    """Get a simulated weather report for a city.

    Args:
        city: Name of the city
    """
    logger.debug(f"get_weather called with city: {city}")
    # This is a mock implementation
    weather_data = {
        "Paris": "Sunny with a temperature of 22°C",
        "London": "Rainy with a temperature of 15°C",
        "New York": "Cloudy with a temperature of 18°C",
        "Tokyo": "Clear skies with a temperature of 25°C",
        "Sydney": "Partly cloudy with a temperature of 20°C"
    }

    return weather_data.get(city, f"Weather data not available for {city}")

def create_starlette_app(mcp_server: Server, *, debug: bool = False) -> Starlette:
    """Create a Starlette application that can serve the provided mcp server with SSE."""
    sse = SseServerTransport("/messages/")

    async def handle_sse(request: Request) -> None:
        logger.debug(f"SSE connection request received from {request.client}")
        async with sse.connect_sse(
            request.scope,
            request.receive,
            request._send,  # noqa: SLF001
        ) as (read_stream, write_stream):
            await mcp_server.run(
                read_stream,
                write_stream,
                mcp_server.create_initialization_options(),
            )

    return Starlette(
        debug=debug,
        routes=[
            Route("/sse", endpoint=handle_sse),
            Mount("/messages/", app=sse.handle_post_message),
        ],
    )

if __name__ == "__main__":
    mcp_server = mcp._mcp_server  # noqa: WPS437

    parser = argparse.ArgumentParser(description='Run MCP SSE-based server')
    parser.add_argument('--host', default='localhost', help='Host to bind to')
    parser.add_argument('--port', type=int, default=8080, help='Port to listen on')
    args = parser.parse_args()

    print(f"Starting MCP SSE server on {args.host}:{args.port}")

    # Hardcode the tool names since we know what they are
    tool_names = ["get_greeting", "get_weather"]
    print(f"Available tools: {', '.join(tool_names)}")

    # Bind SSE request handling to MCP server
    starlette_app = create_starlette_app(mcp_server, debug=True)

    uvicorn.run(starlette_app, host=args.host, port=args.port)
```
</Step>

<Step title="Install Dependencies">
Make sure you have the required packages installed:
```bash
pip install "praisonaiagents[llm]" mcp starlette uvicorn httpx
```
</Step>
<Step title="Export API Key">
```bash
export OPENAI_API_KEY="your_api_key"
```
</Step>

<Step title="Run the Server and Agent">
First, start the SSE server:
```bash
python mcp-sse-direct-server.py --host 127.0.0.1 --port 8080
```

Then, in a new terminal, run the agent:
```bash
python weather_agent.py
```
</Step>
</Steps>

<Note>
**Requirements**
- Python 3.10 or higher
- MCP server dependencies
</Note>

## Alternative LLM Integrations

### Using Groq with SSE

```python
from praisonaiagents import Agent, MCP

weather_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="groq/llama-3.2-90b-vision-preview",
    tools=MCP("http://localhost:8080/sse")
)

weather_agent.start("What is the weather in London?")
```

### Using Ollama with SSE

```python
from praisonaiagents import Agent, MCP

weather_agent = Agent(
    instructions="""You are a weather agent that can provide weather information for a given city.""",
    llm="ollama/llama3.2",
    tools=MCP("http://localhost:8080/sse")
)

weather_agent.start("What is the weather in London? Use get_weather tool, city is the required parameter.")
```

## Gradio UI Integration

Create a Gradio UI for your weather service:

```python
from praisonaiagents import Agent, MCP
import gradio as gr

def get_weather_info(query):
    weather_agent = Agent(
        instructions="""You are a weather agent that can provide weather information for a given city.""",
        llm="gpt-4o-mini",
        tools=MCP("http://localhost:8080/sse")
    )

    result = weather_agent.start(query)
    return f"## Weather Information\n\n{result}"

demo = gr.Interface(
    fn=get_weather_info,
    inputs=gr.Textbox(placeholder="What's the weather in London?"),
    outputs=gr.Markdown(),
    title="Weather MCP Agent",
    description="Ask about the weather in any major city:"
)

if __name__ == "__main__":
    demo.launch()
```

## Features

<CardGroup cols={2}>
<Card title="Real-time Updates" icon="bolt">
Receive server-sent events in real-time from your AI agent.
</Card>
<Card title="Multi-Agent Support" icon="users">
Combine SSE with other MCP tools for complex workflows.
</Card>
<Card title="Multiple LLM Options" icon="brain">
Use with OpenAI, Groq, Ollama, or other supported LLMs.
</Card>
<Card title="Gradio UI" icon="window">
Create user-friendly interfaces for your SSE integrations.
</Card>
</CardGroup>
1 change: 1 addition & 0 deletions docs/mint.json
@@ -238,6 +238,7 @@
       "group": "MCP",
       "pages": [
         "mcp/airbnb",
+        "mcp/sse",
         "mcp/ollama",
         "mcp/groq",
         "mcp/openrouter",
3 changes: 2 additions & 1 deletion src/praisonai-agents/.gitignore
@@ -84,4 +84,5 @@ build
 *.mp4
 *.png
 graph.py
-chroma_db/
+chroma_db/
+.qodo
83 changes: 83 additions & 0 deletions src/praisonai-agents/README.md
@@ -0,0 +1,83 @@
# MCP SSE Server and Client Implementation

This project demonstrates a working pattern for SSE-based MCP (Model Context Protocol) servers and clients. It consists of three main components:

1. **server.py**: An SSE-based MCP server that provides simple tools
2. **client.py**: A standalone client that connects to the server and uses its tools with Claude
3. **mcp-sse.py**: A client using praisonaiagents that connects to the server and uses its tools with OpenAI

## Tools Provided by the Server

The server implements two simple tools:

- **get_greeting**: Returns a personalized greeting for a given name
- **get_weather**: Returns simulated weather data for a given city

## Setup and Usage

### Prerequisites

Make sure you have the required packages installed:

```bash
pip install praisonaiagents mcp httpx starlette uvicorn anthropic python-dotenv
```

### Running the Server

First, start the MCP SSE server:

```bash
python server.py
```

By default, the server runs on 0.0.0.0:8080, but you can customize the host and port:

```bash
python server.py --host 127.0.0.1 --port 8081
```

### Running the Standalone Client

The standalone client uses Claude to interact with the MCP server tools:

```bash
# Set your Anthropic API key
export ANTHROPIC_API_KEY=your_api_key_here

# Run the client
python client.py http://0.0.0.0:8080/sse
```

You'll see a prompt where you can type queries for Claude to process using the MCP tools.

### Running the praisonaiagents Client

The praisonaiagents client uses OpenAI to interact with the MCP server tools:

```bash
# Set your OpenAI API key
export OPENAI_API_KEY=your_api_key_here

# Run the client
python mcp-sse.py
```

This will automatically send a query about the weather in Paris to the agent.

## How It Works

1. The server exposes MCP tools via an SSE endpoint
2. Clients connect to this endpoint and discover available tools
3. When a user makes a query, the client:
- For client.py: Uses Claude to determine which tool to call
- For mcp-sse.py: Uses OpenAI to determine which tool to call
4. The client executes the tool call against the server
5. The result is returned to the user

This pattern allows for decoupled processes where the MCP server can run independently of clients, making it suitable for cloud-native applications.

## Customizing

- To add more tools to the server, define new functions with the `@mcp.tool()` decorator in `server.py`
- To change the client's behavior, update the instructions and query in `mcp-sse.py`