Merged
197 changes: 197 additions & 0 deletions examples/python/custom_tools/hosted_server.py
@@ -0,0 +1,197 @@
"""
Hosted MCP Server implementation for PraisonAI Agents.

This module provides a base class for creating hosted MCP servers
that can handle requests and integrate with the MCP protocol.

Note: This is an example implementation. To use it, ensure you have installed:
pip install praisonaiagents[mcp] starlette>=0.27.0
"""
Comment on lines +1 to +9
Contributor


💡 Verification agent

🧩 Analysis chain

File location doesn't match PR objectives.

According to the PR objectives, the HostedMCPServer class should be added to mcp/hosted_server.py in the core praisonaiagents package to fix the ImportError from praisonaiagents.mcp. However, this file is placed in the examples directory.

The docstring also states "This is an example implementation," which conflicts with the intended purpose of fixing a missing import in the core package.

Should this file be moved to src/praisonai-agents/praisonaiagents/mcp/hosted_server.py as originally intended? This would properly resolve the ImportError mentioned in issue #813.


🏁 Script executed:

#!/bin/bash
echo "Verifying placement of hosted_server.py files"
[ -f examples/python/custom_tools/hosted_server.py ] && echo "✔ examples/python/custom_tools/hosted_server.py exists" || echo "✖ examples/python/custom_tools/hosted_server.py missing"
[ -f src/praisonai-agents/praisonaiagents/mcp/hosted_server.py ] && echo "✔ src/praisonai-agents/praisonaiagents/mcp/hosted_server.py exists" || echo "✖ src/praisonai-agents/praisonaiagents/mcp/hosted_server.py missing"



Action Required: Move Hosted MCP Server to Core Package & Clean Up Code

Please address the following in hosted_server.py:

  • Relocate the file from
    examples/python/custom_tools/hosted_server.py
    to
    src/praisonai-agents/praisonaiagents/mcp/hosted_server.py
    so that praisonaiagents.mcp can import it directly and resolve the existing ImportError.

  • Update the module docstring to remove “example implementation” language and describe it as the core HostedMCPServer.

  • Remove the unused import json on line 14.

  • Chain the ImportError in your dependency‐check block for better tracebacks:

    --- old/examples/python/custom_tools/hosted_server.py
    +++ new/src/praisonai-agents/praisonaiagents/mcp/hosted_server.py
     try:
         from praisonaiagents.mcp.protocol import MCPProtocol
         from starlette.applications import Starlette
    -except ImportError:
    -    raise RuntimeError("Missing MCP or Starlette dependencies")
    +except ImportError as e:
    +    raise RuntimeError("Missing MCP or Starlette dependencies") from e
  • Adjust any import paths or references in your test/examples to point to the new location.

These changes will ensure the core package exports the server implementation correctly and improve code quality.
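As a quick illustration of the chaining point (the module name below is hypothetical, not part of this PR): `raise ... from e` keeps the original `ImportError` reachable as `__cause__`, so the traceback shows both the missing dependency and the wrapper error instead of swallowing the root cause.

```python
def load_dependencies():
    """Import optional deps, chaining the ImportError for clearer tracebacks."""
    try:
        import starlette_missing_example  # hypothetical missing package
    except ImportError as e:
        raise RuntimeError("Missing MCP or Starlette dependencies") from e

caught = None
try:
    load_dependencies()
except RuntimeError as err:
    caught = err

# The original ImportError survives as __cause__ and appears in the traceback
print(type(caught.__cause__).__name__)  # ModuleNotFoundError
```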

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
"""
Hosted MCP Server implementation for PraisonAI Agents.
This module provides a base class for creating hosted MCP servers
that can handle requests and integrate with the MCP protocol.
Note: This is an example implementation. To use it, ensure you have installed:
pip install praisonaiagents[mcp] starlette>=0.27.0
"""
try:
from praisonaiagents.mcp.protocol import MCPProtocol
from starlette.applications import Starlette
except ImportError as e:
raise RuntimeError("Missing MCP or Starlette dependencies") from e


import asyncio
import logging
from typing import Dict, Any, Optional, List, Callable
import json
Contributor


⚠️ Potential issue

Remove unused import.

The json module is imported but never used in this file.

Remove the unused import:

-import json
📝 Committable suggestion


Suggested change
import json
🧰 Tools
🪛 Ruff (0.11.9)

14-14: json imported but unused

Remove unused import: json

(F401)

🤖 Prompt for AI Agents
In examples/python/custom_tools/hosted_server.py at line 14, the json module is
imported but not used anywhere in the file. Remove the line that imports json to
clean up the code and avoid unnecessary imports.


try:
    from mcp.server.fastmcp import FastMCP
    from mcp.server import Server
    from starlette.applications import Starlette
    from starlette.requests import Request
    from starlette.routing import Mount, Route
    from mcp.server.sse import SseServerTransport
    import uvicorn
except ImportError:
    raise ImportError(
        "MCP server dependencies not installed. "
        "Please install with: pip install praisonaiagents[mcp] starlette>=0.27.0"
    )

logger = logging.getLogger(__name__)


class HostedMCPServer:
    """
    Base class for creating hosted MCP servers.

    This class provides a foundation for building MCP servers that can:
    - Handle incoming requests
    - Define custom tools
    - Support SSE transport
    - Be extended with custom functionality like latency tracking
    """

    def __init__(self, name: str = "hosted-mcp-server", host: str = "localhost", port: int = 8080):
        """
        Initialize the hosted MCP server.

        Args:
            name: Server name for identification
            host: Host to bind to (default: localhost)
            port: Port to listen on (default: 8080)
        """
        self.name = name
        self.host = host
        self.port = port
        self.mcp = FastMCP(name)
        self._tools: Dict[str, Callable] = {}
        self._server: Optional[Server] = None
        self._app: Optional[Starlette] = None

    def handle_request(self, request_data: Dict[str, Any]) -> Dict[str, Any]:
        """
        Handle incoming MCP requests.

        This method can be overridden in subclasses to add custom request handling,
        such as latency tracking, authentication, or request modification.

        Args:
            request_data: The incoming request data

        Returns:
            Response data
        """
        # Default implementation - can be overridden
        method = request_data.get('method', '')
        request_id = request_data.get('id', 'unknown')

        logger.debug(f"Handling request {request_id}: {method}")

        # Basic response structure
        response = {
            'id': request_id,
            'jsonrpc': '2.0',
            'result': {}
        }

        return response

    def add_tool(self, func: Callable, name: Optional[str] = None, description: Optional[str] = None):
        """
        Add a tool to the MCP server.

        Args:
            func: The function to expose as a tool
            name: Optional name for the tool (defaults to function name)
            description: Optional description for the tool
        """
        tool_name = name or func.__name__

        # Register with FastMCP
        if asyncio.iscoroutinefunction(func):
            # Already async
            self.mcp.tool(name=tool_name)(func)
        else:
            # Wrap sync function in async
            async def async_wrapper(*args, **kwargs):
                return func(*args, **kwargs)
            async_wrapper.__name__ = func.__name__
            async_wrapper.__doc__ = description or func.__doc__
            self.mcp.tool(name=tool_name)(async_wrapper)

        self._tools[tool_name] = func
        logger.info(f"Added tool: {tool_name}")

    def create_app(self, debug: bool = False) -> Starlette:
        """
        Create a Starlette application for serving the MCP server.

        Args:
            debug: Enable debug mode

        Returns:
            Starlette application instance

        Raises:
            RuntimeError: If the MCP server is not properly initialized
        """
        if not self._server:
            if not hasattr(self.mcp, '_mcp_server'):
                raise RuntimeError("MCP server not properly initialized. Ensure FastMCP is correctly set up.")
            self._server = self.mcp._mcp_server

        sse = SseServerTransport("/messages/")

        async def handle_sse(request: Request) -> None:
            logger.debug(f"SSE connection from {request.client}")
            async with sse.connect_sse(
                request.scope,
                request.receive,
                request._send,
            ) as (read_stream, write_stream):
                await self._server.run(
                    read_stream,
                    write_stream,
                    self._server.create_initialization_options(),
                )

        self._app = Starlette(
            debug=debug,
            routes=[
                Route("/sse", endpoint=handle_sse),
                Mount("/messages/", app=sse.handle_post_message),
            ],
        )

        return self._app

    def start(self, debug: bool = False, **uvicorn_kwargs):
        """
        Start the MCP server.

        Args:
            debug: Enable debug mode
            **uvicorn_kwargs: Additional arguments to pass to uvicorn
        """
        app = self.create_app(debug=debug)

        print(f"Starting {self.name} MCP server on {self.host}:{self.port}")
        print(f"Available tools: {', '.join(self._tools.keys())}")
        print(f"SSE endpoint: http://{self.host}:{self.port}/sse")

        uvicorn.run(app, host=self.host, port=self.port, **uvicorn_kwargs)

    async def start_async(self, debug: bool = False):
        """
        Start the MCP server asynchronously.

        Args:
            debug: Enable debug mode
        """
        app = self.create_app(debug=debug)

        config = uvicorn.Config(app, host=self.host, port=self.port)
        server = uvicorn.Server(config)

        print(f"Starting {self.name} MCP server on {self.host}:{self.port}")
        print(f"Available tools: {', '.join(self._tools.keys())}")

        await server.serve()

    def get_tools(self) -> List[str]:
        """Get list of available tool names."""
        return list(self._tools.keys())

    def get_endpoint(self) -> str:
        """Get the SSE endpoint URL."""
        return f"http://{self.host}:{self.port}/sse"
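The sync-to-async wrapping inside add_tool can be illustrated standalone, without the MCP or Starlette dependencies. The ensure_async helper below is hypothetical: it only mirrors the branch logic of add_tool (pass coroutine functions through, wrap plain functions), not the actual FastMCP registration.

```python
import asyncio
from typing import Callable

def ensure_async(func: Callable) -> Callable:
    """Return func unchanged if it is already a coroutine function,
    otherwise wrap it so it can be awaited (mirrors add_tool's branch)."""
    if asyncio.iscoroutinefunction(func):
        return func

    async def async_wrapper(*args, **kwargs):
        return func(*args, **kwargs)

    # Preserve the original name and docstring, as add_tool does
    async_wrapper.__name__ = func.__name__
    async_wrapper.__doc__ = func.__doc__
    return async_wrapper

def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

# A plain sync function becomes awaitable after wrapping
result = asyncio.run(ensure_async(add)(2, 3))
print(result)  # 5
```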
2 changes: 1 addition & 1 deletion examples/python/custom_tools/mcp_server_latency_example.py
@@ -6,7 +6,7 @@
"""

from praisonaiagents import Agent, PraisonAIAgents
-from praisonaiagents.mcp import HostedMCPServer
+from hosted_server import HostedMCPServer  # Import from local file
from latency_tracker_tool import tracker, get_latency_metrics
import json

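The latency-tracking extension point named in HostedMCPServer's docstring (overriding handle_request) can be sketched with a minimal stand-in base class. BaseServer and LatencyTrackingServer below are hypothetical illustrations, not part of this PR or of latency_tracker_tool.

```python
import time
from typing import Any, Dict

class BaseServer:
    """Minimal stand-in for HostedMCPServer.handle_request (sketch only)."""
    def handle_request(self, request_data: Dict[str, Any]) -> Dict[str, Any]:
        return {"id": request_data.get("id", "unknown"), "jsonrpc": "2.0", "result": {}}

class LatencyTrackingServer(BaseServer):
    """Override handle_request to record per-request latency,
    the customization pattern the docstring suggests."""
    def __init__(self):
        self.latencies: Dict[str, float] = {}

    def handle_request(self, request_data: Dict[str, Any]) -> Dict[str, Any]:
        start = time.perf_counter()
        response = super().handle_request(request_data)
        # Record elapsed time keyed by request id
        self.latencies[response["id"]] = time.perf_counter() - start
        return response

server = LatencyTrackingServer()
resp = server.handle_request({"id": "req-1", "method": "tools/list"})
print(resp["jsonrpc"], "req-1" in server.latencies)  # 2.0 True
```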
4 changes: 2 additions & 2 deletions src/praisonai-agents/pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "praisonaiagents"
version = "0.0.122"
version = "0.0.123"
description = "Praison AI agents for completing complex tasks with Self Reflection Agents"
requires-python = ">=3.10"
authors = [
@@ -73,4 +73,4 @@ all = [

[tool.setuptools.packages.find]
where = ["."]
include = ["praisonaiagents*"]
include = ["praisonaiagents*"]