A Sovereign Multi-Agent AI Argumentation Framework
Witness cutting-edge LLMs argue intelligently via real-time stream orchestration.
Nexus Debate is a sophisticated, full-stack multi-agent AI argumentation platform. It automatically orchestrates a directed acyclic graph of AI agents—pitching opposing models against each other over complex subjects—all governed by a neutral Moderator and an objective Fact-Checker.
Built with an unapologetic focus on extreme UI polish, real-time capability, and scalable intelligence, Nexus showcases exactly how microservices and LLMs can merge to produce highly autonomous, emergent reasoning.
- 🧠 Dynamic Orchestration Engine: Built on LangChain and Python, controlling four specialized AI roles: `Proponent`, `Opponent`, `Fact-Checker`, and `Moderator`.
- ⚡ Real-Time Data Streaming: Entire debates are streamed live to the client via Server-Sent Events (SSE) over `sse_starlette`, masking LLM generation latency.
- 🎨 Hyper-Premium UI/UX: Constructed with raw Tailwind CSS and Framer Motion. Features advanced glassmorphism, dynamic glow states, interactive micro-animations, and a responsive dark-mode cyber aesthetic.
- 📊 Live Analytics & Scoring: The Moderator scores debate strength after each round, feeding a dynamic time-series performance visualizer rendered with Recharts.
- 🔌 Model Agnostic Infrastructure: Instantly hot-swap underlying reasoning engines between OpenAI (GPT-4o), Google (Gemini 2.0), and Groq (Llama 3/Mixtral).
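The model-agnostic binding could be sketched as a simple role-to-provider registry. This is an illustrative sketch only: `AgentConfig`, `DEFAULT_LINEUP`, and `lineup_summary` are hypothetical names, not the repo's actual API; in practice each entry would dispatch to the matching LangChain chat-model constructor.

```python
from dataclasses import dataclass

@dataclass
class AgentConfig:
    role: str       # "proponent", "opponent", "fact_checker", "moderator"
    provider: str   # "openai", "google", "groq"
    model: str      # e.g. "gpt-4o", "gemini-2.0-flash", "llama3-70b-8192"

# Example lineup: bind a different reasoning engine to each debate role.
# Model names are examples, not an exhaustive list of supported engines.
DEFAULT_LINEUP = [
    AgentConfig("proponent", "google", "gemini-2.0-flash"),
    AgentConfig("opponent", "groq", "llama3-70b-8192"),
    AgentConfig("fact_checker", "openai", "gpt-4o"),
    AgentConfig("moderator", "openai", "gpt-4o"),
]

def lineup_summary(lineup: list[AgentConfig]) -> dict[str, str]:
    """Map each role to its provider:model pair, e.g. for display in the UI."""
    return {c.role: f"{c.provider}:{c.model}" for c in lineup}
```

Because roles are configuration rather than code, hot-swapping an engine is just replacing one entry in the lineup.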
Nexus separates concerns entirely via a hardened API boundary, utilizing Next.js for client delivery and FastAPI for heavy AI processing.
```mermaid
graph TD
    User((User)) -->|Configures Models & Topic| Frontend[Next.js App Server]
    Frontend -->|POST /debate/start| Backend[FastAPI Microservice]
    subgraph "LangChain Orchestration Layer"
        Backend -->|Initializes Context| Engine[Debate Graph]
        Engine -->|Invoke Agent Event| Pool
        subgraph Pool["Autonomous Agent Pool"]
            Proponent[Proponent Model]
            Opponent[Opponent Model]
            FactChecker[Fact-Checker Model]
            Moderator[Moderator Model]
        end
        Pool -->|SSE Yield| Backend
    end
    Backend -->|Stream Chunked Text| Frontend
    Frontend -->|Render UI / Extract Stats| Chart[Recharts Analytics]
```
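The "SSE Yield" hop above can be sketched as a generator that frames each agent's output as a named SSE event. This is a minimal, hypothetical sketch: `sse_frame` and `stream_debate` are illustrative names, and in the real service the generator would be wrapped in an `EventSourceResponse` rather than emitting raw strings.

```python
import json

def sse_frame(event: str, data: dict) -> str:
    """Serialize one Server-Sent Events frame: a named event plus a JSON
    payload, terminated by a blank line as the SSE format requires."""
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

def stream_debate(turns):
    """Yield each agent's text chunk as an SSE frame, then a final
    'done' event so the client knows the debate has ended."""
    for role, chunk in turns:
        yield sse_frame("token", {"role": role, "text": chunk})
    yield sse_frame("done", {})
```

Framing every chunk individually is what lets the frontend paint tokens as they arrive instead of waiting for a full response.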
- Framework: React 19 + Next.js 15 (App Router)
- Styling: Tailwind CSS + `lucide-react` icons
- Animations: Framer Motion
- Visualizer: Recharts
- Framework: FastAPI (Python 3.9+)
- AI Orchestration: LangChain Base
- Connections: HTTP / Server-Sent Events (SSE)
- API Handlers: `sse_starlette`
The engine powering the AI requests must be initialized first.
```shell
cd backend

# Create a virtual environment
python3 -m venv venv
source venv/bin/activate

# Install Python dependencies
pip install -r requirements.txt

# Configure secrets
cp .env.example .env
# Edit .env and supply your OPENAI_API_KEY, GROQ_API_KEY, or GEMINI_API_KEY

# Launch the FastAPI server
python main.py
```

Open a new terminal to start the development server for the UI.
```shell
cd frontend

# Install node modules
npm install

# Start the development server
npm run dev
```

🟢 Navigate to http://localhost:3000 to enter the Arena.
- Parameter Calibration: Users access the Configuration Sidebar, dynamically binding distinct LLMs to specific debate roles (e.g., Gemini argues for Universal Basic Income, while Llama 3 argues against it).
- The Exchange:
- The Proponent streams a structured argument.
- The Opponent ingests the history and fires back a logical rebuttal.
- The Fact-Checker silently verifies factual claims against its dedicated truth-logic prompt layer.
- Synthesis & Metric Yield: The Moderator extracts logic signals from both sides, scoring them 1-10.
- Client Render: The Next.js frontend parses the streamed chunks with a regex, dynamically injecting text into the glassy chat bubbles and appending integer scores to the live-charting interface.
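The score extraction in the Synthesis step could look like the sketch below. The `PROPONENT_SCORE:` / `OPPONENT_SCORE:` tags are hypothetical, assuming the Moderator's prompt instructs it to emit scores in a fixed machine-readable format; the repo's actual prompt contract may differ.

```python
import re

# Matches lines like "PROPONENT_SCORE: 8" in the Moderator's output.
SCORE_RE = re.compile(r"(PROPONENT|OPPONENT)_SCORE:\s*(\d+)")

def extract_scores(moderator_text: str) -> dict[str, int]:
    """Pull integer scores for each side out of the Moderator's turn,
    clamping to the 1-10 range so the chart never receives bad data."""
    scores = {}
    for side, value in SCORE_RE.findall(moderator_text):
        scores[side.lower()] = max(1, min(10, int(value)))
    return scores
```

Anchoring scores to explicit tags keeps the regex robust even when the Moderator pads its verdict with free-form commentary.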
This repository emphasizes high-tier architectural decisions designed for scalable performance and maintainability:
- Streaming over REST: Overcoming timeout boundaries and improving perceived latency through lightweight Server-Sent Events instead of heavier WebSockets.
- Isolating Intelligence: Deep decoupling of the prompt engineering and LangChain graph logic inside `backend/agents/debate_engine.py`.
- Advanced State Management: Leveraging `useRef` and functional `useState<Message[]>` updates for asynchronous, race-condition-free UI rendering.
Built with passion and an eye for modern design.
Licensed under MIT.