## Start Here (2 minutes)

- Prereqs: Docker Desktop running (verify with `docker info`); `make` installed (see ../setup/makefile-essentials.md)
- Commands:

  ```bash
  # from this folder
  [ -f .env ] || cp .env.example .env   # add keys if needed
  make up                               # starts backend + frontend
  ```

- Open: http://localhost:3000 (frontend) and http://localhost:8000 (backend)
- Common targets: `make logs`, `make test-backend`, `make test-frontend`, `make down`
Professional content creation using CrewAI 0.177.0 + FastAPI + TypeScript
This project demonstrates a production-ready multi-agent AI content generation system where 3 specialized AI agents work together to create high-quality content:
- 🔍 Research Agent: Gathers comprehensive information and data
- 🎯 Strategy Agent: Develops content framework and structure
- ✍️ Writer Agent: Creates engaging, professional content
Perfect for learning: Multi-agent coordination, web development, API design, and professional deployment patterns.
| Step | 🤖 AI-Powered (Recommended) | 💻 Manual Command Line |
|---|---|---|
| 1. Clone the Repo | Tell your agent: "Clone the multi-ai-coding-agent repo and open Project 1." | `git clone https://github.com/pingwu/multi-ai-coding-agent.git`<br>`cd multi-ai-coding-agent/project-01-content-generator` |
| 2. Launch Project | "Bring up Project 1" or "Start the content generator project." | From repo root: `make -C project-01-content-generator up`<br>From this folder: `make up` |
| 3. Configure API Keys | "Help me set up my API keys." | `[ -f .env ] \|\| cp .env.example .env`, then add your keys to `.env` |
| Result | App at http://localhost:3000 (UI) and http://localhost:8000 (API). | App at http://localhost:3000 (UI) and http://localhost:8000 (API). |
- Docker Desktop installed and running
- OpenAI API Key for AI models (Get it here)
- Serper API Key for web search (FREE - Get it here)
- Git for cloning
- 8GB+ RAM recommended
```bash
git clone https://github.com/pingwu/multi-ai-coding-agent.git
cd multi-ai-coding-agent/project-01-content-generator

# Copy environment template (skip if .env already exists)
[ -f .env ] || cp .env.example .env

# Add your API keys to the .env file (or edit with your editor)
echo "OPENAI_API_KEY=your-openai-key-here" >> .env
echo "SERPER_API_KEY=your-serper-key-here" >> .env

# Recommended (Makefile)
make up

# Alternative (Compose v2)
docker compose up --build

# Backend: http://localhost:8000 (docs at /docs)
# Frontend: http://localhost:3000

# Test with curl
curl -X POST "http://localhost:8000/api/generate" \
  -H "Content-Type: application/json" \
  -d '{"topic": "AI in Healthcare 2025"}'
```

Open your browser to http://localhost:3000 to use the web interface for content generation.

🎉 Success! You now have a professional AI content generation system running locally.
```text
User Input (Topic)
        ↓
🔍 Research Agent  → Uses web search to gather real-time information, statistics, and trends
        ↓
🎯 Strategy Agent  → Creates content strategy and structure
        ↓
✍️ Writer Agent    → Generates final polished content
        ↓
Generated Content (Markdown)
```
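The flow above can be sketched, framework-free, as three plain functions so the hand-off between stages is explicit. In the real project each stage is a CrewAI agent defined in `src/my_mas/crew.py`; the function bodies here are illustrative stand-ins for the agents' work, not the actual agent logic:

```python
def research(topic: str) -> dict:
    """Stage 1: gather raw findings (the real agent uses SerperDevTool web search)."""
    return {"topic": topic, "findings": [f"Key fact about {topic}"]}

def strategize(research_output: dict) -> dict:
    """Stage 2: turn the findings into a content outline."""
    return {**research_output, "outline": ["Introduction", "Findings", "Conclusion"]}

def write(strategy_output: dict) -> str:
    """Stage 3: render the outline into Markdown."""
    lines = [f"# {strategy_output['topic']}"]
    for section in strategy_output["outline"]:
        lines.append(f"## {section}")
    return "\n".join(lines)

def run_pipeline(topic: str) -> str:
    """Each stage consumes the previous stage's output, exactly as in the diagram."""
    return write(strategize(research(topic)))
```

The key design point the sketch preserves is that the pipeline is sequential: each agent's output becomes the next agent's input.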
Backend:
- CrewAI 0.177.0: Multi-agent orchestration framework
- SerperDevTool: Free web search API integration
- FastAPI: Modern Python web framework with async support
- Uvicorn: ASGI server for production deployment
- WebSockets: Real-time console output streaming
- Pydantic: Data validation and serialization
Frontend:
- React 18: Modern JavaScript UI library
- TypeScript: Type-safe JavaScript for better development
- Real-time Updates: WebSocket integration for live progress
- Responsive Design: Works on desktop and mobile
DevOps:
- Docker: Containerized deployment
- Docker Compose: Multi-service orchestration
- Environment Management: Secure API key handling
```text
content-generator/
├── README.md               # This file
├── docker-compose.yml      # Container orchestration
├── Dockerfile              # Python backend container
├── pyproject.toml          # Python dependencies (CrewAI 0.177.0)
├── .env.example            # Environment template
│
├── src/my_mas/             # CrewAI backend
│   ├── crew.py             # Agent and task definitions
│   ├── web_api.py          # FastAPI web server
│   ├── main.py             # CLI entry point
│   └── config/             # YAML configurations
│       ├── agents.yaml     # Agent definitions
│       └── tasks.yaml      # Task definitions
│
├── frontend/               # React TypeScript UI
│   ├── package.json        # Node.js dependencies
│   ├── src/
│   │   ├── App.tsx         # Main React application
│   │   ├── App.css         # Styling
│   │   └── components/     # React components
│   │       ├── ContentForm.tsx     # Input form
│   │       ├── LiveConsole.tsx     # Real-time output
│   │       └── ResultsDisplay.tsx  # Generated content
│   └── Dockerfile          # Frontend container
│
└── generated_content/      # Output directory
```
## For Course Participants: Learning Objectives
- Understand CrewAI agent coordination patterns
- Learn agent role definition and task assignment
- Master Docker-based development workflow
- Build professional API endpoints with FastAPI
- Create TypeScript React applications
- Implement real-time WebSocket communication
- Design responsive, professional user interfaces
- Handle asynchronous API interactions
- Multi-Agent AI Systems: Design and coordinate specialized AI agents
- Full-Stack Development: Python backend + TypeScript frontend
- API Design: RESTful services with WebSocket real-time updates
- Professional Deployment: Docker containerization and orchestration
- Production Patterns: Error handling, logging, and monitoring
- Change Agent Personalities: Edit `config/agents.yaml`
- Modify Tasks: Update `config/tasks.yaml`
- Add New Topics: Use the web interface
- Adjust Styling: Modify `frontend/src/App.css`
```python
# Add a new agent to crew.py
@agent
def editor(self) -> Agent:
    return Agent(
        config=self.agents_config['editor'],
        verbose=True,
    )
```

- Add New Tools: Integrate web search, databases, APIs
- Custom Processing: Add specialized content types
- Advanced UI: Add drag-and-drop, file uploads, previews
- Integration: Connect to CMS, social media, email systems
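A new agent declared in `crew.py` also needs a matching entry in `config/agents.yaml`. CrewAI's YAML agent config conventionally uses `role`, `goal`, and `backstory` keys; the exact wording below (and the editor's persona) is an illustrative assumption, so check the existing entries in this repo's `config/agents.yaml` for the schema it actually uses:

```yaml
# config/agents.yaml -- hypothetical entry backing self.agents_config['editor']
editor:
  role: >
    Senior Content Editor
  goal: >
    Polish the draft for clarity, tone, and grammatical correctness
  backstory: >
    A meticulous editor with years of publishing experience who
    refuses to let awkward phrasing reach readers.
```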
```text
POST /api/generate
Content-Type: application/json

{
  "topic": "AI in Healthcare this year",
  "agents": { ... },   // Optional agent customization
  "tasks": { ... }     // Optional task customization
}

Response:
{
  "job_id": "uuid",
  "status": "pending"
}
```

```text
GET /api/status/{job_id}

Response:
{
  "job_id": "uuid",
  "status": "completed|running|error",
  "created_at": "2026-01-15T10:30:00Z",
  "result": "Generated content..."
}
```

```text
GET /api/result/{job_id}

Response:
{
  "job_id": "uuid",
  "topic": "AI in Healthcare 2026",
  "result": "# AI in Healthcare 2026\n\n..."
}
```

```typescript
// WebSocket connection for live updates
const ws = new WebSocket('ws://localhost:8000/ws/console/{job_id}');
ws.onmessage = (event) => {
  const message = JSON.parse(event.data);
  console.log(message.message); // Agent progress updates
};
```

```bash
# Full stack (backend + frontend) - DEFAULT
docker compose up --build

# Backend only (if you only need the API)
docker compose up content-generator

# Frontend only (requires backend running separately)
docker compose up frontend
```

```bash
# Build production images
docker compose -f docker-compose.prod.yml build

# Deploy to cloud (Google Cloud Run example)
gcloud run deploy content-generator \
  --source . \
  --region us-central1 \
  --set-env-vars OPENAI_API_KEY=your-key
```

```bash
# Required
OPENAI_API_KEY=your-openai-api-key    # Get from https://platform.openai.com/api-keys
SERPER_API_KEY=your-serper-api-key    # Get FREE key from https://serper.dev/

# Optional
ANTHROPIC_API_KEY=your-anthropic-key  # Alternative AI model
DEBUG=true                            # Development mode
CORS_ORIGINS=http://localhost:3000    # Frontend URL
```

**Problem**: `ImportError: No module named 'crewai'`
**Solution**: Ensure you're using the latest version (note the quotes, which keep the shell from treating `>=` as a redirect):

```bash
pip install --upgrade 'crewai[tools]>=0.177.0'
```

**Problem**: WebSocket connection failed
**Solution**: Check that the backend is running on port 8000:

```bash
curl http://localhost:8000/  # Should return API info
```

**Problem**: Agent execution timeout or search failed
**Solution**: Check your API key configuration:

```bash
# Verify the .env file contains valid keys
grep OPENAI_API_KEY .env
grep SERPER_API_KEY .env
```

**Problem**: `SerperDevTool` authentication failed
**Solution**: Get a FREE Serper API key:

- Visit https://serper.dev/
- Sign up for a free account (100 searches/month)
- Copy your API key
- Add it to your `.env` file: `SERPER_API_KEY=your-key-here`
**Problem**: Docker build fails
**Solution**: Clean the Docker cache and rebuild:

```bash
docker system prune -f
docker compose build --no-cache
```

- Model Choice: Pick an AI model that balances cost and quality (e.g., GPT-4 for higher quality, GPT-3.5-turbo for lower cost)
- Caching: Implement Redis for agent results
- Scaling: Use multiple worker processes with Gunicorn
- Monitoring: Add logging and metrics collection
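As a hedged illustration of the caching bullet above, generated results can be memoized by topic so repeat requests skip the slow, paid agent run. A plain dict stands in for the shared Redis store; `ResultCache` and its methods are names invented for this sketch, not part of the project:

```python
import hashlib

class ResultCache:
    """Memoizes generated content by normalized topic.

    The dict below stands in for a shared store; in production the same
    interface could wrap a Redis client so multiple workers share results.
    """

    def __init__(self):
        self._store = {}

    @staticmethod
    def key(topic: str) -> str:
        # Normalize so "AI in Healthcare" and "  ai in healthcare " collide.
        return hashlib.sha256(topic.strip().lower().encode()).hexdigest()

    def get_or_generate(self, topic: str, generate) -> str:
        k = self.key(topic)
        if k not in self._store:
            self._store[k] = generate(topic)  # only runs on a cache miss
        return self._store[k]
```

Hashing the normalized topic keeps keys short and uniform, which matters once the store is a real key-value server rather than an in-process dict.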
- Asynchronous API: The content generation process is asynchronous. A `POST` request to `/api/generate` starts the job and returns a `job_id`. Progress can be monitored via `docker compose logs`, and the final article must be retrieved with a `GET` request to `/api/result/{job_id}`.
- Content Storage: Generated content is not automatically saved to the `generated_content/` directory on the filesystem; it is only available via the API endpoint. The content was manually saved to `generated_content/AI_in_Healthcare_2025.md` after being retrieved from the API.
- Initial Build Time: The first time you run `docker compose up --build`, it can take several minutes to download and install all the Python dependencies. Subsequent builds are much faster.
- Environment Variables: Ensure your `.env` file is correctly set up with valid `OPENAI_API_KEY` and `SERPER_API_KEY` values before starting the services.
- Official CrewAI Docs - Complete framework documentation
- Agent Configuration - How to define agents
- Task Management - Task creation and coordination
- FastAPI Tutorial - Web framework fundamentals
- WebSocket Guide - Real-time communication
- React TypeScript Guide - Frontend development
- WebSocket Client - Real-time frontend
- Docker Compose Guide - Multi-container deployment
- Production Deployment - Scaling and monitoring
- Customize Agents: Modify roles and tasks for your use case
- Test Different Topics: Explore various content generation scenarios
- UI Enhancements: Add new features to the frontend interface
- Multi-Format Output: Generate blogs, social media, emails
- Content Templates: Save and reuse successful configurations
- Integration: Connect to content management systems
- A/B Testing: Compare different agent approaches
- User Authentication: Multi-user content generation
- Content Scheduling: Automated publishing workflows
- Analytics: Track content performance and optimization
- Enterprise Integration: API keys, team management, billing
By completing this project, you will have:
- ✅ Built a professional multi-agent AI system using CrewAI 0.177.0
- ✅ Created a full-stack web application with Python + TypeScript
- ✅ Implemented real-time communication using WebSockets
- ✅ Deployed using industry-standard tools (Docker, containerization)
- ✅ Gained experience with production patterns (APIs, error handling, logging)
Portfolio Ready: This project demonstrates advanced AI development skills that employers value highly.
🚀 Ready to start building amazing AI-powered content systems? Let's create something incredible together!
This project is part of the CSTU Multi-Agent Systems course. Feel free to customize and extend it for your own projects!
Instructor: Ping Wu
Course: Multi-Agent Systems - November 2025
Institution: California Science and Technology University