Sohamactive/Devops-copilot
DevOps Copilot

DevOps Copilot is an AI-powered assistant designed to analyze local codebases and provide specialized DevOps guidance.

Unlike a general-purpose chatbot, this tool uses multi-agent orchestration (via LangGraph) to route user requests to specific specialist agents based on the intent of the query and the actual structure of the provided code.

Core Functionality

The application allows a user to provide a path to a local codebase and ask for specific DevOps deliverables. The system then:

  • Analyzes the code: Uses an AST-based analyzer to scan the codebase and extract context.
  • Routes the request: A router determines which specialist agent is best suited for the task.
  • Generates output: The selected agent uses a Google Gemini model to produce a tailored result.
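The routing step can be illustrated with a deliberately simplified sketch. The real router is LangGraph-driven and uses an LLM to classify intent; the keyword matching below is purely illustrative, and the agent identifiers are hypothetical names mirroring the roles listed in this README:

```python
# Illustrative router sketch: the actual implementation (backend/app/agents/)
# uses LangGraph and an LLM classifier, not keyword matching.
AGENT_KEYWORDS = {
    "dockerfile_agent": ["dockerfile", "container", "docker"],
    "test_case_agent": ["test", "coverage"],
    "bundle_size_agent": ["bundle", "dependency size"],
    "production_agent": ["production", "deploy", "runtime risk"],
}

def route(query: str) -> str:
    """Return the specialist agent best matching the query, else the general agent."""
    q = query.lower()
    for agent, keywords in AGENT_KEYWORDS.items():
        if any(k in q for k in keywords):
            return agent
    return "general_agent"

print(route("Optimize my Dockerfile for smaller images"))  # dockerfile_agent
print(route("Explain blue-green deployments"))             # production_agent
```

Queries matching no specialist fall through to the general agent, which mirrors how the 💬 General Agent acts as the catch-all role.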

Specialist Agents

The repository implements five distinct agent roles:

  • 🐳 Dockerfile Agent: Generate or optimize Dockerfiles and container workflows.
  • 🧪 Test Case Agent: Inspect code paths and suggest test scenarios and coverage.
  • 📦 Bundle Size Agent: Review frontend/backend dependencies and suggest bundle-size improvements.
  • 🔒 Production Agent: Find production-readiness issues, runtime risks, and deployment gaps.
  • 💬 General Agent: Answer general DevOps questions about CI/CD, cloud, infrastructure, and deployment.

Architecture

Backend

  • FastAPI: HTTP API exposing /api/v1 routes, including an SSE streaming endpoint at /api/v1/analyze/stream.
  • SQLite: Local chat and conversation persistence via backend/data/devops_chatbot.db.
  • LangGraph & LangChain: Orchestrates router and agents for intent-aware request handling.
  • Google Gemini: Cloud LLM runtime support via GOOGLE_API_KEY.
  • LM Studio / edge runtime: Supports local LM Studio edge model selection with EDGE_MODEL_NAME.
  • AST analyzer: backend/app/tools/code_analyzer.py parses Python AST and summarizes source trees for other languages.
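The idea behind the AST analyzer can be sketched with Python's standard-library ast module. This mirrors the approach of backend/app/tools/code_analyzer.py, not its actual code; the function name and summary shape here are illustrative:

```python
import ast

def summarize_python_source(source: str) -> dict:
    """Collect function, class, and import names from Python source via its AST."""
    tree = ast.parse(source)
    summary = {"functions": [], "classes": [], "imports": []}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            summary["functions"].append(node.name)
        elif isinstance(node, ast.ClassDef):
            summary["classes"].append(node.name)
        elif isinstance(node, ast.Import):
            summary["imports"].extend(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom):
            summary["imports"].append(node.module or "")
    return summary

code = "import os\n\nclass App:\n    def run(self):\n        pass\n"
print(summarize_python_source(code))
```

A summary like this gives the specialist agents structural context (what functions and dependencies exist) without sending every source file verbatim to the LLM.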

Frontend

  • React 19 + TypeScript: Interactive chat UI.
  • Vite: Development server and production build.
  • Tailwind CSS v4 + DaisyUI: UI styling and components.
  • Proxy: Frontend dev server proxies /api requests to backend port 8000.

Setup

Backend

  1. cd backend
  2. pip install -e . (or install the dependencies listed in pyproject.toml)
  3. Create .env with values below.
  4. uv run main.py or python main.py

Frontend

  1. cd frontend
  2. npm install
  3. npm run dev

Configuration

Create backend/.env with:

GOOGLE_API_KEY=your-google-api-key
CLOUD_MODEL_NAME=gemini-2.0-flash
EDGE_MODEL_NAME=your-lmstudio-model-name
LMSTUDIO_BASE_URL=http://127.0.0.1:1234/v1
LMSTUDIO_API_KEY=lm-studio
SQLITE_DB_PATH=./data/devops_chatbot.db

  • GOOGLE_API_KEY: required for cloud model runtime.
  • EDGE_MODEL_NAME: selects LM Studio / edge model.
  • LMSTUDIO_BASE_URL / LMSTUDIO_API_KEY: LM Studio OpenAI-compatible host.
  • SQLITE_DB_PATH: local conversation store.

API highlights

  • GET /health — health check.
  • GET /api/v1/runtime/edge/status — edge runtime status.
  • POST /api/v1/analyze/stream — stream analysis results via SSE.
  • GET /api/v1/conversations — list saved chats.
  • POST /api/v1/conversations — create chat.
  • GET /api/v1/conversations/{id}/messages — load messages.
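Clients of POST /api/v1/analyze/stream receive Server-Sent Events. A minimal parser for the SSE wire format is sketched below; the field handling follows the SSE specification, but the exact event payloads this backend emits are assumptions for illustration:

```python
def parse_sse(stream_text: str) -> list[str]:
    """Split raw SSE text into one data payload per event.

    Per the SSE format, events are separated by a blank line and each
    'data:' line contributes one line to the current event's payload.
    """
    events, data_lines = [], []
    for line in stream_text.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:  # blank line terminates the event
            events.append("\n".join(data_lines))
            data_lines = []
    if data_lines:  # flush a trailing event with no final blank line
        events.append("\n".join(data_lines))
    return events

raw = "data: analyzing backend/\n\ndata: route=dockerfile_agent\n\n"
print(parse_sse(raw))  # ['analyzing backend/', 'route=dockerfile_agent']
```

In practice a browser client would use EventSource or a fetch-based reader instead of parsing raw text, but the framing is the same.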

Supported analysis inputs

The backend analyzer can inspect:

  • source files: .py, .js, .ts, .jsx, .tsx, .go, .java, .rs, .rb, .php, .c, .cpp, .h, .hpp, .swift, .kt
  • dependency files: pyproject.toml, package.json, requirements.txt, go.mod, Cargo.toml, pom.xml, build.gradle, etc.
  • config files: Dockerfile, .dockerignore, .gitignore, tsconfig.json, vite.config.ts, and common app config files.
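The file-type buckets above can be sketched as a simple classifier. The sets below are abridged from the lists in this section, and the real scanner in backend/app/tools/ likely covers more cases:

```python
from pathlib import Path

SOURCE_EXTS = {".py", ".js", ".ts", ".jsx", ".tsx", ".go", ".java", ".rs",
               ".rb", ".php", ".c", ".cpp", ".h", ".hpp", ".swift", ".kt"}
DEPENDENCY_FILES = {"pyproject.toml", "package.json", "requirements.txt",
                    "go.mod", "Cargo.toml", "pom.xml", "build.gradle"}
CONFIG_FILES = {"Dockerfile", ".dockerignore", ".gitignore",
                "tsconfig.json", "vite.config.ts"}

def classify(path: str) -> str:
    """Bucket a file path as 'source', 'dependency', 'config', or 'other'."""
    p = Path(path)
    if p.name in DEPENDENCY_FILES:
        return "dependency"
    if p.name in CONFIG_FILES:
        return "config"
    if p.suffix in SOURCE_EXTS:
        return "source"
    return "other"

print(classify("backend/app/main.py"))    # source
print(classify("frontend/package.json"))  # dependency
```

Checking exact filenames before extensions matters: package.json would otherwise look like a generic .json config file rather than a dependency manifest.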

Project structure

  • backend/: backend API, agents, prompts, storage, and tooling.
    • backend/app/agents/: agent nodes and router logic.
    • backend/app/api/: API routes and SSE streaming.
    • backend/app/llm/: model runtime and edge status checks.
    • backend/app/prompts/: prompt library and skill persona definitions.
    • backend/app/tools/: codebase analyzer and file scanning.
    • backend/app/storage/: SQLite chat storage.
  • frontend/: React chat UI and build configuration.
    • frontend/src/: UI components, hooks, API utilities, and types.

Notes

  • Frontend dev mode proxies API requests to http://localhost:8000.
  • Backend logs startup warnings when cloud or edge runtime config is incomplete.
  • Chat history is stored locally, so restarting the backend preserves conversations unless the DB file is removed.