TL;DR: AI coding assistants increase throughput but often degrade stability. Without codebase context, they generate code that works but violates team conventions and architectural rules. This MCP provides structured pattern data and recorded rationale so agents produce code that fits.
AI drastically increases Throughput (more code/hour) but often kills Stability (more bugs/rework).
| Pain Point | Evidence |
|---|---|
| "AI doesn't know my codebase" | 64.7% of developers cite lack of codebase context as top AI challenge (Stack Overflow 2024) |
| "Vibe coding" = Tech Debt | Code churn doubled, rework increased. AI writes "working" code that breaks architectural rules (GitClear 2024) |
| The "Mirror Problem" | Semantic search just finds similar code. If 80% of your code is legacy/deprecated, AI will copy it. The tool becomes a mirror reflecting your bad habits. |
| Trust gap | Only 29% of developers trust AI output. Teams spend more time reviewing AI code than writing it. |
| Tool Category | What They Do | The Gap |
|---|---|---|
| AGENTS.md / .cursorrules | Static instructions (Intent) | Can't handle migration states (e.g., "Use A for new, B for old"). Static = brittle. |
| Semantic Search (RAG) | Finds relevant text | Blind to quality. Can't distinguish "High Churn Hotspot" from "Stable Core". |
| Linters | Complain after coding | Don't guide during generation. |
This MCP provides active context: not raw data, but structured intelligence derived from the actual state of the codebase.
- Frequency Detection: "97% use `inject()`, 3% use `constructor`." (Consensus)
- Internal Library Support: "Use `@company/button`, not `p-button`." (Wrapper Detection)
- Golden Files: "Here is the best example of a Service, not just any example."
- Pattern Momentum: "Use `Signals` (Rising), avoid `BehaviorSubject` (Declining)."
- Health Context: "⚠️ Careful, `UserService.ts` is a high-churn hotspot with circular dependencies. Add tests."
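The signals above could be bundled into one structured response per pattern. A minimal TypeScript sketch of what such a payload might look like (the `PatternContext` shape, field names, and example values are illustrative assumptions, not the MCP's actual schema):

```typescript
// Hypothetical shape of the structured context the MCP could return.
// All names and values here are illustrative assumptions, not the real schema.
type Momentum = "rising" | "stable" | "declining";

interface PatternContext {
  pattern: string;           // e.g. "inject()"
  usageShare: number;        // 0..1, fraction of call sites using this pattern
  momentum: Momentum;        // trend direction, not just raw frequency
  goldenFile?: string;       // best-practice example, not just any match
  healthWarnings: string[];  // churn hotspots, circular deps, etc.
}

const example: PatternContext = {
  pattern: "inject()",
  usageShare: 0.97,
  momentum: "rising",
  goldenFile: "src/app/services/user.service.ts",
  healthWarnings: [
    "UserService.ts is a high-churn hotspot with circular dependencies",
  ],
};

// An agent can turn this into a directive rather than a pile of search hits:
function toGuidance(ctx: PatternContext): string {
  const share = Math.round(ctx.usageShare * 100);
  return `Use ${ctx.pattern} (${share}% of call sites, ${ctx.momentum}).`;
}

console.log(toGuidance(example)); // "Use inject() (97% of call sites, rising)."
```

Returning a decision-ready summary like `toGuidance` produces, alongside the raw fields, is what separates this from plain semantic search: the agent gets "Use X" plus the evidence, not just matching snippets.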
- AGENTS.md defines intent: "Use functional patterns."
- MCP provides evidence: "Here are the 5 most recent functional patterns actually used."
| Limitation | Mitigation |
|---|---|
| Pattern frequency ≠ pattern quality | Pattern Momentum (Rise/Fall trends) distinguishes adoption direction from raw count. |
| Stale index risk | The index only reflects the last scan; manual re-indexing is required for now. |
| Framework coverage | Deep analysis currently exists for Angular only; a generic analyzer covers 30+ languages. Specialized React/Vue analyzers can be added through the same extension point. |
| File-level trend detection | Trend is based on file modification date, not line-by-line content. A recently modified file may still contain legacy patterns on specific lines. Future: AST-based line-level detection. |
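The file-level trend detection described in the last row can be approximated from modification timestamps alone. A minimal sketch of how that classification might work (the thresholds, labels, and comparison rule are illustrative assumptions, not the tool's actual logic):

```typescript
// Sketch of momentum classification from file modification dates only.
// Thresholds and labels are illustrative assumptions.
type Trend = "rising" | "stable" | "declining";

/**
 * Compare a pattern's share of recently modified files against its share
 * of older files. All timestamps are epoch milliseconds; `cutoff` is the
 * "recent" boundary (e.g. Date.now() minus 90 days).
 */
function classifyMomentum(
  filesUsingPattern: number[], // mtimes of files containing the pattern
  allFiles: number[],          // mtimes of all analyzed files
  cutoff: number
): Trend {
  const share = (recent: boolean): number => {
    const total = allFiles.filter((t) => (t >= cutoff) === recent).length;
    const hits = filesUsingPattern.filter((t) => (t >= cutoff) === recent).length;
    return total === 0 ? 0 : hits / total;
  };
  const recentShare = share(true);
  const olderShare = share(false);
  if (recentShare > olderShare * 1.25) return "rising";
  if (recentShare < olderShare * 0.75) return "declining";
  return "stable";
}
```

This also makes the limitation concrete: a file touched yesterday counts entirely as "recent" even if only one unrelated line changed, which is exactly the blind spot that AST-based line-level detection would close.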
- Context alone is dangerous: Giving AI "all the context" just confuses it or teaches it bad habits (Search Contamination).
- Decisions > Data: AI needs guidance ("Use X"), not just options ("Here is X and Y").
- Governance through Discovery: Blocking PRs is not required. If the AI sees that a pattern is "Declining" and "Dangerous," it self-corrects.
- Stack Overflow 2024 Developer Survey
- GitClear 2024 AI Code Quality Report (The "Churn" problem)
- DORA State of DevOps 2024 (Stability vs Throughput)
- Search Contamination: Without MCP, models copied legacy patterns 40% of the time.
- Momentum Success: With "Trending" signals, models adopted modern patterns even when they were the minority (3%).