Solutions for common issues with the RagCode MCP server.
| Problem | Quick Solution |
|---|---|
| "Could not connect to Qdrant" | docker start ragcode-qdrant |
| "Ollama model not found" | ollama pull phi3:medium && ollama pull mxbai-embed-large |
| IDE doesn't see RagCode | Re-run ./ragcode-installer -skip-build |
| Indexing stuck | Check logs: tail -f ~/.local/share/ragcode/bin/mcp.log |
**Cause:** The `file_path` parameter is missing or points outside a recognized project.

**Solution:** Provide a valid `file_path` inside your project:

```json
{
  "query": "search query",
  "file_path": "/path/to/your/project/file.go"
}
```

**Why this happens:** RagCode uses `file_path` to detect which workspace you're working in. Without it, it defaults to `/home`, which is not a valid project.
**Cause:** Docker is not running or the Qdrant container is stopped.

**Solution:**

```bash
# Start Docker (Linux)
sudo systemctl start docker

# Start Qdrant container
docker start ragcode-qdrant

# Or restart everything
~/.local/share/ragcode/start.sh
```

**Verify:**

```bash
docker ps | grep qdrant
# Should show: ragcode-qdrant ... Up ...
```

**Cause:** Required AI models have not been downloaded.
**Solution:**

```bash
# Download embedding model
ollama pull mxbai-embed-large

# Download LLM model
ollama pull phi3:medium

# Verify
ollama list
```

If using Docker Ollama:

```bash
docker exec ragcode-ollama ollama pull mxbai-embed-large
docker exec ragcode-ollama ollama pull phi3:medium
```

**Causes:**
- Large workspace with many files
- Heavy LLM model
- Insufficient RAM/CPU
**Solutions:**

- Use a smaller model:

  ```yaml
  # In config.yaml
  llm:
    model: "phi3:mini"  # Instead of phi3:medium
  ```

- Exclude large directories:

  ```yaml
  workspace:
    exclude_patterns:
      - "vendor"
      - "node_modules"
      - ".git"
      - "dist"
      - "build"
      - "*.min.js"
      - "*.bundle.js"
  ```

- Wait for background indexing:

  - Indexing runs in the background
  - First query may be slow; subsequent queries are fast
  - Check progress:

    ```bash
    tail -f ~/.local/share/ragcode/bin/mcp.log
    ```
**Cause:** MCP configuration file is missing or incorrect.

**Solution 1:** Re-run the installer:

```bash
~/.local/share/ragcode/bin/ragcode-installer -skip-build -ollama=local -qdrant=docker
```

**Solution 2:** Manual configuration
Check that your IDE's config file exists and has the correct content:

| IDE | Config Path |
|---|---|
| Windsurf | `~/.codeium/windsurf/mcp_config.json` |
| Cursor | `~/.cursor/mcp.config.json` |
| VS Code | `~/.config/Code/User/globalStorage/mcp-servers.json` |
| Claude Desktop | `~/.config/Claude/mcp-servers.json` |
See IDE-SETUP.md for complete configuration examples.
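As a sketch, a minimal entry follows the same shape as the WSL example later in this guide; the binary path is an assumption — substitute your own install location:

```json
{
  "mcpServers": {
    "ragcode": {
      "command": "/home/YOUR_USERNAME/.local/share/ragcode/bin/rag-code-mcp",
      "args": []
    }
  }
}
```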
**Cause:** Ollama embedding model is not responding correctly.

**Solution:**

```bash
# Restart Ollama
docker restart ragcode-ollama
# or
systemctl restart ollama

# Test embedding model
curl http://localhost:11434/api/embeddings -d '{
  "model": "mxbai-embed-large",
  "prompt": "test"
}'
```

**Cause:** Large models or multiple workspaces indexed.
**Solutions:**

- Use smaller models: `phi3:mini` instead of `phi3:medium`; `all-minilm` instead of `mxbai-embed-large`

- Limit concurrent operations:
  - Index one workspace at a time
  - Close unused IDE windows

- Increase swap (Linux):

  ```bash
  sudo fallocate -l 8G /swapfile
  sudo chmod 600 /swapfile
  sudo mkswap /swapfile
  sudo swapon /swapfile
  ```
**Cause:** User is not in the docker group.

**Solution (Linux):**

```bash
sudo usermod -aG docker $USER
# Log out and log back in
```

**Solution (macOS):**

- Ensure Docker Desktop is running
- Check Docker Desktop settings → Resources
**Causes:**

- Workspace not indexed yet
- Wrong language filter
- Query too specific

**Solutions:**

- Check if indexed:

  ```bash
  # Look for collection in Qdrant
  curl http://localhost:6333/collections | jq
  ```

- Force re-index: use the `index_workspace` tool with the `file_path` parameter

- Try a broader query:
  - Instead of the exact function name, describe what it does
  - Use `search_code` before `hybrid_search`
**Problem:** Windows IDE can't find the WSL binary.

**Solution:** Use a `wsl.exe` wrapper in the config:

```json
{
  "mcpServers": {
    "ragcode": {
      "command": "wsl.exe",
      "args": ["-e", "/home/YOUR_USERNAME/.local/share/ragcode/bin/rag-code-mcp"]
    }
  }
}
```

**Problem:** "Cannot connect to Docker daemon"
**Solution:**

- Start the Docker Desktop application
- Wait for it to fully initialize (check the system tray icon)
- Run the installer again
```bash
# Check RagCode version
~/.local/share/ragcode/bin/rag-code-mcp --version

# Health check
~/.local/share/ragcode/bin/rag-code-mcp --health

# Check Docker containers
docker ps | grep ragcode

# Check Ollama models
ollama list

# Check Qdrant collections
curl http://localhost:6333/collections | jq

# View logs
tail -100 ~/.local/share/ragcode/bin/mcp.log

# Test Ollama connection
curl http://localhost:11434/api/tags

# Test Qdrant connection
curl http://localhost:6333/collections
```

If your issue isn't listed here:
- Check logs: `tail -f ~/.local/share/ragcode/bin/mcp.log`
- Search issues: GitHub Issues
- Open a new issue: include logs, OS, and steps to reproduce
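The connection checks above can also be run as one script. Below is a sketch, not part of RagCode itself; it assumes only the endpoints used elsewhere in this guide (Ollama on `localhost:11434`, Qdrant on `localhost:6333`):

```python
#!/usr/bin/env python3
"""Quick HTTP probe for RagCode's backing services (illustrative sketch)."""
import json
import urllib.error
import urllib.request


def probe(name, url, timeout=2.0):
    """Return (service name, reachable?, detail) for one HTTP endpoint."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = json.loads(resp.read().decode())
            keys = sorted(body) if isinstance(body, dict) else []
            return (name, True, f"HTTP {resp.status}, keys: {keys}")
    except (urllib.error.URLError, OSError, ValueError) as exc:
        return (name, False, str(exc))


def report(results):
    """Format probe results as one 'OK/FAIL service: detail' line each."""
    return "\n".join(
        f"{'OK  ' if ok else 'FAIL'} {name}: {detail}"
        for name, ok, detail in results
    )


if __name__ == "__main__":
    checks = [
        ("ollama", "http://localhost:11434/api/tags"),
        ("qdrant", "http://localhost:6333/collections"),
    ]
    print(report([probe(name, url) for name, url in checks]))
```

A `FAIL` line for either service points you at the matching section above (restart the container, pull the models, or start Docker).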
- Quick Start - Installation guide
- Configuration - Settings and options
- IDE Setup - Manual IDE configuration