@@ -17,57 +17,54 @@ RagCode is an MCP (Model Context Protocol) server that allows you to navigate an
1717
1818## ⚡ Quick Install (1 Command)
1919
20- ### Option 1: Install Script (Recommended)
20+ ### Option 1: ` ragcode-installer ` (Recommended)
2121
2222``` bash
23- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | bash
23+ curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer \
24+ && chmod +x ragcode-installer \
25+ && ./ragcode-installer -ollama=docker -qdrant=docker
2426```
2527
26- The installer will:
27- 1 . ✅ Download the latest release from GitHub (or build locally if download fails)
28- 2 . ✅ Install binaries to ` ~/.local/share/ragcode/bin `
29- 3 . ✅ Add ` rag-code-mcp ` to PATH (in ` .bashrc ` or ` .zshrc ` )
30- 4 . ✅ Configures Windsurf, Cursor, Antigravity, and VS Code automatically (in ` mcp_config.json ` )
31- 5 . ✅ ** Starts Docker** (if not already running)
32- 6 . ✅ ** Starts Qdrant container** (vector database)
33- 7 . ✅ ** Starts Ollama** with ` ollama serve ` (if not already running)
34- 8 . ✅ ** Downloads required AI models** :
35- - ` nomic-embed-text ` (~ 274 MB) - for embeddings
36- - ` phi3:medium ` (~ 7.9 GB) - for LLM
37- 9 . ✅ Starts MCP server in background
28+ On Windows (PowerShell):
29+ ``` powershell
30+ Invoke-WebRequest -Uri "https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-windows.exe" -OutFile "ragcode-installer.exe"
31+ ./ragcode-installer.exe -ollama docker -qdrant docker
32+ ```
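The Linux/macOS command above builds the release asset name from `uname -s`. A minimal sketch of the same transformation (the hard-coded `kernel` value stands in for real `uname -s` output):

```shell
# uname -s prints the kernel name ("Linux", "Darwin"); the release asset
# suffixes its lowercased form: ragcode-installer-linux / ragcode-installer-darwin.
kernel="Linux"   # stand-in for: kernel="$(uname -s)"
asset="ragcode-installer-$(printf '%s' "$kernel" | tr '[:upper:]' '[:lower:]')"
echo "$asset"    # -> ragcode-installer-linux
```

On macOS the same pipeline yields `ragcode-installer-darwin`, matching the download URL used above.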
3833
39- ** Environment Variables (Optional):**
34+ The installer handles the full setup end-to-end:
35+ 1 . ✅ Installs the ` rag-code-mcp ` and ` index-all ` binaries into ` ~/.local/share/ragcode/bin `
36+ 2 . ✅ Configures all MCP-capable IDEs (Windsurf, Cursor, Claude, VS Code, etc.)
37+ 3 . ✅ Starts Docker when needed and launches the Ollama + Qdrant containers
38+ 4 . ✅ Downloads the required models (` phi3:medium ` & ` nomic-embed-text ` )
39+ 5 . ✅ Runs a health-check and starts the MCP server
4040
41- You can customize the installation by setting environment variables before running the script:
41+ ** Helpful CLI flag combinations: **
4242
4343``` bash
44- # Use development branch instead of main
45- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/develop/quick-install.sh | BRANCH=develop bash
46-
47- # Custom Ollama model
48- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | OLLAMA_MODEL=llama3.1:8b bash
44+ # Everything in Docker (default)
45+ ./ragcode-installer -ollama=docker -qdrant=docker
4946
50- # Custom embedding model
51- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | OLLAMA_EMBED=all-minilm bash
47+ # Keep Ollama local, run only Qdrant in Docker
48+ ./ragcode-installer -ollama=local -qdrant=docker
5249
53- # Custom Ollama URL (if running remotely)
54- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | OLLAMA_BASE_URL=http://192.168.1.100:11434 bash
50+ # Use existing remote services
51+ ./ragcode-installer -ollama=local -qdrant=remote -skip-build
5552
56- # Custom Qdrant URL
57- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | QDRANT_URL=http://192.168.1.100:6333 bash
53+ # Mount a custom Ollama models directory for Docker
54+ ./ragcode-installer -ollama=docker -models-dir=$HOME/.ollama
5855
59- # Combine multiple variables
60- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/develop/quick-install.sh | BRANCH=develop OLLAMA_MODEL=phi3:mini bash
56+ # Enable GPU support for the Ollama container
57+ ./ragcode-installer -ollama=docker -qdrant=docker -gpu
6158```
6259
63- ** Available Environment Variables: **
64- - ` BRANCH ` - Git branch to install from (default: ` main ` )
65- - ` OLLAMA_MODEL ` - LLM model name (default: ` phi3:medium ` )
66- - ` OLLAMA_EMBED ` - Embedding model (default: ` nomic-embed-text ` )
67- - ` OLLAMA_BASE_URL ` - Ollama server URL (default: ` http://localhost:11434 ` )
68- - ` QDRANT_URL ` - Qdrant server URL (default: ` http://localhost:6333 ` )
60+ Key flags:
61+ - ` -ollama ` : ` docker ` (default) or ` local `
62+ - ` -qdrant ` : ` docker ` (default) or ` remote `
63+ - ` -models-dir ` : host directory to mount inside the container
64+ - ` -gpu ` : passes ` --gpus=all ` to the Ollama container
65+ - ` -skip-build ` : reuse existing binaries without rebuilding
6966
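If the freshly installed binaries are not found in your current shell (the PATH change only takes effect in new shells), you can export the install prefix directly; `~/.local/share/ragcode/bin` is the location this README's installer uses:

```shell
# Prepend the RagCode install directory to PATH for this session only;
# open a new terminal (or source your shell rc file) for a permanent fix.
export PATH="$HOME/.local/share/ragcode/bin:$PATH"
command -v rag-code-mcp || echo "rag-code-mcp not on PATH (re-run the installer?)"
```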
70- ### Option 2: Local Build (For Developers )
67+ ### Option 2: Local Build (for developers )
7168
7269``` bash
7370git clone https://github.com/doITmagic/rag-code-mcp.git
@@ -116,10 +113,12 @@ brew install ollama
116113### 2. Run the Installer
117114
118115``` bash
119- curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | bash
116+ curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer
117+ chmod +x ragcode-installer
118+ ./ragcode-installer -ollama=docker -qdrant=docker
120119```
121120
122- ** Installation time:** ~ 5-10 minutes (downloads ~ 4GB of AI models )
121+ ** Installation time:** 5-10 minutes (model downloads dominate )
123122
124123### 3. Verify Installation
125124
@@ -132,10 +131,12 @@ docker ps | grep qdrant
132131ollama list
133132```
134133
135- ### 4. Start Server (Optional - starts automatically )
134+ ### 4. Health Check & Services (installer already starts them )
136135
137136``` bash
138- ~ /.local/share/ragcode/start.sh
137+ ~ /.local/share/ragcode/bin/rag-code-mcp --health
138+ docker ps | grep ragcode-qdrant
139+ docker ps | grep ragcode-ollama
139140```
140141
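Beyond `rag-code-mcp --health`, the two backing services can be probed directly over HTTP. A minimal sketch, assuming the default local ports from this README (Qdrant on 6333, Ollama on 11434); the REST paths (`/collections`, `/api/tags`) are the services' own standard endpoints, not anything RagCode-specific:

```shell
# Print UP/DOWN for a URL; curl -fsS exits non-zero on connection
# failures and on HTTP error status codes alike.
probe() {
  if curl -fsS --max-time 2 "$1" >/dev/null 2>&1; then
    echo "UP: $1"
  else
    echo "DOWN: $1"
  fi
}
probe http://localhost:6333/collections   # Qdrant REST API
probe http://localhost:11434/api/tags     # Ollama model listing
```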
141142---
@@ -223,12 +224,12 @@ RagCode integrates with **GitHub Copilot's Agent Mode** in VS Code through the M
223224
224225#### Prerequisites
225226- ** VS Code** with ** GitHub Copilot** subscription
226- - RagCode installed (via quick-install script above )
227+ - RagCode installed (via ` ragcode-installer ` )
227228- VS Code version ** 1.95+** (for MCP support)
228229
229230#### Setup
230231
231- The quick-install script automatically configures RagCode for VS Code by creating :
232+ ` ragcode-installer ` automatically configures RagCode for VS Code and creates:
232233```
233234~/.config/Code/User/globalStorage/mcp-servers.json
234235```
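The installer writes that file for you; for orientation, a hypothetical sketch of its shape, assuming the `mcpServers` layout common to MCP clients, a server entry named `ragcode`, and the binary path from this README (replace `/home/YOU` with your home directory):

```json
{
  "mcpServers": {
    "ragcode": {
      "command": "/home/YOU/.local/share/ragcode/bin/rag-code-mcp",
      "args": []
    }
  }
}
```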
@@ -468,15 +469,12 @@ workspace:
468469
469470### Error: "Could not connect to Qdrant"
470471
471- **Cause:** Docker is not running or Qdrant is stopped.
472+ **Cause:** Docker is not running or the Qdrant container is stopped.
472473
473474**Solution:**
474475```bash
475- # Start Docker
476476sudo systemctl start docker
477-
478- # Start Qdrant
479- ~/.local/share/ragcode/start.sh
477+ docker ps | grep ragcode-qdrant || docker start ragcode-qdrant
480478```
481479
482480### Error: "Ollama model not found"