Commit 611ac47 (parent 2d8e518)

docs: replace quick-install references with ragcode-installer

7 files changed (+152, -807 lines)

QUICKSTART.md (45 additions, 47 deletions)
@@ -17,57 +17,54 @@ RagCode is an MCP (Model Context Protocol) server that allows you to navigate an

 ## ⚡ Quick Install (1 Command)

-### Option 1: Install Script (Recommended)
+### Option 1: `ragcode-installer` (Recommended)

 ```bash
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | bash
+curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer \
+  && chmod +x ragcode-installer \
+  && ./ragcode-installer -ollama=docker -qdrant=docker
 ```
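The `$(uname -s | tr '[:upper:]' '[:lower:]')` substitution in the new download URL selects the release asset for your platform; a quick way to see what it expands to:

```shell
# Prints the asset name the curl command above will fetch,
# e.g. "ragcode-installer-linux" or "ragcode-installer-darwin".
suffix=$(uname -s | tr '[:upper:]' '[:lower:]')
echo "ragcode-installer-${suffix}"
```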

-The installer will:
-1. ✅ Download the latest release from GitHub (or build locally if download fails)
-2. ✅ Install binaries to `~/.local/share/ragcode/bin`
-3. ✅ Add `rag-code-mcp` to PATH (in `.bashrc` or `.zshrc`)
-4. ✅ Configure Windsurf, Cursor, Antigravity, and VS Code automatically (in `mcp_config.json`)
-5. ✅ **Start Docker** (if not already running)
-6. ✅ **Start the Qdrant container** (vector database)
-7. ✅ **Start Ollama** with `ollama serve` (if not already running)
-8. ✅ **Download the required AI models**:
-   - `nomic-embed-text` (~274 MB) - for embeddings
-   - `phi3:medium` (~7.9 GB) - for the LLM
-9. ✅ Start the MCP server in the background
+On Windows (PowerShell):
+```powershell
+Invoke-WebRequest -Uri "https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-windows.exe" -OutFile "ragcode-installer.exe"
+./ragcode-installer.exe -ollama docker -qdrant docker
+```

-**Environment Variables (Optional):**
+The installer is end-to-end:
+1. ✅ Installs the `rag-code-mcp` and `index-all` binaries into `~/.local/share/ragcode/bin`
+2. ✅ Configures all MCP-capable IDEs (Windsurf, Cursor, Claude, VS Code, etc.)
+3. ✅ Starts Docker when needed and launches the Ollama + Qdrant containers
+4. ✅ Downloads the required models (`phi3:medium` & `nomic-embed-text`)
+5. ✅ Runs a health check and starts the MCP server

-You can customize the installation by setting environment variables before running the script:
+**Helpful CLI flag combinations:**

 ```bash
-# Use the development branch instead of main
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/develop/quick-install.sh | BRANCH=develop bash
-
-# Custom Ollama model
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | OLLAMA_MODEL=llama3.1:8b bash
+# Everything in Docker (default)
+./ragcode-installer -ollama=docker -qdrant=docker

-# Custom embedding model
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | OLLAMA_EMBED=all-minilm bash
+# Keep Ollama local, run only Qdrant in Docker
+./ragcode-installer -ollama=local -qdrant=docker

-# Custom Ollama URL (if running remotely)
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | OLLAMA_BASE_URL=http://192.168.1.100:11434 bash
+# Use existing remote services
+./ragcode-installer -ollama=local -qdrant=remote -skip-build

-# Custom Qdrant URL
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | QDRANT_URL=http://192.168.1.100:6333 bash
+# Mount a custom Ollama models directory for Docker
+./ragcode-installer -ollama=docker -models-dir=$HOME/.ollama

-# Combine multiple variables
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/develop/quick-install.sh | BRANCH=develop OLLAMA_MODEL=phi3:mini bash
+# Enable GPU support for the Ollama container
+./ragcode-installer -ollama=docker -qdrant=docker -gpu
 ```

-**Available Environment Variables:**
-- `BRANCH` - Git branch to install from (default: `main`)
-- `OLLAMA_MODEL` - LLM model name (default: `phi3:medium`)
-- `OLLAMA_EMBED` - Embedding model (default: `nomic-embed-text`)
-- `OLLAMA_BASE_URL` - Ollama server URL (default: `http://localhost:11434`)
-- `QDRANT_URL` - Qdrant server URL (default: `http://localhost:6333`)
+Key flags:
+- `-ollama`: `docker` (default) or `local`
+- `-qdrant`: `docker` (default) or `remote`
+- `-models-dir`: host directory to mount inside the container
+- `-gpu`: passes `--gpus=all`
+- `-skip-build`: reuse existing binaries without rebuilding
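For intuition, `-gpu` and `-models-dir` roughly translate into extra `docker run` arguments for the Ollama container. A sketch of the resulting invocation (the container name, image, and port here are assumptions for illustration, not taken from the installer's source):

```shell
# Illustrative only: how the Ollama container might be launched.
# -gpu contributes --gpus=all; -models-dir contributes the -v mount.
docker run -d --name ragcode-ollama \
  --gpus=all \
  -v "$HOME/.ollama:/root/.ollama" \
  -p 11434:11434 \
  ollama/ollama
```

This is an infra sketch, not a command to copy verbatim; the installer manages the container lifecycle itself.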

-### Option 2: Local Build (For Developers)
+### Option 2: Local Build (for developers)

 ```bash
 git clone https://github.com/doITmagic/rag-code-mcp.git
@@ -116,10 +113,12 @@ brew install ollama
 ### 2. Run the Installer

 ```bash
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | bash
+curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer
+chmod +x ragcode-installer
+./ragcode-installer -ollama=docker -qdrant=docker
 ```

-**Installation time:** ~5-10 minutes (downloads ~4GB of AI models)
+**Installation time:** 5-10 minutes (model downloads dominate)

 ### 3. Verify Installation

@@ -132,10 +131,12 @@ docker ps | grep qdrant
 ollama list
 ```

-### 4. Start Server (Optional - starts automatically)
+### 4. Health Check & Services (installer already starts them)

 ```bash
-~/.local/share/ragcode/start.sh
+~/.local/share/ragcode/bin/rag-code-mcp --health
+docker ps | grep ragcode-qdrant
+docker ps | grep ragcode-ollama
 ```

 ---
@@ -223,12 +224,12 @@ RagCode integrates with **GitHub Copilot's Agent Mode** in VS Code through the M

 #### Prerequisites
 - **VS Code** with **GitHub Copilot** subscription
-- RagCode installed (via quick-install script above)
+- RagCode installed (via `ragcode-installer`)
 - VS Code version **1.95+** (for MCP support)

 #### Setup

-The quick-install script automatically configures RagCode for VS Code by creating:
+`ragcode-installer` automatically configures RagCode for VS Code and creates:
 ```
 ~/.config/Code/User/globalStorage/mcp-servers.json
 ```
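What lands in that file is written by the installer; MCP client configs commonly follow the shape sketched below (the `ragcode` server name and the JSON layout are assumptions for illustration, not guaranteed to match byte-for-byte):

```shell
# Inspect the generated entry; a typical MCP config has this shape:
#   {
#     "mcpServers": {
#       "ragcode": { "command": "~/.local/share/ragcode/bin/rag-code-mcp" }
#     }
#   }
cat ~/.config/Code/User/globalStorage/mcp-servers.json
```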
@@ -468,15 +469,12 @@ workspace:

 ### Error: "Could not connect to Qdrant"

-**Cause:** Docker is not running or Qdrant is stopped.
+**Cause:** Docker is not running or the Qdrant container is stopped.

 **Solution:**
 ```bash
-# Start Docker
 sudo systemctl start docker
-
-# Start Qdrant
-~/.local/share/ragcode/start.sh
+docker ps | grep ragcode-qdrant || docker start ragcode-qdrant
 ```
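The one-liner above leans on `grep`'s exit status: the right-hand side of `||` runs only when the container name is missing from the `docker ps` output. The same idiom demonstrated without Docker:

```shell
# grep -q exits 0 when the pattern is found, so || is skipped;
# when the pattern is absent, grep fails and the fallback runs.
echo "ragcode-qdrant  Up 2 hours" | grep -q ragcode-qdrant || echo "would start container"
echo "no containers here"         | grep -q ragcode-qdrant || echo "would start container"
```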

 ### Error: "Ollama model not found"

README.md (29 additions, 20 deletions)
@@ -175,15 +175,17 @@ Without RagCode, AI assistants must:

 ### One-Command Installation

-**Linux / macOS:**
+**Linux / macOS (Docker by default for Ollama + Qdrant):**
 ```bash
-curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer && chmod +x ragcode-installer && ./ragcode-installer
+curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer \
+  && chmod +x ragcode-installer \
+  && ./ragcode-installer -ollama=docker -qdrant=docker
 ```

 **Windows (PowerShell):**
 ```powershell
 Invoke-WebRequest -Uri "https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-windows.exe" -OutFile "ragcode-installer.exe"
-.\ragcode-installer.exe
+./ragcode-installer.exe -ollama docker -qdrant docker
 ```

 ### What the installer does:
@@ -206,28 +208,31 @@ Once installed, **you don't need to configure anything**.

 ### Installation Options

-The installer supports flexible configuration via flags:
+The installer runs **both Ollama and Qdrant inside Docker by default**. Popular scenarios:

 ```bash
-# Use Docker for both Ollama and Qdrant (recommended for isolation)
+# Recommended (everything inside Docker)
 ./ragcode-installer -ollama=docker -qdrant=docker

-# Use local Ollama with Docker Qdrant (if you already have Ollama installed)
+# Use an existing local Ollama but keep Qdrant in Docker
 ./ragcode-installer -ollama=local -qdrant=docker

-# Enable GPU support in Docker containers
-./ragcode-installer -ollama=docker -gpu
+# Point to remote services you already manage
+./ragcode-installer -ollama=local -qdrant=remote -skip-build

-# Custom models directory for Docker volume mapping
-./ragcode-installer -ollama=docker -models-dir=/path/to/models
+# Enable GPU acceleration for the Ollama container
+./ragcode-installer -ollama=docker -qdrant=docker -gpu
+
+# Mount a custom directory with Ollama models when running in Docker
+./ragcode-installer -ollama=docker -models-dir=$HOME/.ollama
 ```

-**Available Flags:**
-- `-ollama`: `local` (default) or `docker`
+**Key flags:**
+- `-ollama`: `docker` (default) or `local`
 - `-qdrant`: `docker` (default) or `remote`
-- `-models-dir`: Custom path for Ollama models (for Docker mapping)
-- `-gpu`: Enable NVIDIA GPU support in containers
-- `-skip-build`: Skip binary installation (use existing)
+- `-models-dir`: host path to mount as `/root/.ollama`
+- `-gpu`: passes `--gpus=all` to the Ollama container
+- `-skip-build`: reuse existing binaries instead of rebuilding
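Before relying on `-gpu`, it is worth checking that Docker itself can reach the GPU. This assumes the NVIDIA Container Toolkit is installed; the CUDA image tag below is only an example:

```shell
# Prints the nvidia-smi device table when GPU passthrough works;
# fails with an error if the toolkit or driver is missing.
docker run --rm --gpus=all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```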

 See [QUICKSTART.md](./QUICKSTART.md) for detailed installation and usage instructions.

@@ -267,10 +272,12 @@ brew install ollama

 ### 2. Run the Installer
 ```bash
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | bash
+curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer
+chmod +x ragcode-installer
+./ragcode-installer -ollama=docker -qdrant=docker
 ```

-Installation typically takes 5-10 minutes (downloading the AI models can be the longest part).
+Installation takes 5-10 minutes (downloading the Ollama models is the long pole).

 ### 3. Verify Installation
 ```bash
@@ -282,9 +289,11 @@ docker ps | grep qdrant
 ollama list
 ```

-### 4. Start the Server (optional – the installer already starts it)
+### 4. Health check (services start automatically)
 ```bash
-~/.local/share/ragcode/start.sh
+~/.local/share/ragcode/bin/rag-code-mcp --health
+docker ps | grep ragcode-qdrant
+docker ps | grep ragcode-ollama
 ```

 ---
@@ -306,7 +315,7 @@ After installation, RagCode is automatically available in supported IDEs. No add
 RagCode integrates with **GitHub Copilot's Agent Mode** through MCP, enabling semantic code search as part of Copilot's autonomous workflow.

 **Quick Setup:**
-1. Install RagCode using the quick-install script (automatically configures VS Code)
+1. Install RagCode with `ragcode-installer` (it configures VS Code automatically)
 2. Open VS Code in your project
 3. Open Copilot Chat (Ctrl+Shift+I / Cmd+Shift+I)
 4. Enable **Agent Mode** (click "Agent" button or type `/agent`)

docs/vscode-copilot-integration.md (7 additions, 9 deletions)
@@ -30,8 +30,8 @@ Before setting up RagCode with VS Code + Copilot, ensure you have:
 - Copilot extension installed and activated

 3. **RagCode Installed**
-   - Use the quick-install script (recommended)
-   - Or build from source
+   - Run `ragcode-installer` from the latest GitHub release (recommended)
+   - Or build from source with `go run ./cmd/install`

 4. **Required Services Running**
    - Docker (for Qdrant vector database)
@@ -43,10 +43,12 @@ Before setting up RagCode with VS Code + Copilot, ensure you have:

 ### Automatic Setup (Recommended)

-The RagCode quick-install script automatically configures VS Code:
+`ragcode-installer` automatically configures VS Code and creates the MCP entry:

 ```bash
-curl -fsSL https://raw.githubusercontent.com/doITmagic/rag-code-mcp/main/quick-install.sh | bash
+curl -L https://github.com/doITmagic/rag-code-mcp/releases/latest/download/ragcode-installer-$(uname -s | tr '[:upper:]' '[:lower:]') -o ragcode-installer \
+  && chmod +x ragcode-installer \
+  && ./ragcode-installer -ollama=docker -qdrant=docker
 ```

 This creates the MCP configuration file at:
@@ -301,11 +303,7 @@ tail -f /tmp/ragcode-mcp.log

 2. Check Qdrant is running:
 ```bash
-docker ps | grep qdrant
-```
-If not running:
-```bash
-~/.local/share/ragcode/start.sh
+docker ps | grep ragcode-qdrant || docker start ragcode-qdrant
 ```

 3. Check Ollama is running:

0 commit comments
