
Commit 6ca91ec

Kabuki94 and claude committed
feat(ai): switch default model to qwen2.5-coder:7b + nomic-embed-text
Replaces deepseek-coder-v2:lite (too large for the 8GB target) with qwen2.5-coder:7b (4.7GB GGUF, ~5.5GB runtime, Apache 2.0, best-in-class on DevOps/container/SELinux coding benchmarks). Adds nomic-embed-text (274MB) as the embedding model.

Updates env.defaults, profile.toml, and the OpenAI-compatible models catalog. Also adds a Windows 11 one-liner install entry to README.md.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
1 parent f271a43 commit 6ca91ec

5 files changed: 27 additions & 6 deletions


README.md

Lines changed: 12 additions & 0 deletions
````diff
@@ -27,6 +27,18 @@ directly.
 
 ## Install
 
+**Windows 11** (Podman Desktop + WSL2):
+
+```powershell
+irm https://raw.githubusercontent.com/mios-dev/mios-bootstrap/main/install.ps1 | iex
+```
+
+Installs as a Windows application (`%LOCALAPPDATA%\Programs\MiOS\`), clones both repos,
+registers in Add/Remove Programs, creates Start Menu shortcuts, and auto-configures WSL2.
+Requires [Git](https://git-scm.com/download/win), [Podman Desktop](https://podman-desktop.io), and WSL2.
+
+**Linux** (Fedora bootc):
+
 ```bash
 sudo bash -c "$(curl -fsSL https://raw.githubusercontent.com/mios-dev/mios-bootstrap/main/install.sh)"
 ```
````

automation/37-ollama-prep.sh

Lines changed: 4 additions & 3 deletions
```diff
@@ -14,7 +14,7 @@ if [ -d "/var/lib/ollama/models" ] && [ "$(ls -A /var/lib/ollama/models)" ]; then
   exit 0
 fi
 
-log "Downloading default model: deepseek-coder-v2:lite..."
+log "Downloading default models: qwen2.5-coder:7b + nomic-embed-text..."
 
 # Install temporary ollama binary from GitHub releases (.tar.zst archive)
 # Standalone binary is no longer provided.
@@ -82,8 +82,9 @@ while ! curl -s http://localhost:11434/api/tags > /dev/null; do
   fi
 done
 
-# Pull the model
-/usr/bin/ollama pull deepseek-coder-v2:lite
+# Pull inference model (8GB-tier default) and embedding model
+/usr/bin/ollama pull qwen2.5-coder:7b
+/usr/bin/ollama pull nomic-embed-text
 
 # Shutdown server
 kill $OLLAMA_PID
```
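The pulls above only succeed once the temporary Ollama server is answering on port 11434, which is why the script polls `/api/tags` first. A minimal sketch of that wait-then-pull pattern; the `wait_for_api` helper name is hypothetical (the real script inlines the loop):

```shell
# Hypothetical helper mirroring the readiness loop in 37-ollama-prep.sh:
# retry a probe command until it succeeds or the retry budget runs out.
wait_for_api() {
  probe="$1"
  retries="${2:-30}"
  i=0
  while ! eval "$probe" > /dev/null 2>&1; do
    i=$((i + 1))
    if [ "$i" -ge "$retries" ]; then
      echo "API did not come up after $retries attempts" >&2
      return 1
    fi
    sleep 1
  done
}

# Against the real service, the probe and pulls would look like:
#   wait_for_api 'curl -s http://localhost:11434/api/tags' \
#     && /usr/bin/ollama pull qwen2.5-coder:7b \
#     && /usr/bin/ollama pull nomic-embed-text
```

Polling the tags endpoint (rather than sleeping a fixed interval) keeps first-boot model provisioning robust on slow hardware.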

usr/share/mios/ai/v1/models.json

Lines changed: 7 additions & 1 deletion
```diff
@@ -2,7 +2,13 @@
   "object": "list",
   "data": [
     {
-      "id": "mi-os-7b",
+      "id": "qwen2.5-coder:7b",
+      "object": "model",
+      "created": 1777536280,
+      "owned_by": "mios-system"
+    },
+    {
+      "id": "nomic-embed-text",
       "object": "model",
       "created": 1777536280,
       "owned_by": "mios-system"
```
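Clients discover these entries through the OpenAI-compatible `GET /v1/models` route. A quick way to list the catalog ids, assuming `jq` is available (shown against an inline copy of the JSON rather than the live endpoint):

```shell
# Extract model ids from an OpenAI-compatible model list.
# Against the live service this would be:
#   curl -s http://localhost:8080/v1/models | jq -r '.data[].id'
jq -r '.data[].id' <<'EOF'
{
  "object": "list",
  "data": [
    { "id": "qwen2.5-coder:7b", "object": "model", "created": 1777536280, "owned_by": "mios-system" },
    { "id": "nomic-embed-text", "object": "model", "created": 1777536280, "owned_by": "mios-system" }
  ]
}
EOF
# prints:
# qwen2.5-coder:7b
# nomic-embed-text
```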

usr/share/mios/env.defaults

Lines changed: 2 additions & 1 deletion
```diff
@@ -16,7 +16,8 @@ MIOS_DEFAULT_HOST="mios"
 
 # Inference (LAW 5: localhost OpenAI-compatible only).
 MIOS_AI_ENDPOINT="http://localhost:8080/v1"
-MIOS_AI_MODEL="default"
+MIOS_AI_MODEL="qwen2.5-coder:7b"
+MIOS_AI_EMBED_MODEL="nomic-embed-text"
 
 # Image / branch metadata.
 MIOS_REPO_URL="https://github.com/mios-dev/mios"
```
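Shell consumers pick these values up by sourcing the file. A self-contained sketch (writing a throwaway copy of the relevant lines instead of reading `/usr/share/mios/env.defaults` directly):

```shell
# Write a scratch copy of the relevant env.defaults lines, then source it
# the way a MiOS shell script would source the real file.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
MIOS_AI_ENDPOINT="http://localhost:8080/v1"
MIOS_AI_MODEL="qwen2.5-coder:7b"
MIOS_AI_EMBED_MODEL="nomic-embed-text"
EOF
. "$tmp"
rm -f "$tmp"

echo "endpoint:    $MIOS_AI_ENDPOINT"
echo "chat model:  $MIOS_AI_MODEL"
echo "embed model: $MIOS_AI_EMBED_MODEL"
```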

usr/share/mios/profile.toml

Lines changed: 2 additions & 1 deletion
```diff
@@ -47,7 +47,8 @@ allow_libvirt_bridge = true
 
 [ai]
 endpoint = "http://localhost:8080/v1"
-model = "default"
+model = "qwen2.5-coder:7b"
+embed_model = "nomic-embed-text"
 api_key = ""
 enable_ollama = true
 enable_localai = true
```
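For quick checks, a shell script can grab these keys with `sed`; a real consumer should use a TOML parser, since this pattern only handles simple `key = "value"` lines like the ones above. The `toml_get` helper is hypothetical:

```shell
# Extract the quoted string value of a `key = "value"` line in simple TOML.
toml_get() {
  sed -n "s/^$1[[:space:]]*=[[:space:]]*\"\(.*\)\"[[:space:]]*$/\1/p" "$2"
}

# Scratch copy standing in for /usr/share/mios/profile.toml.
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
[ai]
endpoint = "http://localhost:8080/v1"
model = "qwen2.5-coder:7b"
embed_model = "nomic-embed-text"
EOF

model=$(toml_get model "$cfg")
embed=$(toml_get embed_model "$cfg")
echo "model=$model embed=$embed"
rm -f "$cfg"
```

Because the pattern is anchored at line start, looking up `model` does not accidentally match the `embed_model` line.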

0 commit comments