README.md: 12 additions & 6 deletions
@@ -1,8 +1,14 @@
-# RexOS
+# LoopForge
 
 English | [简体中文](README.zh-CN.md)
 
-RexOS is a long-running agent operating system: persistent memory, tool sandboxing, and model routing, plus an Anthropic-style harness for multi-session work.
+LoopForge (formerly RexOS) is a long-running agent operating system: persistent memory, tool sandboxing, and model routing, plus an Anthropic-style harness for multi-session work.
+
+## Brand update
+
+- Public product name: **LoopForge**
+- Compatibility names still in use: `rexos` (CLI), `~/.rexos` (config/data dir), and `rexleimo/rexos` (repo path)
+- Existing scripts/docs using `rexos` continue to work
 model = "default"  # uses providers.<name>.default_model
 ```
 
-To switch providers, set the provider's `api_key_env` (if needed) and update `[router.*]` to point at the provider you want. If you keep `model = "default"`, RexOS uses `providers.<name>.default_model`.
+To switch providers, set the provider's `api_key_env` (if needed) and update `[router.*]` to point at the provider you want. If you keep `model = "default"`, LoopForge uses `providers.<name>.default_model`.
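The provider-switching paragraph above can be sketched as a config fragment. This is an illustrative layout, not the shipped config: the table names `[providers.ollama]`, `[providers.glm]`, and `[router.default]`, the env var name, and the model names are assumptions; only `api_key_env`, `model = "default"`, and `default_model` come from the README itself.

```toml
# Hypothetical layout; exact table and key names may differ in the real config.
[providers.ollama]
default_model = "qwen2.5:7b"   # assumption: any local chat model name

[providers.glm]
api_key_env = "GLM_API_KEY"    # assumption: illustrative env var name
default_model = "glm-4"

[router.default]
provider = "ollama"            # point this at the provider you want
model = "default"              # resolves to providers.ollama.default_model
```

With this shape, switching from Ollama to GLM would mean exporting the key named by `api_key_env` and changing `provider = "ollama"` to `provider = "glm"` under `[router.default]`.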
@@ -19,6 +19,8 @@ Develop locally with small models on Ollama, then switch routing to GLM / MiniMa
 </div>
 
+> Brand update: LoopForge is the public name (formerly RexOS). Compatibility remains unchanged for now: the CLI is still `rexos`, and the config/data path is still `~/.rexos`.
+
 <div class="grid cards" markdown>
 
 - :material-checklist: **Harness-first long tasks**
@@ -48,7 +50,7 @@ Make sure you have at least one **chat model** available in Ollama (not embeddin