Commit 71fb6b6

Merge pull request #1 from code-rabi/docs/add-demo-and-events-api
docs: add demo section and real-time events API
2 parents 2409bc5 + 0b26763 commit 71fb6b6

29 files changed

Lines changed: 4478 additions & 127 deletions

AGENTS.md

Lines changed: 2 additions & 3 deletions
@@ -13,8 +13,7 @@ The V8 isolate provides these bindings to LLM-generated code:
 | `context` | The loaded context data |
 | `llm_query(prompt, model?)` | Query sub-LLM |
 | `llm_query_batched(prompts, model?)` | Batch query sub-LLMs |
-| `FINAL(answer)` | Return final answer |
-| `FINAL_VAR(varName)` | Return variable as final answer |
+| `giveFinalAnswer({ message, data? })` | Return final answer |
 | `print(...)` | Console output |

 ## Architecture
@@ -30,7 +29,7 @@ The V8 isolate provides these bindings to LLM-generated code:
 │ │ │ • llm_query() ──┐ │ │
 │ │ │ • llm_query_batched() │ │
 │ ▼ │ • print() / console │ │
-│ ┌─────────────┐ │ • FINAL() / FINAL_VAR() │ │
+│ ┌─────────────┐ │ • giveFinalAnswer() │ │
 │ │ LLMClient │◀───┼──────────────────┘ │ │
 │ │ (OpenAI) │ │ │ │
 │ └─────────────┘ │ LLM-generated JS code runs here │ │
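
The diff above replaces the `FINAL(answer)` and `FINAL_VAR(varName)` bindings with a single `giveFinalAnswer({ message, data? })` call. A minimal TypeScript sketch of a backward-compatibility shim a host could install in the isolate during migration; `makeCompatBindings` and the `FinalAnswer` type are illustrative assumptions, not part of RLLM:

```typescript
// Hypothetical compatibility shim: maps the removed FINAL()/FINAL_VAR()
// bindings onto the new giveFinalAnswer() API. Only the binding names come
// from the tables above; everything else here is an assumption.
type FinalAnswer = { message: string; data?: unknown };

function makeCompatBindings(
  giveFinalAnswer: (answer: FinalAnswer) => void,
  scope: Record<string, unknown>,
) {
  return {
    // Old API: FINAL("text") returned a literal answer string.
    FINAL: (answer: string) => giveFinalAnswer({ message: answer }),
    // Old API: FINAL_VAR("name") returned a variable from the isolate scope.
    FINAL_VAR: (varName: string) =>
      giveFinalAnswer({ message: String(scope[varName]), data: scope[varName] }),
  };
}
```

Installing these alongside `giveFinalAnswer` would let previously generated code keep working while prompts are updated.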

README.md

Lines changed: 47 additions & 5 deletions
@@ -17,6 +17,14 @@ pnpm add rllm
 npm install rllm
 ```

+## Demo
+
+RLLM analyzing a `node_modules` directory — the LLM writes JavaScript to parse dependencies, query sub-LLMs in parallel, and synthesize a final answer:
+
+![RLLM Demo](./RLLM.gif)
+
+Built with Gemini Flash 3. See the full interactive example in [`examples/node-modules-viz/`](./examples/node-modules-viz/).
+
 ## Quick Start

 LLM writes JavaScript code that runs in a secure V8 isolate:
@@ -88,7 +96,7 @@ const findings = await llm_query_batched(

 const summary = await llm_query(`Combine findings:\n${findings.join('\n')}`);
 print(summary);
-FINAL(summary);
+giveFinalAnswer({ message: summary });
 ```

 ## API Reference
@@ -100,7 +108,7 @@ Create an RLLM instance with sensible defaults.
 ```typescript
 const rlm = createRLLM({
   model: 'gpt-4o-mini',    // Model name
-  provider: 'openai',      // 'openai' | 'anthropic' | 'openrouter' | 'custom'
+  provider: 'openai',      // 'openai' | 'anthropic' | 'gemini' | 'openrouter' | 'custom'
   apiKey: process.env.KEY, // Optional, uses env vars by default
   baseUrl: undefined,      // Optional, required for 'custom' provider
   verbose: true,           // Enable logging
@@ -147,10 +155,44 @@ The V8 isolate provides these bindings to LLM-generated code:
 | `context` | The loaded context data |
 | `llm_query(prompt, model?)` | Query sub-LLM |
 | `llm_query_batched(prompts, model?)` | Batch query sub-LLMs |
-| `FINAL(answer)` | Return final answer |
-| `FINAL_VAR(varName)` | Return variable as final answer |
+| `giveFinalAnswer({ message, data? })` | Return final answer |
 | `print(...)` | Console output |

+### Real-time Events
+
+Subscribe to execution events for visualizations, debugging, or streaming UIs:
+
+```typescript
+const result = await rlm.completion("Analyze this data", {
+  context: myData,
+  onEvent: (event) => {
+    switch (event.type) {
+      case "iteration_start":
+        console.log(`Starting iteration ${event.iteration}`);
+        break;
+      case "llm_query_start":
+        console.log("LLM thinking...");
+        break;
+      case "code_execution_start":
+        console.log(`Executing:\n${event.code}`);
+        break;
+      case "final_answer":
+        console.log(`Answer: ${event.answer}`);
+        break;
+    }
+  }
+});
+```
+
+| Event Type | Description |
+|------------|-------------|
+| `iteration_start` | New iteration beginning |
+| `llm_query_start` | Main LLM query starting |
+| `llm_query_end` | Main LLM response received |
+| `code_execution_start` | V8 isolate executing code |
+| `code_execution_end` | Code execution finished |
+| `final_answer` | `giveFinalAnswer()` called with answer |
+
 ## Architecture

 ```
@@ -164,7 +206,7 @@ The V8 isolate provides these bindings to LLM-generated code:
 │ │ │ • llm_query() ──┐ │ │
 │ │ │ • llm_query_batched() │ │
 │ ▼ │ • print() / console │ │
-│ ┌─────────────┐ │ • FINAL() / FINAL_VAR() │ │
+│ ┌─────────────┐ │ • giveFinalAnswer() │ │
 │ │ LLMClient │◀───┼──────────────────┘ │ │
 │ │ (OpenAI) │ │ │ │
 │ └─────────────┘ │ LLM-generated JS code runs here │ │
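
The events API added above lends itself to a discriminated union. A sketch of how the payloads could be typed from the event table, assuming the field names shown in the example (`iteration`, `code`, `answer`); the `RLLMEvent` type and `summarize` helper are illustrative, not the library's exports:

```typescript
// Hypothetical typing of the onEvent payloads as a discriminated union.
// Field names beyond `type` are assumptions inferred from the README example.
type RLLMEvent =
  | { type: "iteration_start"; iteration: number }
  | { type: "llm_query_start" }
  | { type: "llm_query_end" }
  | { type: "code_execution_start"; code: string }
  | { type: "code_execution_end" }
  | { type: "final_answer"; answer: string };

// Collect a one-line summary per event, narrowing on `type` as in the
// switch-based handler shown in the diff above.
function summarize(events: RLLMEvent[]): string[] {
  const lines: string[] = [];
  for (const event of events) {
    switch (event.type) {
      case "iteration_start":
        lines.push(`iteration ${event.iteration}`);
        break;
      case "code_execution_start":
        lines.push(`exec ${event.code.length} chars`);
        break;
      case "final_answer":
        lines.push(`answer: ${event.answer}`);
        break;
      default:
        lines.push(event.type);
    }
  }
  return lines;
}
```

Narrowing on the `type` tag means the compiler only exposes `event.code` inside the `code_execution_start` branch, which keeps handlers honest as event kinds are added.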

RLM.gif

52.6 MB
Lines changed: 264 additions & 0 deletions
@@ -0,0 +1,264 @@
+# node_modules Graph Analyzer
+
+Interactive 3D visualization of your `node_modules` dependencies with RLLM-powered queries. Watch in real-time as the AI traverses your dependency graph to answer questions about duplicates, sizes, and relationships.
+
+## Features
+
+- 📦 **Parse any node_modules** - Analyzes package.json files, calculates sizes, detects duplicates
+- 🎯 **Real-time Visualization** - 3D force-directed graph with live highlighting as RLLM explores
+- 🤖 **AI-Powered Queries** - Ask natural language questions about your dependencies
+- 🔍 **Proxy Hooks** - Tracks every property access via JavaScript Proxies for visualization
+
+## Quick Start
+
+```bash
+cd examples/node-modules-viz
+pnpm install
+pnpm start
+```
+
+Then open http://localhost:3000 in your browser.
+
+This runs both the backend (WebSocket + RLLM) and frontend (Vite + React) concurrently.
+
+This will:
+1. Parse the parent directory's `node_modules` (this repo)
+2. Show the 3D visualization
+3. Let you click example queries or type your own
+
+## Usage
+
+### Analyze a specific project
+
+```bash
+pnpm server --target=/path/to/your/project
+```
+
+### Run a custom query from CLI
+
+```bash
+pnpm server --query="Find the largest packages"
+```
+
+Or just use the interactive UI in the browser!
+
+### Combine both
+
+```bash
+pnpm server --target=~/projects/my-app --query="What brings in lodash?"
+```
+
+## Example Queries
+
+### Find Duplicates
+```
+Which packages have multiple versions installed?
+```
+
+### Dependency Chains
+```
+What brings in lodash? Show the dependency chain.
+```
+
+### Disk Space Analysis
+```
+How much disk space could I save by deduping?
+```
+
+### Size Analysis
+```
+Find the 5 largest packages and their total size.
+```
+
+### Dependency Relationships
+```
+What's the path from react to scheduler?
+```
+
+### Popularity Analysis
+```
+Which packages have the most dependents?
+```
+
+### License Audit
+```
+Find all MIT licensed packages.
+```
+
+### Circular Dependencies
+```
+Find circular dependencies in the graph.
+```
+
+### Dev vs Prod
+```
+How many packages are dev dependencies vs production?
+```
+
+### Version Conflicts
+```
+Show all packages where different versions are required by different dependents.
+```
+
+## How It Works
+
+### 1. Graph Parsing
+
+The parser walks your `node_modules` directory recursively:
+- Reads each `package.json`
+- Calculates disk size
+- Detects hoisting and nesting
+- Identifies duplicate versions
+
+### 2. Proxy Tracking
+
+The graph context is wrapped in recursive JavaScript Proxies:
+
+```typescript
+const trackedContext = createTrackedGraph(context, (event) => {
+  // When RLLM code accesses context.packages["lodash@4.17.21"],
+  // this callback fires and sends the event to the browser
+  server.sendAccessEvent(event);
+});
+```
+
+### 3. RLLM Execution
+
+The AI writes JavaScript code that runs in a V8 isolate:
+
+```javascript
+// Example: Find duplicates
+const duplicates = [];
+for (const [name, ids] of Object.entries(context.packagesByName)) {
+  if (ids.length > 1) {
+    const sizes = ids.map(id => context.packages[id].diskSize);
+    duplicates.push({ name, versions: ids, totalSize: sizes.reduce((a, b) => a + b) });
+  }
+}
+giveFinalAnswer({ message: JSON.stringify(duplicates, null, 2) });
+```
+
+Every property access (like `context.packages[id]`) triggers the Proxy, which emits an event to highlight that node in the visualization.
+
+### 4. Real-time Visualization
+
+The browser receives WebSocket events and highlights nodes as they're accessed:
+- **Blue nodes** = regular packages
+- **Red nodes** = duplicate versions
+- **Green pulse** = currently accessed by RLLM
+
+## Architecture
+
+```
+┌─────────────────┐
+│  node_modules/  │
+└────────┬────────┘
+         │ parse
+         ▼
+┌─────────────────┐
+│ Dependency Graph│
+└────────┬────────┘
+         │ wrap in Proxy
+         ▼
+┌─────────────────┐      ┌──────────────┐
+│ Tracked Context │─────▶│ RLLM Engine  │
+└────────┬────────┘      └──────┬───────┘
+         │                      │
+         │ access events        │ query result
+         ▼                      ▼
+┌─────────────────────────────────────┐
+│          WebSocket Server           │
+└────────────┬────────────────────────┘
+             │
+             ▼
+┌─────────────────────────────────────┐
+│      Browser (force-graph-3d)       │
+│  - 3D visualization                 │
+│  - Real-time highlighting           │
+│  - Stats panel                      │
+└─────────────────────────────────────┘
+```
+
+## Graph Data Structure
+
+### Nodes
+```typescript
+{
+  id: "lodash@4.17.21",
+  name: "lodash",
+  version: "4.17.21",
+  diskSize: 1234567,
+  isDuplicate: false,
+  isHoisted: true,
+  dependents: ["package-a@1.0.0", "package-b@2.0.0"]
+}
+```
+
+### Edges
+```typescript
+{
+  source: "react@18.2.0",
+  target: "scheduler@0.23.0",
+  type: "prod",
+  versionRange: "^0.23.0"
+}
+```
+
+## Environment Variables
+
+Create a `.env` file:
+
+```bash
+OPENAI_API_KEY=your_key_here
+MODEL=gpt-4o-mini  # or gpt-4o, gpt-4-turbo, etc.
+```
+
+## Tips
+
+- **Large projects**: Parsing may take a minute for huge `node_modules` (10k+ packages)
+- **Query complexity**: Start simple, let RLLM iterate to solve complex queries
+- **Visualization**: Click and drag nodes, scroll to zoom, the graph auto-rotates
+- **Results panel**: Shows at the bottom when a query completes
+
+## Troubleshooting
+
+### No packages found
+Make sure the target directory has a `node_modules` folder:
+```bash
+ls /path/to/project/node_modules
+```
+
+### Browser doesn't open
+Manually navigate to: http://localhost:3000
+
+### WebSocket connection failed
+Check if port 3000 is available:
+```bash
+lsof -i :3000
+```
+
+### RLLM query fails
+Check that your OpenAI API key is set:
+```bash
+echo $OPENAI_API_KEY
+```
+
+## Development
+
+```bash
+# Watch mode (auto-restart on changes)
+pnpm dev
+
+# Run with verbose logging
+pnpm start --query="Your query"
+```
+
+## Credits
+
+Built with:
+- [RLLM](https://github.com/code-rabi/rllm) - Recursive Language Models
+- [react-force-graph-3d](https://github.com/vasturiano/react-force-graph) - React 3D graph visualization
+- [Vite](https://vitejs.dev/) + [React](https://react.dev/) - Frontend framework
+- [ws](https://github.com/websockets/ws) - WebSocket server (port 4242)
+
+Inspired by the [Recursive Language Models paper](https://arxiv.org/abs/2512.24601).
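
The Proxy Tracking step this README describes can be sketched as a small recursive wrapper. This is a hypothetical reimplementation of `createTrackedGraph` based only on the description above, assuming access events carry the accessed property path; it is not the example's actual source:

```typescript
// Hypothetical sketch: recursively wrap an object graph in Proxies so that
// every property read reports its path to a callback (e.g. for WebSocket
// broadcast). The AccessEvent shape is an assumption for illustration.
type AccessEvent = { path: string[] };

function createTrackedGraph<T extends object>(
  target: T,
  onAccess: (event: AccessEvent) => void,
  path: string[] = [],
): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      if (typeof prop === "string") {
        // Report this access, e.g. ["packages", "lodash@4.17.21", "diskSize"]
        onAccess({ path: [...path, prop] });
        // Wrap nested objects so deeper accesses are tracked too
        if (value !== null && typeof value === "object") {
          return createTrackedGraph(value as object, onAccess, [...path, prop]);
        }
      }
      return value;
    },
  });
}
```

Wrapping on read (rather than eagerly wrapping the whole graph up front) keeps the cost proportional to what the LLM-generated code actually touches.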
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+<!DOCTYPE html>
+<html lang="en">
+  <head>
+    <meta charset="UTF-8" />
+    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
+    <title>node_modules Graph Analyzer</title>
+  </head>
+  <body>
+    <div id="root"></div>
+    <script type="module" src="/src/main.tsx"></script>
+  </body>
+</html>
