
Commit f11614f

anandgupta42 and claude committed

refactor: restore /init, rename data stack setup to /discover

Restore the original /init command (creates AGENTS.md) and move the data stack setup functionality to /discover instead. Update all docs to reference /discover as the recommended first-run command.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

1 parent f6e690e · commit f11614f

7 files changed: 84 additions & 62 deletions


docs/docs/configure/commands.md

Lines changed: 5 additions & 4 deletions
@@ -2,16 +2,17 @@
 ## Built-in Commands
 
-altimate-code ships with two built-in slash commands:
+altimate-code ships with three built-in slash commands:
 
 | Command | Description |
 |---------|-------------|
-| `/init` | Scan your data stack and set up warehouse connections. Detects dbt projects, warehouse connections from profiles/Docker/env vars, installed tools, and config files. Walks you through adding and testing new connections, then indexes schemas. |
+| `/init` | Create or update an AGENTS.md file with build commands and code style guidelines. |
+| `/discover` | Scan your data stack and set up warehouse connections. Detects dbt projects, warehouse connections from profiles/Docker/env vars, installed tools, and config files. Walks you through adding and testing new connections, then indexes schemas. |
 | `/review` | Review changes — accepts `commit`, `branch`, or `pr` as an argument (defaults to uncommitted changes). |
 
-### `/init`
+### `/discover`
 
-The recommended way to set up a new project. Run `/init` in the TUI and the agent will:
+The recommended way to set up a new data engineering project. Run `/discover` in the TUI and the agent will:
 
 1. Call `project_scan` to detect your full environment
 2. Present what was found (dbt project, connections, tools, config files)

docs/docs/data-engineering/tools/warehouse-tools.md

Lines changed: 2 additions & 2 deletions
@@ -2,10 +2,10 @@
 ## project_scan
 
-Scan the entire data engineering environment in one call. Detects dbt projects, warehouse connections, Docker databases, installed tools, and configuration files. Used by the `/init` command.
+Scan the entire data engineering environment in one call. Detects dbt projects, warehouse connections, Docker databases, installed tools, and configuration files. Used by the `/discover` command.
 
 ```
-> /init
+> /discover
 
 # Environment Scan

docs/docs/getting-started.md

Lines changed: 3 additions & 3 deletions
@@ -12,13 +12,13 @@ npm install -g @altimateai/altimate-code
 altimate-code
 ```
 
-The TUI launches with an interactive terminal. On first run, use the `/init` command to auto-detect your data stack:
+The TUI launches with an interactive terminal. On first run, use the `/discover` command to auto-detect your data stack:
 
 ```
-/init
+/discover
 ```
 
-`/init` scans your environment and sets up everything automatically:
+`/discover` scans your environment and sets up everything automatically:
 
 1. **Detects your dbt project** — finds `dbt_project.yml`, parses the manifest, and reads profiles
 2. **Discovers warehouse connections** — from `~/.dbt/profiles.yml`, running Docker containers, and environment variables (e.g. `SNOWFLAKE_ACCOUNT`, `PGHOST`, `DATABASE_URL`)
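The environment-variable half of that discovery step can be sketched in a few lines. This is a hypothetical standalone sketch, not altimate-code's actual implementation; only the variable names (`SNOWFLAKE_ACCOUNT`, `PGHOST`, `DATABASE_URL`) come from the docs above.

```typescript
// Hypothetical sketch: discover warehouse connections from env vars.
type Discovered = { kind: "snowflake" | "postgres"; source: string }

function discoverFromEnv(env: Record<string, string | undefined>): Discovered[] {
  const found: Discovered[] = []
  // Snowflake exposes its account identifier via SNOWFLAKE_ACCOUNT
  if (env.SNOWFLAKE_ACCOUNT) {
    found.push({ kind: "snowflake", source: "SNOWFLAKE_ACCOUNT" })
  }
  // Postgres can be configured host-by-host (PGHOST) or via a connection URL
  if (env.PGHOST) {
    found.push({ kind: "postgres", source: "PGHOST" })
  } else if (env.DATABASE_URL && env.DATABASE_URL.startsWith("postgres")) {
    found.push({ kind: "postgres", source: "DATABASE_URL" })
  }
  return found
}

console.log(discoverFromEnv({ SNOWFLAKE_ACCOUNT: "acme-xy12345", PGHOST: "localhost" }))
```

A real scanner would also parse `~/.dbt/profiles.yml` and inspect running Docker containers, as the list above describes.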

docs/docs/usage/tui.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ The TUI has three main areas:
 |--------|--------|---------|
 | `@` | Reference a file | `@src/models/user.sql explain this model` |
 | `!` | Run a shell command | `!dbt run --select my_model` |
-| `/` | Slash command | `/init`, `/connect`, `/review`, `/models`, `/theme` |
+| `/` | Slash command | `/discover`, `/connect`, `/review`, `/models`, `/theme` |
 
 ## Leader Key

packages/altimate-code/src/command/index.ts

Lines changed: 12 additions & 1 deletion
@@ -4,6 +4,7 @@ import { Config } from "../config/config"
 import { Instance } from "../project/instance"
 import { Identifier } from "../id/id"
 import PROMPT_INITIALIZE from "./template/initialize.txt"
+import PROMPT_DISCOVER from "./template/discover.txt"
 import PROMPT_REVIEW from "./template/review.txt"
 import { MCP } from "../mcp"
 import { Skill } from "../skill"
@@ -53,6 +54,7 @@ export namespace Command {
 
   export const Default = {
     INIT: "init",
+    DISCOVER: "discover",
     REVIEW: "review",
   } as const
 
@@ -62,13 +64,22 @@ export namespace Command {
   const result: Record<string, Info> = {
     [Default.INIT]: {
       name: Default.INIT,
-      description: "scan data stack and set up connections",
+      description: "create/update AGENTS.md",
       source: "command",
       get template() {
        return PROMPT_INITIALIZE.replace("${path}", Instance.worktree)
      },
      hints: hints(PROMPT_INITIALIZE),
    },
+    [Default.DISCOVER]: {
+      name: Default.DISCOVER,
+      description: "scan data stack and set up connections",
+      source: "command",
+      get template() {
+        return PROMPT_DISCOVER
+      },
+      hints: hints(PROMPT_DISCOVER),
+    },
     [Default.REVIEW]: {
       name: Default.REVIEW,
       description: "review changes [commit|branch|pr], defaults to uncommitted",
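The registry pattern in this file, a plain object keyed by command name with a lazy `get template()` accessor, can be sketched in isolation. `Info` and the prompt constant below are simplified stand-ins, not the actual module's types.

```typescript
// Simplified stand-in for the command registry in
// packages/altimate-code/src/command/index.ts.
type Info = { name: string; description: string; readonly template: string }

const PROMPT_DISCOVER = "You are setting up altimate-code for a data engineering project."

const Default = { INIT: "init", DISCOVER: "discover" } as const

const commands: Record<string, Info> = {
  [Default.DISCOVER]: {
    name: Default.DISCOVER,
    description: "scan data stack and set up connections",
    // A getter defers template construction until the command is invoked
    get template() {
      return PROMPT_DISCOVER
    },
  },
}

console.log(commands[Default.DISCOVER].description)
```

The getter matters here because templates can depend on runtime state (the `/init` entry substitutes `Instance.worktree` into its prompt), so evaluating them eagerly at module load would be too early.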
packages/altimate-code/src/command/template/discover.txt

Lines changed: 55 additions & 0 deletions

@@ -0,0 +1,55 @@
+You are setting up altimate-code for a data engineering project. Guide the user through environment detection and warehouse connection setup.
+
+Step 1 — Scan the environment:
+Call the `project_scan` tool to detect the full data engineering environment. Present the results clearly to the user.
+
+Step 2 — Review what was found:
+Summarize the scan results in a friendly way:
+- Git repository details
+- dbt project (name, profile, model/source/test counts)
+- Warehouse connections already configured
+- New connections discovered from dbt profiles, Docker containers, and environment variables
+- Schema cache status (which warehouses are indexed)
+- Installed data tools (dbt, sqlfluff, etc.)
+- Configuration files found
+
+Step 3 — Set up new connections:
+For each NEW warehouse connection discovered (not already configured):
+- Present the connection details and ask the user if they want to add it
+- If yes, call `warehouse_add` with the detected configuration
+- Then call `warehouse_test` to verify connectivity
+- Report whether the connection succeeded or failed
+- If it failed, offer to let the user correct the configuration
+
+Skip this step if there are no new connections to add.
+
+Step 4 — Index schemas:
+If any warehouses are connected but not yet indexed in the schema cache:
+- Ask the user if they want to index schemas now (explain this enables autocomplete, search, and context-aware analysis)
+- If yes, call `schema_index` for each selected warehouse
+- Report the number of schemas, tables, and columns indexed
+
+Skip this step if all connected warehouses are already indexed or if no warehouses are connected.
+
+Step 5 — Show next steps:
+Present a summary of what was set up, then suggest what the user can do next:
+
+**Available skills:**
+- `/cost-report` — Analyze warehouse spending and find optimization opportunities
+- `/dbt-docs` — Generate or improve dbt model documentation
+- `/generate-tests` — Auto-generate dbt tests for your models
+- `/sql-review` — Review SQL for correctness, performance, and best practices
+- `/migrate-sql` — Translate SQL between warehouse dialects
+
+**Agent modes to explore:**
+- `analyst` — Deep-dive into data quality, lineage, and schema questions
+- `builder` — Generate SQL, dbt models, and data pipelines
+- `validator` — Validate SQL correctness and catch issues before they hit production
+- `migrator` — Plan and execute warehouse migrations
+
+**Useful commands:**
+- `warehouse_list` — See all configured connections
+- `schema_search` — Find tables and columns across warehouses
+- `sql_execute` — Run queries against any connected warehouse
+
+$ARGUMENTS
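Both templates end in a `$ARGUMENTS` placeholder, and the init template also contains `${path}`, which `index.ts` substitutes with `Instance.worktree`. How expansion might look as a standalone helper — `expandTemplate` is a hypothetical name for illustration, not a function in the codebase:

```typescript
// Illustrative only: expanding the ${path} and $ARGUMENTS placeholders
// that appear in the command templates.
function expandTemplate(template: string, worktree: string, args: string): string {
  return template
    .replace("${path}", worktree) // same substitution index.ts applies to PROMPT_INITIALIZE
    .replace("$ARGUMENTS", args) // trailing slot for whatever the user typed after the command
}

const raw = "If there's already an AGENTS.md, improve it if it's located in ${path}\n\n$ARGUMENTS"
console.log(expandTemplate(raw, "/home/me/repo", "focus on the dbt models"))
```

Note that `String.prototype.replace` with a string pattern replaces only the first occurrence, which is sufficient here since each placeholder appears once per template.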
packages/altimate-code/src/command/template/initialize.txt

Lines changed: 6 additions & 51 deletions

@@ -1,55 +1,10 @@
-You are setting up altimate-code for a data engineering project. Guide the user through environment detection and warehouse connection setup.
+Please analyze this codebase and create an AGENTS.md file containing:
+1. Build/lint/test commands - especially for running a single test
+2. Code style guidelines including imports, formatting, types, naming conventions, error handling, etc.
 
-Step 1 — Scan the environment:
-Call the `project_scan` tool to detect the full data engineering environment. Present the results clearly to the user.
+The file you create will be given to agentic coding agents (such as yourself) that operate in this repository. Make it about 150 lines long.
+If there are Cursor rules (in .cursor/rules/ or .cursorrules) or Copilot rules (in .github/copilot-instructions.md), make sure to include them.
 
-Step 2 — Review what was found:
-Summarize the scan results in a friendly way:
-- Git repository details
-- dbt project (name, profile, model/source/test counts)
-- Warehouse connections already configured
-- New connections discovered from dbt profiles, Docker containers, and environment variables
-- Schema cache status (which warehouses are indexed)
-- Installed data tools (dbt, sqlfluff, etc.)
-- Configuration files found
-
-Step 3 — Set up new connections:
-For each NEW warehouse connection discovered (not already configured):
-- Present the connection details and ask the user if they want to add it
-- If yes, call `warehouse_add` with the detected configuration
-- Then call `warehouse_test` to verify connectivity
-- Report whether the connection succeeded or failed
-- If it failed, offer to let the user correct the configuration
-
-Skip this step if there are no new connections to add.
-
-Step 4 — Index schemas:
-If any warehouses are connected but not yet indexed in the schema cache:
-- Ask the user if they want to index schemas now (explain this enables autocomplete, search, and context-aware analysis)
-- If yes, call `schema_index` for each selected warehouse
-- Report the number of schemas, tables, and columns indexed
-
-Skip this step if all connected warehouses are already indexed or if no warehouses are connected.
-
-Step 5 — Show next steps:
-Present a summary of what was set up, then suggest what the user can do next:
-
-**Available skills:**
-- `/cost-report` — Analyze warehouse spending and find optimization opportunities
-- `/dbt-docs` — Generate or improve dbt model documentation
-- `/generate-tests` — Auto-generate dbt tests for your models
-- `/sql-review` — Review SQL for correctness, performance, and best practices
-- `/migrate-sql` — Translate SQL between warehouse dialects
-
-**Agent modes to explore:**
-- `analyst` — Deep-dive into data quality, lineage, and schema questions
-- `builder` — Generate SQL, dbt models, and data pipelines
-- `validator` — Validate SQL correctness and catch issues before they hit production
-- `migrator` — Plan and execute warehouse migrations
-
-**Useful commands:**
-- `warehouse_list` — See all configured connections
-- `schema_search` — Find tables and columns across warehouses
-- `sql_execute` — Run queries against any connected warehouse
+If there's already an AGENTS.md, improve it if it's located in ${path}
 
 $ARGUMENTS
