Merged
32 changes: 32 additions & 0 deletions docs/docs/configure/commands.md
@@ -1,5 +1,37 @@
# Commands

## Built-in Commands

altimate-code ships with three built-in slash commands:

| Command | Description |
|---------|-------------|
| `/init` | Create or update an AGENTS.md file with build commands and code style guidelines. |
| `/discover` | Scan your data stack and set up warehouse connections. Detects dbt projects, warehouse connections from profiles/Docker/env vars, installed tools, and config files. Walks you through adding and testing new connections, then indexes schemas. |
| `/review` | Review changes — accepts `commit`, `branch`, or `pr` as an argument (defaults to uncommitted changes). |

### `/discover`

`/discover` is the recommended way to set up a new data engineering project. Run it in the TUI and the agent will:

1. Call `project_scan` to detect your full environment
2. Present what was found (dbt project, connections, tools, config files)
3. Offer to add each new connection discovered (from dbt profiles, Docker, environment variables)
4. Test each connection with `warehouse_test`
5. Offer to index schemas for autocomplete and context-aware analysis
6. Show available skills and agent modes

### `/review`

```
/review # review uncommitted changes
/review commit # review the last commit
/review branch # review all changes on the current branch
/review pr # review the current pull request
```

## Custom Commands

Custom commands let you define your own reusable slash commands in addition to the built-ins above.

## Creating Commands
2 changes: 1 addition & 1 deletion docs/docs/data-engineering/tools/index.md
@@ -9,6 +9,6 @@ altimate-code has 55+ specialized tools organized by function.
| [FinOps Tools](finops-tools.md) | 8 tools | Cost analysis, warehouse sizing, unused resources, RBAC |
| [Lineage Tools](lineage-tools.md) | 1 tool | Column-level lineage tracing with confidence scoring |
| [dbt Tools](dbt-tools.md) | 2 tools + 6 skills | Run, manifest parsing, test generation, scaffolding |
| [Warehouse Tools](warehouse-tools.md) | 2 tools | Connection management and testing |
| [Warehouse Tools](warehouse-tools.md) | 6 tools | Environment scanning, connection management, discovery, testing |

All tools are available in the interactive TUI. The agent automatically selects the right tools based on your request.
124 changes: 124 additions & 0 deletions docs/docs/data-engineering/tools/warehouse-tools.md
@@ -1,5 +1,89 @@
# Warehouse Tools

## project_scan

Scan the entire data engineering environment in one call. Detects dbt projects, warehouse connections, Docker databases, installed tools, and configuration files. Used by the `/discover` command.

```
> /discover

# Environment Scan

## Python Engine
✓ Engine healthy

## Git Repository
✓ Git repo on branch `main` (origin: github.com/org/analytics)

## dbt Project
✓ Project "analytics" (profile: snowflake_prod)
Models: 47, Sources: 12, Tests: 89
✓ packages.yml found

## Warehouse Connections

### Already Configured
Name | Type | Database
prod-snowflake | snowflake | ANALYTICS

### From dbt profiles.yml
Name | Type | Source
dbt_snowflake_dev | snowflake | dbt-profile

### From Docker
Container | Type | Host:Port
local-postgres | postgres | localhost:5432

### From Environment Variables
Name | Type | Signal
env_bigquery | bigquery | GOOGLE_APPLICATION_CREDENTIALS

## Installed Data Tools
✓ dbt v1.8.4
✓ sqlfluff v3.1.0
✗ airflow (not found)

## Config Files
✓ .altimate-code/altimate-code.json
✓ .sqlfluff
✗ .pre-commit-config.yaml (not found)
```

### What it detects

| Category | Detection method |
|----------|-----------------|
| **Git** | `git` commands (branch, remote) |
| **dbt project** | Walks up directories for `dbt_project.yml`, reads name/profile |
| **dbt manifest** | Parses `target/manifest.json` for model/source/test counts |
| **dbt profiles** | Bridge call to parse `~/.dbt/profiles.yml` |
| **Docker DBs** | Bridge call to discover running PostgreSQL/MySQL/MSSQL containers |
| **Existing connections** | Bridge call to list already-configured warehouses |
| **Environment variables** | Scans `process.env` for warehouse signals (see table below) |
| **Schema cache** | Bridge call for indexed warehouse status |
| **Data tools** | Spawns `tool --version` for 9 common tools |
| **Config files** | Checks for `.altimate-code/`, `.sqlfluff`, `.pre-commit-config.yaml` |
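
The aggregated result groups into these categories. A rough TypeScript sketch of the shape, inferred from the example output above (field names are illustrative, not the tool's actual return type):

```ts
// Illustrative shape only, inferred from the example scan output;
// not the actual project_scan return type.
interface ProjectScanResult {
  git?: { branch: string; remote: string }
  dbtProject?: {
    name: string
    profile: string
    models: number
    sources: number
    tests: number
  }
  connections: {
    configured: { name: string; type: string; database?: string }[]
    fromDbtProfiles: { name: string; type: string }[]
    fromDocker: { container: string; type: string; hostPort: string }[]
    fromEnv: { name: string; type: string; signal: string }[]
  }
  tools: { name: string; version?: string; found: boolean }[]
  configFiles: { path: string; found: boolean }[]
}
```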

### Environment variable detection

| Warehouse | Signal (any one triggers detection) |
|-----------|-------------------------------------|
| Snowflake | `SNOWFLAKE_ACCOUNT` |
| BigQuery | `GOOGLE_APPLICATION_CREDENTIALS`, `BIGQUERY_PROJECT`, `GCP_PROJECT` |
| Databricks | `DATABRICKS_HOST`, `DATABRICKS_SERVER_HOSTNAME` |
| PostgreSQL | `PGHOST`, `PGDATABASE`, `DATABASE_URL` |
| MySQL | `MYSQL_HOST`, `MYSQL_DATABASE` |
| Redshift | `REDSHIFT_HOST` |
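
Detection is intentionally loose: any one matching variable flags the warehouse. A minimal TypeScript sketch of that rule, assuming a plain `process.env` lookup (the real implementation may differ):

```ts
// Env-var signals from the table above, keyed by warehouse type.
// Variable names are as documented; the mapping itself mirrors the docs.
const ENV_SIGNALS: Record<string, string[]> = {
  snowflake: ["SNOWFLAKE_ACCOUNT"],
  bigquery: ["GOOGLE_APPLICATION_CREDENTIALS", "BIGQUERY_PROJECT", "GCP_PROJECT"],
  databricks: ["DATABRICKS_HOST", "DATABRICKS_SERVER_HOSTNAME"],
  postgres: ["PGHOST", "PGDATABASE", "DATABASE_URL"],
  mysql: ["MYSQL_HOST", "MYSQL_DATABASE"],
  redshift: ["REDSHIFT_HOST"],
}

// A warehouse is flagged when any one of its signal variables is set.
function detectWarehousesFromEnv(env: NodeJS.ProcessEnv = process.env): string[] {
  return Object.entries(ENV_SIGNALS)
    .filter(([, signals]) => signals.some((name) => env[name]))
    .map(([warehouse]) => warehouse)
}
```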

### Parameters

| Parameter | Type | Description |
|-----------|------|-------------|
| `skip_docker` | boolean | Skip Docker container discovery (faster) |
| `skip_tools` | boolean | Skip installed tool detection (faster) |
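
Both flags simply skip the slower probes. A plausible invocation, assuming `project_scan` accepts inline JSON arguments like the other warehouse tools on this page:

```
> project_scan {"skip_docker": true, "skip_tools": true}
```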

---

## warehouse_list

List all configured warehouse connections.
@@ -54,3 +138,43 @@ Testing connection to bigquery-prod (bigquery)...
| `Object does not exist` | Wrong database/schema | Verify database name in config |
| `Role not authorized` | Insufficient privileges | Use a role with USAGE on warehouse |
| `Timeout` | Network latency | Increase connection timeout |

---

## warehouse_add

Add a new warehouse connection by providing a name and configuration.

```
> warehouse_add my-postgres {"type": "postgres", "host": "localhost", "port": 5432, "database": "analytics", "user": "analyst", "password": "secret"}

✓ Added warehouse 'my-postgres' (postgres)
```
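
The configuration fields vary by warehouse type. A hypothetical Snowflake example; the field names here are assumptions, so check your warehouse type's configuration reference:

```
> warehouse_add prod-snowflake {"type": "snowflake", "account": "myorg-myaccount", "user": "analyst", "password": "secret", "database": "ANALYTICS", "warehouse": "COMPUTE_WH"}

✓ Added warehouse 'prod-snowflake' (snowflake)
```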

---

## warehouse_remove

Remove an existing warehouse connection.

```
> warehouse_remove my-postgres

✓ Removed warehouse 'my-postgres'
```

---

## warehouse_discover

Discover database containers running in Docker. Detects PostgreSQL, MySQL/MariaDB, and SQL Server containers with their connection details.

```
> warehouse_discover

Container | Type | Host:Port | User | Database | Status
local-postgres | postgres | localhost:5432 | postgres | postgres | running
mysql-dev | mysql | localhost:3306 | root | mydb | running

Use warehouse_add to save any of these as a connection.
```
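
To save a discovered container as a connection, pass its details to `warehouse_add`. For example (discovery does not report passwords, so supply your own):

```
> warehouse_add local-postgres {"type": "postgres", "host": "localhost", "port": 5432, "user": "postgres", "database": "postgres", "password": "..."}
```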
19 changes: 16 additions & 3 deletions docs/docs/getting-started.md
@@ -12,10 +12,23 @@
altimate-code
```

The TUI launches with an interactive terminal. On first run, use the `/connect` command to configure:
The TUI launches with an interactive terminal. On first run, use the `/discover` command to auto-detect your data stack:

1. **LLM provider** — Choose your AI backend (Anthropic, OpenAI, Codex, etc.)
2. **Warehouse connection** — Connect to your data warehouse
```
/discover
```

`/discover` scans your environment and sets up everything automatically:

1. **Detects your dbt project** — finds `dbt_project.yml`, parses the manifest, and reads profiles
2. **Discovers warehouse connections** — from `~/.dbt/profiles.yml`, running Docker containers, and environment variables (e.g. `SNOWFLAKE_ACCOUNT`, `PGHOST`, `DATABASE_URL`)
3. **Checks installed tools** — dbt, sqlfluff, airflow, dagster, prefect, soda, sqlmesh, great_expectations, sqlfmt
4. **Offers to configure connections** — walks you through adding and testing each discovered warehouse
5. **Indexes schemas** — populates the schema cache for autocomplete and context-aware analysis

You can also configure connections manually — see [Warehouse connections](#warehouse-connections) below.

To set up your LLM provider, use the `/connect` command:
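
```
/connect
```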

## Configuration

2 changes: 1 addition & 1 deletion docs/docs/usage/tui.md
@@ -20,7 +20,7 @@ The TUI has three main areas:
|--------|--------|---------|
| `@` | Reference a file | `@src/models/user.sql explain this model` |
| `!` | Run a shell command | `!dbt run --select my_model` |
| `/` | Slash command | `/connect`, `/models`, `/theme` |
| `/` | Slash command | `/discover`, `/connect`, `/review`, `/models`, `/theme` |

## Leader Key

11 changes: 11 additions & 0 deletions packages/altimate-code/src/command/index.ts
@@ -4,6 +4,7 @@ import { Config } from "../config/config"
import { Instance } from "../project/instance"
import { Identifier } from "../id/id"
import PROMPT_INITIALIZE from "./template/initialize.txt"
import PROMPT_DISCOVER from "./template/discover.txt"
import PROMPT_REVIEW from "./template/review.txt"
import { MCP } from "../mcp"
import { Skill } from "../skill"
@@ -53,6 +54,7 @@ export namespace Command {

export const Default = {
INIT: "init",
DISCOVER: "discover",
REVIEW: "review",
} as const

@@ -69,6 +71,15 @@
},
hints: hints(PROMPT_INITIALIZE),
},
[Default.DISCOVER]: {
name: Default.DISCOVER,
description: "scan data stack and set up connections",
source: "command",
get template() {
return PROMPT_DISCOVER
},
hints: hints(PROMPT_DISCOVER),
},
[Default.REVIEW]: {
name: Default.REVIEW,
description: "review changes [commit|branch|pr], defaults to uncommitted",
55 changes: 55 additions & 0 deletions packages/altimate-code/src/command/template/discover.txt
@@ -0,0 +1,55 @@
You are setting up altimate-code for a data engineering project. Guide the user through environment detection and warehouse connection setup.

Step 1 — Scan the environment:
Call the `project_scan` tool to detect the full data engineering environment. Present the results clearly to the user.

Step 2 — Review what was found:
Summarize the scan results in a friendly way:
- Git repository details
- dbt project (name, profile, model/source/test counts)
- Warehouse connections already configured
- New connections discovered from dbt profiles, Docker containers, and environment variables
- Schema cache status (which warehouses are indexed)
- Installed data tools (dbt, sqlfluff, etc.)
- Configuration files found

Step 3 — Set up new connections:
For each NEW warehouse connection discovered (not already configured):
- Present the connection details and ask the user if they want to add it
- If yes, call `warehouse_add` with the detected configuration
- Then call `warehouse_test` to verify connectivity
- Report whether the connection succeeded or failed
- If it failed, offer to let the user correct the configuration

Skip this step if there are no new connections to add.

Step 4 — Index schemas:
If any warehouses are connected but not yet indexed in the schema cache:
- Ask the user if they want to index schemas now (explain this enables autocomplete, search, and context-aware analysis)
- If yes, call `schema_index` for each selected warehouse
- Report the number of schemas, tables, and columns indexed

Skip this step if all connected warehouses are already indexed or if no warehouses are connected.

Step 5 — Show next steps:
Present a summary of what was set up, then suggest what the user can do next:

**Available skills:**
- `/cost-report` — Analyze warehouse spending and find optimization opportunities
- `/dbt-docs` — Generate or improve dbt model documentation
- `/generate-tests` — Auto-generate dbt tests for your models
- `/sql-review` — Review SQL for correctness, performance, and best practices
- `/migrate-sql` — Translate SQL between warehouse dialects

**Agent modes to explore:**
- `analyst` — Deep-dive into data quality, lineage, and schema questions
- `builder` — Generate SQL, dbt models, and data pipelines
- `validator` — Validate SQL correctness and catch issues before they hit production
- `migrator` — Plan and execute warehouse migrations

**Useful commands:**
- `warehouse_list` — See all configured connections
- `schema_search` — Find tables and columns across warehouses
- `sql_execute` — Run queries against any connected warehouse

$ARGUMENTS