This document describes the direction for implementing new CLI functionality in Apache Airflow.
Airflow ships two CLIs:

- `airflow` (`airflow-core`) — bundled with the core distribution. Hosts both legacy remote commands (being rewired internally) and admin/local commands that have no Public API equivalent.
- `airflowctl` (`airflow-ctl`) — a standalone CLI distributed separately that talks to a running Airflow instance exclusively through the Public (Core) API.
Following AIP-94 (tracked via GitHub Projects #570 and #571), CLI work follows two rules:

- New commands that are achievable via the Public API are added to `airflowctl` only. Adding the same command to the `airflow` CLI as well is discouraged — it duplicates maintenance surface without user benefit.
- Existing `airflow` CLI remote commands stay in place (so users keep running `airflow dags list`, `airflow pools get`, …) but are rewired internally to call the Public API via the `airflowctl` HTTP client instead of accessing the metadata database directly.
Both rules enforce RBAC, remove direct database exposure for remote operations, and eliminate duplicate code paths. This builds on AIP-81, which introduced the distinction between local and remote commands.
| Scenario | Where it goes | Notes |
|---|---|---|
| New command, achievable via Public API | `airflowctl` only | Do not add to the `airflow` CLI unless strongly needed in core |
| New command, not achievable via Public API | `airflow` CLI (admin/local) | Admin/local commands only — see below |
| Existing `airflow` CLI command, achievable via Public API | `airflow` CLI → delegates to the `airflowctl` HTTP client | Rewire; no direct DB access, no SQLAlchemy/session usage |
| Existing `airflow` CLI command, not achievable via Public API | `airflow` CLI, unchanged | Stays as a pure `airflow-core` implementation |
"Not achievable via the Public API" means the operation has no API representation and is inherently admin/local in nature — database shell, schema migrations, process management, or deployment configuration that requires direct infrastructure access.
Add the command to `airflowctl` only. Do not also add it to the `airflow` CLI unless there is a strong reason it must live in core (e.g., it is tightly coupled to a local process or a deployment concern with no API representation).
Source location: `airflow-ctl/src/airflowctl/ctl/commands/`

HTTP client and operations: `airflow-ctl/src/airflowctl/api/` (`client.py`, `operations.py`).
- Add the command under the appropriate group module in `airflow-ctl/src/airflowctl/ctl/commands/`.
- Call the Public API through the `airflowctl` HTTP client (`airflowctl.api.client`) and the operations layer in `airflowctl.api.operations`. Do not import `airflow-core` models or touch the metadata database.
- If the required API endpoint does not exist yet, add it first (see Adding API Endpoints).
- Add tests under `airflow-ctl/tests/`.
- Run integration tests with `breeze testing airflow-ctl-integration-test`.
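The shape of a new `airflowctl` command can be sketched as below. This is a minimal illustration, not the real `airflowctl` internals: `Pool`, `PoolsOperations`, and `pools_list` are hypothetical stand-ins for an operations-layer class and a command handler. The point is the layering — the command only formats arguments, delegates to the operations layer (which wraps the Public API via the HTTP client), and renders the response; it never touches the metadata database.

```python
# Hedged sketch: all names here are illustrative, not the actual airflowctl API.
import json
from dataclasses import dataclass


@dataclass
class Pool:
    name: str
    slots: int


class PoolsOperations:
    """Stand-in for an operations-layer class wrapping Public API calls."""

    def list(self) -> list[Pool]:
        # A real implementation would issue a GET request to the pools
        # endpoint through the airflowctl HTTP client.
        return [Pool(name="default_pool", slots=128)]


def pools_list(operations: PoolsOperations) -> str:
    """Command handler: delegate to the operations layer, render JSON."""
    pools = operations.list()
    return json.dumps([{"name": p.name, "slots": p.slots} for p in pools])


print(pools_list(PoolsOperations()))
```

Keeping the handler free of transport details makes it easy to unit-test with a stubbed operations object, mirroring how tests under `airflow-ctl/tests/` can mock the HTTP layer.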
Use this when an existing `airflow` CLI remote command still talks to the database directly and needs to go through the Public API instead. The user-facing command name and arguments stay the same.
Source location: `airflow-core/src/airflow/cli/`
- Replace direct database access with calls through the `airflowctl` HTTP client (`airflowctl.api.client` / `airflowctl.api.operations`).
- Remove SQLAlchemy model imports and `session`-based helpers from the command.
- If the required API endpoint does not exist yet, add it first (see Adding API Endpoints).
- Update tests under `airflow-core/tests/cli/` to mock or exercise the HTTP client instead of the database.
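The rewiring step can be illustrated with a hedged before/after sketch. The names (`ApiClient`, `get_pools`, `pools_list`) are hypothetical and do not reflect the actual Airflow or `airflowctl` internals; the "before" is shown only as a comment. What matters is that the command keeps its name and output while the data now comes from the Public API instead of a SQLAlchemy session.

```python
# Hedged sketch: illustrative names only, not the real Airflow/airflowctl code.
from dataclasses import dataclass

# Before the rewire, the command queried the metadata database directly,
# along the lines of:
#     with create_session() as session:
#         pools = session.query(Pool).all()


@dataclass
class ApiClient:
    """Stand-in for the airflowctl HTTP client."""

    def get_pools(self) -> list[dict]:
        # A real client would call the pools endpoint of the Public API
        # on the running Airflow instance.
        return [{"name": "default_pool", "slots": 128}]


def pools_list(client: ApiClient) -> list[str]:
    # After the rewire: same command behavior, but no DB or session usage.
    return [p["name"] for p in client.get_pools()]


print(pools_list(ApiClient()))
```

Because the client is passed in, the updated tests under `airflow-core/tests/cli/` can substitute a fake client rather than standing up a database.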
Use this only when the operation cannot reasonably be exposed through the Public API — typically database shell, schema migrations, process management, or deployment-time configuration.
Source location: `airflow-core/src/airflow/cli/`
- Add the command to an appropriate admin group (e.g., `db`, `config`).
- Add `(admin only)` to the `help` string so users know the command requires direct infrastructure access.
- Add tests under `airflow-core/tests/cli/`.
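The admin-command registration above can be sketched with plain `argparse`. This is a simplified stand-in — Airflow's real CLI uses its own command-registration layer — but it shows the two conventions: grouping the command under an admin area and putting `(admin only)` in the `help` string. The `db-shell` subcommand name is hypothetical.

```python
# Hedged sketch using argparse; Airflow's actual CLI parser machinery differs.
import argparse

parser = argparse.ArgumentParser(prog="airflow")
subparsers = parser.add_subparsers(dest="command")

# Admin/local command: it touches local infrastructure, not the Public API,
# so the help string flags it as admin-only.
subparsers.add_parser(
    "db-shell",
    help="Open a shell to the metadata database (admin only)",
)

args = parser.parse_args(["db-shell"])
print(args.command)
```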