diff --git a/docs/reference/integrations.md b/docs/reference/integrations.md index dcb9a2b354..6933a6a0b2 100644 --- a/docs/reference/integrations.md +++ b/docs/reference/integrations.md @@ -42,6 +42,8 @@ specify integration list ``` Shows all available integrations, which one is currently installed, and whether each requires a CLI tool or is IDE-based. +When multiple integrations are installed, the list marks the default integration separately from the other installed integrations. +The list also shows whether each built-in integration is declared multi-install safe. ## Install an Integration @@ -52,9 +54,12 @@ specify integration install | Option | Description | | ------------------------ | ------------------------------------------------------------------------ | | `--script sh\|ps` | Script type: `sh` (bash/zsh) or `ps` (PowerShell) | +| `--force` | Opt in to installing alongside integrations that are not declared multi-install safe | | `--integration-options` | Integration-specific options (e.g. `--integration-options="--commands-dir .myagent/cmds"`) | -Installs the specified integration into the current project. Fails if another integration is already installed — use `switch` instead. If the installation fails partway through, it automatically rolls back to a clean state. +Installs the specified integration into the current project. If another integration is already installed, the command only proceeds automatically when all involved integrations are declared multi-install safe. Otherwise, use `switch` to replace the default integration or pass `--force` to explicitly opt in to multi-install. If the installation fails partway through, it automatically rolls back to a clean state. + +Installing an additional integration does not change the default integration. Use `specify integration use ` to change the default. > **Note:** All integration management commands require a project already initialized with `specify init`. 
To start a new project with a specific agent, use `specify init --integration ` instead. @@ -83,10 +88,22 @@ specify integration switch | Option | Description | | ------------------------ | ------------------------------------------------------------------------ | | `--script sh\|ps` | Script type: `sh` (bash/zsh) or `ps` (PowerShell) | -| `--force` | Force removal of modified files during uninstall | -| `--integration-options` | Options for the target integration | +| `--force` | Force removal of modified files during uninstall; when the target is already installed, overwrite managed shared templates while changing the default | +| `--integration-options` | Options for the target integration when it is not already installed | + +If the target integration is not already installed, equivalent to running `uninstall` followed by `install` in a single step. In this mode, `--force` controls whether modified files from the removed integration are deleted. If the target integration is already installed, `switch` only changes the default integration, like `use`; in this mode, `--force` controls whether managed shared templates are overwritten while the default changes. `--integration-options` is rejected for already-installed targets because changing integration options requires reinstalling managed files; run `upgrade --integration-options ...` first, then `use `. + +## Use an Installed Integration + +```bash +specify integration use +``` -Equivalent to running `uninstall` followed by `install` in a single step. +| Option | Description | +| --------- | --------------------------------------------------- | +| `--force` | Overwrite managed shared templates while changing the default | + +Sets the default integration without uninstalling any other installed integrations. This also refreshes managed shared templates so command references match the new default integration's invocation style. Modified or untracked shared templates are preserved unless `--force` is used. 
## Upgrade an Integration @@ -100,7 +117,7 @@ specify integration upgrade [] | `--script sh\|ps` | Script type: `sh` (bash/zsh) or `ps` (PowerShell) | | `--integration-options` | Options for the integration | -Reinstalls the current integration with updated templates and commands (e.g., after upgrading Spec Kit). Defaults to the currently installed integration; if a key is provided, it must match the installed one — otherwise the command fails and suggests using `switch` instead. Detects locally modified files and blocks the upgrade unless `--force` is used. Stale files from the previous install that are no longer needed are removed automatically. +Reinstalls an installed integration with updated templates and commands (e.g., after upgrading Spec Kit). Defaults to the default integration; if a key is provided, it must be one of the installed integrations. Detects locally modified files and blocks the upgrade unless `--force` is used. Stale files from the previous install that are no longer needed are removed automatically. Shared templates stay aligned with the default integration even when upgrading a non-default integration. ## Integration-Specific Options @@ -119,9 +136,39 @@ specify integration install generic --integration-options="--commands-dir .myage ## FAQ -### Can I use multiple integrations at the same time? +### Can I install multiple integrations in the same project? + +Yes, but it is intended for team portability rather than the default workflow. Multiple integrations are allowed automatically only when the installed integration and the new integration are declared multi-install safe by Spec Kit. For other combinations, pass `--force` to acknowledge that multiple agents may see unrelated agent-specific instructions or commands. 
+ +Spec Kit tracks one default integration in `.specify/integration.json` with `default_integration`, all installed integrations with `installed_integrations`, per-integration runtime settings with `integration_settings`, and a dedicated `integration_state_schema` for future state migrations. The legacy `integration` field remains as an alias for the default integration. + +### Which integrations are multi-install safe? + +An integration is multi-install safe when it uses isolated agent directories, a dedicated context file that does not collide with another safe integration, stable command invocation settings, and a separate install manifest. Shared Spec Kit templates remain aligned to the single default integration. + +The currently declared multi-install safe integrations are: + +| Key | Isolation | +| --- | --------- | +| `auggie` | `.augment/commands`, `.augment/rules/specify-rules.md` | +| `claude` | `.claude/skills`, `CLAUDE.md` | +| `codebuddy` | `.codebuddy/commands`, `CODEBUDDY.md` | +| `codex` | `.agents/skills`, `AGENTS.md` | +| `cursor-agent` | `.cursor/skills`, `.cursor/rules/specify-rules.mdc` | +| `gemini` | `.gemini/commands`, `GEMINI.md` | +| `iflow` | `.iflow/commands`, `IFLOW.md` | +| `junie` | `.junie/commands`, `.junie/AGENTS.md` | +| `kilocode` | `.kilocode/workflows`, `.kilocode/rules/specify-rules.md` | +| `kimi` | `.kimi/skills`, `KIMI.md` | +| `qodercli` | `.qoder/commands`, `QODER.md` | +| `qwen` | `.qwen/commands`, `QWEN.md` | +| `roo` | `.roo/commands`, `.roo/rules/specify-rules.md` | +| `shai` | `.shai/commands`, `SHAI.md` | +| `tabnine` | `.tabnine/agent/commands`, `TABNINE.md` | +| `trae` | `.trae/skills`, `.trae/rules/project_rules.md` | +| `windsurf` | `.windsurf/workflows`, `.windsurf/rules/specify-rules.md` | -No. Only one AI coding agent integration can be installed per project. Use `specify integration switch ` to change to a different AI coding agent. 
+Integrations that share a context file or command directory with another integration, require dynamic install paths such as `--commands-dir`, or merge shared tool settings are not declared safe by default. They can still be installed alongside another integration with `--force`. ### What happens to my changes when I uninstall or switch? @@ -137,4 +184,4 @@ CLI-based integrations (like Claude Code, Gemini CLI) require the tool to be ins ### When should I use `upgrade` vs `switch`? -Use `upgrade` when you've upgraded Spec Kit and want to refresh the same integration's templates. Use `switch` when you want to change to a different AI coding agent. +Use `upgrade` when you've upgraded Spec Kit and want to refresh an installed integration's managed files. Use `switch` when you want to replace the current default with another integration; if the target is already installed, `switch` behaves like `use`. diff --git a/src/specify_cli/__init__.py b/src/specify_cli/__init__.py index d5f5aba2d5..b3918b2a78 100644 --- a/src/specify_cli/__init__.py +++ b/src/specify_cli/__init__.py @@ -54,6 +54,26 @@ from rich.tree import Tree from typer.core import TyperGroup +from .integration_runtime import ( + invoke_separator_for_integration as _invoke_separator_for_integration, + resolve_integration_options as _resolve_integration_options_impl, + with_integration_setting as _with_integration_setting, +) +from .integration_state import ( + INTEGRATION_JSON, + dedupe_integration_keys as _dedupe_integration_keys, + default_integration_key as _default_integration_key, + installed_integration_keys as _installed_integration_keys, + integration_setting as _integration_setting, + integration_settings as _integration_settings, + normalize_integration_state as _normalize_integration_state, + write_integration_json as _write_integration_json_file, +) +from .shared_infra import ( + install_shared_infra as _install_shared_infra_impl, + refresh_shared_templates as _refresh_shared_templates_impl, +) + # 
For cross-platform keyboard input import readchar @@ -643,6 +663,11 @@ def _locate_core_pack() -> Path | None: return None +def _repo_root() -> Path: + """Return the source checkout root used for editable installs.""" + return Path(__file__).parent.parent.parent + + def _locate_bundled_extension(extension_id: str) -> Path | None: """Return the path to a bundled extension, or None. @@ -660,8 +685,7 @@ def _locate_bundled_extension(extension_id: str) -> Path | None: return candidate # Source-checkout / editable install: look relative to repo root - repo_root = Path(__file__).parent.parent.parent - candidate = repo_root / "extensions" / extension_id + candidate = _repo_root() / "extensions" / extension_id if (candidate / "extension.yml").is_file(): return candidate @@ -685,8 +709,7 @@ def _locate_bundled_workflow(workflow_id: str) -> Path | None: return candidate # Source-checkout / editable install: look relative to repo root - repo_root = Path(__file__).parent.parent.parent - candidate = repo_root / "workflows" / workflow_id + candidate = _repo_root() / "workflows" / workflow_id if (candidate / "workflow.yml").is_file(): return candidate @@ -710,14 +733,31 @@ def _locate_bundled_preset(preset_id: str) -> Path | None: return candidate # Source-checkout / editable install: look relative to repo root - repo_root = Path(__file__).parent.parent.parent - candidate = repo_root / "presets" / preset_id + candidate = _repo_root() / "presets" / preset_id if (candidate / "preset.yml").is_file(): return candidate return None +def _refresh_shared_templates( + project_path: Path, + *, + invoke_separator: str, + force: bool = False, +) -> None: + """Refresh default-sensitive shared templates without touching scripts.""" + _refresh_shared_templates_impl( + project_path, + version=get_speckit_version(), + core_pack=_locate_core_pack(), + repo_root=_repo_root(), + console=console, + invoke_separator=invoke_separator, + force=force, + ) + + def _install_shared_infra( project_path: 
Path, script_type: str, @@ -741,79 +781,16 @@ def _install_shared_infra( Returns ``True`` on success. """ - from .integrations.base import IntegrationBase - from .integrations.manifest import IntegrationManifest - - core = _locate_core_pack() - manifest = IntegrationManifest("speckit", project_path, version=get_speckit_version()) - - # Scripts - if core and (core / "scripts").is_dir(): - scripts_src = core / "scripts" - else: - repo_root = Path(__file__).parent.parent.parent - scripts_src = repo_root / "scripts" - - skipped_files: list[str] = [] - - if scripts_src.is_dir(): - dest_scripts = project_path / ".specify" / "scripts" - dest_scripts.mkdir(parents=True, exist_ok=True) - variant_dir = "bash" if script_type == "sh" else "powershell" - variant_src = scripts_src / variant_dir - if variant_src.is_dir(): - dest_variant = dest_scripts / variant_dir - dest_variant.mkdir(parents=True, exist_ok=True) - for src_path in variant_src.rglob("*"): - if src_path.is_file(): - rel_path = src_path.relative_to(variant_src) - dst_path = dest_variant / rel_path - if dst_path.exists() and not force: - skipped_files.append(str(dst_path.relative_to(project_path))) - else: - dst_path.parent.mkdir(parents=True, exist_ok=True) - shutil.copy2(src_path, dst_path) - rel = dst_path.relative_to(project_path).as_posix() - manifest.record_existing(rel) - - # Page templates (not command templates, not vscode-settings.json) - if core and (core / "templates").is_dir(): - templates_src = core / "templates" - else: - repo_root = Path(__file__).parent.parent.parent - templates_src = repo_root / "templates" - - if templates_src.is_dir(): - dest_templates = project_path / ".specify" / "templates" - dest_templates.mkdir(parents=True, exist_ok=True) - for f in templates_src.iterdir(): - if f.is_file() and f.name != "vscode-settings.json" and not f.name.startswith("."): - dst = dest_templates / f.name - if dst.exists() and not force: - skipped_files.append(str(dst.relative_to(project_path))) - else: - 
content = f.read_text(encoding="utf-8") - content = IntegrationBase.resolve_command_refs( - content, invoke_separator - ) - dst.write_text(content, encoding="utf-8") - rel = dst.relative_to(project_path).as_posix() - manifest.record_existing(rel) - - if skipped_files: - console.print( - f"[yellow]⚠[/yellow] {len(skipped_files)} shared infrastructure file(s) already exist and were not updated:" - ) - for f in skipped_files: - console.print(f" {f}") - console.print( - "To refresh shared infrastructure, run " - "[cyan]specify init --here --force[/cyan] or " - "[cyan]specify integration upgrade --force[/cyan]." - ) - - manifest.save() - return True + return _install_shared_infra_impl( + project_path, + script_type, + version=get_speckit_version(), + core_pack=_locate_core_pack(), + repo_root=_repo_root(), + console=console, + force=force, + invoke_separator=invoke_separator, + ) def ensure_executable_scripts(project_path: Path, tracker: StepTracker | None = None) -> None: @@ -1299,13 +1276,20 @@ def init( ) manifest.save() - # Write .specify/integration.json - integration_json = project_path / ".specify" / "integration.json" - integration_json.parent.mkdir(parents=True, exist_ok=True) - integration_json.write_text(json.dumps({ - "integration": resolved_integration.key, - "version": get_speckit_version(), - }, indent=2) + "\n", encoding="utf-8") + integration_settings = _with_integration_setting( + {}, + resolved_integration.key, + resolved_integration, + script_type=selected_script, + raw_options=integration_options, + parsed_options=integration_parsed_options or None, + ) + _write_integration_json( + project_path, + resolved_integration.key, + [resolved_integration.key], + integration_settings, + ) tracker.complete("integration", resolved_integration.config.get("name", resolved_integration.key)) @@ -1865,7 +1849,7 @@ def get_speckit_version() -> str: # Fallback: try reading from pyproject.toml try: import tomllib - pyproject_path = Path(__file__).parent.parent.parent 
/ "pyproject.toml" + pyproject_path = _repo_root() / "pyproject.toml" if pyproject_path.exists(): with open(pyproject_path, "rb") as f: data = tomllib.load(f) @@ -1887,11 +1871,8 @@ def get_speckit_version() -> str: app.add_typer(integration_app, name="integration") -INTEGRATION_JSON = ".specify/integration.json" - - def _read_integration_json(project_root: Path) -> dict[str, Any]: - """Load ``.specify/integration.json``. Returns ``{}`` when missing.""" + """Load ``.specify/integration.json``. Returns normalized state when present.""" path = project_root / INTEGRATION_JSON if not path.exists(): return {} @@ -1911,20 +1892,34 @@ def _read_integration_json(project_root: Path) -> dict[str, Any]: console.print(f"[red]Error:[/red] {path} must contain a JSON object, got {type(data).__name__}.") console.print(f"Please fix or delete {INTEGRATION_JSON} and retry.") raise typer.Exit(1) - return data + return _normalize_integration_state(data) def _write_integration_json( project_root: Path, - integration_key: str, + integration_key: str | None, + installed_integrations: list[str] | None = None, + integration_settings: dict[str, dict[str, Any]] | None = None, ) -> None: - """Write ``.specify/integration.json`` for *integration_key*.""" - dest = project_root / INTEGRATION_JSON - dest.parent.mkdir(parents=True, exist_ok=True) - dest.write_text(json.dumps({ - "integration": integration_key, - "version": get_speckit_version(), - }, indent=2) + "\n", encoding="utf-8") + """Write ``.specify/integration.json`` with legacy-compatible state.""" + _write_integration_json_file( + project_root, + version=get_speckit_version(), + integration_key=integration_key, + installed_integrations=installed_integrations, + settings=integration_settings, + ) + + +def _clear_init_options_for_integration(project_root: Path, integration_key: str) -> None: + """Clear active integration keys from init-options.json when they match.""" + opts = load_init_options(project_root) + if opts.get("integration") == 
integration_key or opts.get("ai") == integration_key: + opts.pop("integration", None) + opts.pop("ai", None) + opts.pop("ai_skills", None) + opts.pop("context_file", None) + save_init_options(project_root, opts) def _remove_integration_json(project_root: Path) -> None: @@ -1934,6 +1929,9 @@ def _remove_integration_json(project_root: Path) -> None: path.unlink() +_MANIFEST_READ_ERRORS = (ValueError, FileNotFoundError, OSError, UnicodeDecodeError) + + def _normalize_script_type(script_type: str, source: str) -> str: """Normalize and validate a script type from CLI/config sources.""" normalized = script_type.strip().lower() @@ -1957,6 +1955,75 @@ def _resolve_script_type(project_root: Path, script_type: str | None) -> str: return "ps" if os.name == "nt" else "sh" +def _resolve_integration_script_type( + project_root: Path, + state: dict[str, Any], + key: str, + script_type: str | None = None, +) -> str: + """Resolve script type for an integration, preferring stored settings.""" + if script_type: + return _normalize_script_type(script_type, "--script") + + stored = _integration_setting(state, key).get("script") + if isinstance(stored, str) and stored.strip(): + return _normalize_script_type(stored, f"{INTEGRATION_JSON} integration_settings.{key}.script") + + return _resolve_script_type(project_root, None) + + +def _resolve_integration_options( + integration: Any, + state: dict[str, Any], + key: str, + raw_options: str | None, +) -> tuple[str | None, dict[str, Any] | None]: + """Resolve raw and parsed options for an integration operation.""" + return _resolve_integration_options_impl( + integration, + state, + key, + raw_options, + parse_options=_parse_integration_options, + ) + + +def _set_default_integration( + project_root: Path, + state: dict[str, Any], + key: str, + integration: Any, + installed_keys: list[str], + *, + script_type: str | None = None, + raw_options: str | None = None, + parsed_options: dict[str, Any] | None = None, + refresh_templates: bool = True, 
+ refresh_templates_force: bool = False, +) -> None: + """Persist *key* as default and align active runtime metadata.""" + resolved_script = _resolve_integration_script_type(project_root, state, key, script_type) + settings = _with_integration_setting( + state, + key, + integration, + script_type=resolved_script, + raw_options=raw_options, + parsed_options=parsed_options, + ) + _write_integration_json(project_root, key, installed_keys, settings) + _update_init_options_for_integration(project_root, integration, script_type=resolved_script) + + if refresh_templates: + _refresh_shared_templates( + project_root, + invoke_separator=_invoke_separator_for_integration( + integration, {"integration_settings": settings}, key, parsed_options + ), + force=refresh_templates_force, + ) + + @integration_app.command("list") def integration_list( catalog: bool = typer.Option(False, "--catalog", help="Browse full catalog (built-in + community)"), @@ -1973,7 +2040,8 @@ def integration_list( raise typer.Exit(1) current = _read_integration_json(project_root) - installed_key = current.get("integration") + default_key = _default_integration_key(current) + installed_keys = set(_installed_integration_keys(current)) if catalog: from .integrations.catalog import IntegrationCatalog, IntegrationCatalogError @@ -1995,12 +2063,15 @@ def integration_list( table.add_column("Version") table.add_column("Source") table.add_column("Status") + table.add_column("Multi-install Safe") for entry in sorted(entries, key=lambda e: e["id"]): eid = entry["id"] cat_name = entry.get("_catalog_name", "") install_allowed = entry.get("_install_allowed", True) - if eid == installed_key: + if eid == default_key: + status = "[green]installed (default)[/green]" + elif eid in installed_keys: status = "[green]installed[/green]" elif eid in INTEGRATION_REGISTRY: status = "built-in" @@ -2008,12 +2079,16 @@ def integration_list( status = "discovery-only" else: status = "" + safe = "" + if eid in INTEGRATION_REGISTRY: + safe 
= "yes" if getattr(INTEGRATION_REGISTRY[eid], "multi_install_safe", False) else "no" table.add_row( eid, entry.get("name", eid), entry.get("version", ""), cat_name, status, + safe, ) console.print(table) @@ -2024,6 +2099,7 @@ def integration_list( table.add_column("Name") table.add_column("Status") table.add_column("CLI Required") + table.add_column("Multi-install Safe") for key in sorted(INTEGRATION_REGISTRY.keys()): integration = INTEGRATION_REGISTRY[key] @@ -2031,18 +2107,22 @@ def integration_list( name = cfg.get("name", key) requires_cli = cfg.get("requires_cli", False) - if key == installed_key: + if key == default_key: + status = "[green]installed (default)[/green]" + elif key in installed_keys: status = "[green]installed[/green]" else: status = "" cli_req = "yes" if requires_cli else "no (IDE)" - table.add_row(key, name, status, cli_req) + safe = "yes" if getattr(integration, "multi_install_safe", False) else "no" + table.add_row(key, name, status, cli_req, safe) console.print(table) - if installed_key: - console.print(f"\n[dim]Current integration:[/dim] [cyan]{installed_key}[/cyan]") + if installed_keys: + console.print(f"\n[dim]Default integration:[/dim] [cyan]{default_key or 'none'}[/cyan]") + console.print(f"[dim]Installed integrations:[/dim] [cyan]{', '.join(sorted(installed_keys))}[/cyan]") else: console.print("\n[yellow]No integration currently installed.[/yellow]") console.print("Install one with: [cyan]specify integration install [/cyan]") @@ -2052,6 +2132,7 @@ def integration_list( def integration_install( key: str = typer.Argument(help="Integration key to install (e.g. 
claude, copilot)"), script: str | None = typer.Option(None, "--script", help="Script type: sh or ps (default: from init-options.json or platform default)"), + force: bool = typer.Option(False, "--force", help="Allow multi-install when integrations are not declared safe"), integration_options: str | None = typer.Option(None, "--integration-options", help='Options for the integration (e.g. --integration-options="--commands-dir .myagent/cmds")'), ): """Install an integration into an existing project.""" @@ -2074,30 +2155,68 @@ def integration_install( raise typer.Exit(1) current = _read_integration_json(project_root) - installed_key = current.get("integration") + default_key = _default_integration_key(current) + installed_keys = _installed_integration_keys(current) - if installed_key and installed_key == key: + if key in installed_keys: console.print(f"[yellow]Integration '{key}' is already installed.[/yellow]") - console.print("Run [cyan]specify integration uninstall[/cyan] first, then reinstall.") + console.print( + f"Run [cyan]specify integration upgrade {key}[/cyan] to reinstall managed files, " + f"or [cyan]specify integration uninstall {key}[/cyan] first." + ) raise typer.Exit(0) - if installed_key: - console.print(f"[red]Error:[/red] Integration '{installed_key}' is already installed.") - console.print(f"Run [cyan]specify integration uninstall[/cyan] first, or use [cyan]specify integration switch {key}[/cyan].") - raise typer.Exit(1) + if installed_keys and not force: + unsafe_keys = [] + for installed_key in installed_keys: + installed_integration = get_integration(installed_key) + if not installed_integration or not getattr(installed_integration, "multi_install_safe", False): + unsafe_keys.append(installed_key) + if unsafe_keys or not getattr(integration, "multi_install_safe", False): + console.print( + f"[red]Error:[/red] Installed integrations: {', '.join(installed_keys)}." 
+ ) + if default_key: + console.print(f"Default integration: [cyan]{default_key}[/cyan].") + console.print( + "Installing multiple integrations is only automatic when all involved " + "integrations are declared multi-install safe." + ) + console.print( + f"Run [cyan]specify integration switch {key}[/cyan] to replace the default " + f"integration, or retry with [cyan]--force[/cyan] to opt in." + ) + raise typer.Exit(1) selected_script = _resolve_script_type(project_root, script) # Build parsed options from --integration-options so the integration # can determine its effective invoke separator before shared infra # is installed. - parsed_options: dict[str, Any] | None = None - if integration_options: - parsed_options = _parse_integration_options(integration, integration_options) + raw_options, parsed_options = _resolve_integration_options( + integration, current, key, integration_options + ) # Ensure shared infrastructure is present (safe to run unconditionally; # _install_shared_infra merges missing files without overwriting). 
- _install_shared_infra(project_root, selected_script, invoke_separator=integration.effective_invoke_separator(parsed_options)) + infra_integration = integration + infra_key = key + infra_parsed = parsed_options + if default_key: + default_integration = get_integration(default_key) + if default_integration is not None: + infra_integration = default_integration + infra_key = default_key + _, infra_parsed = _resolve_integration_options( + default_integration, current, default_key, None + ) + _install_shared_infra( + project_root, + selected_script, + invoke_separator=_invoke_separator_for_integration( + infra_integration, current, infra_key, infra_parsed + ), + ) if os.name != "nt": ensure_executable_scripts(project_root) @@ -2110,11 +2229,22 @@ def integration_install( project_root, manifest, parsed_options=parsed_options, script_type=selected_script, - raw_options=integration_options, + raw_options=raw_options, ) manifest.save() - _write_integration_json(project_root, integration.key) - _update_init_options_for_integration(project_root, integration, script_type=selected_script) + new_installed = _dedupe_integration_keys([*installed_keys, integration.key]) + new_default = default_key or integration.key + settings = _with_integration_setting( + current, + integration.key, + integration, + script_type=selected_script, + raw_options=raw_options, + parsed_options=parsed_options, + ) + _write_integration_json(project_root, new_default, new_installed, settings) + if new_default == integration.key: + _update_init_options_for_integration(project_root, integration, script_type=selected_script) except Exception as e: # Attempt rollback of any files written by setup @@ -2123,12 +2253,19 @@ def integration_install( except Exception as rollback_err: # Suppress so the original setup error remains the primary failure console.print(f"[yellow]Warning:[/yellow] Failed to roll back integration changes: {rollback_err}") - _remove_integration_json(project_root) + if installed_keys: + 
_write_integration_json( + project_root, default_key, installed_keys, _integration_settings(current) + ) + else: + _remove_integration_json(project_root) console.print(f"[red]Error:[/red] Failed to install integration: {e}") raise typer.Exit(1) name = (integration.config or {}).get("name", key) console.print(f"\n[green]✓[/green] Integration '{name}' installed successfully") + if default_key: + console.print(f"[dim]Default integration remains:[/dim] [cyan]{default_key}[/cyan]") def _parse_integration_options(integration: Any, raw_options: str) -> dict[str, Any] | None: @@ -2200,6 +2337,51 @@ def _update_init_options_for_integration( save_init_options(project_root, opts) +@integration_app.command("use") +def integration_use( + key: str = typer.Argument(help="Installed integration key to make the default"), + force: bool = typer.Option(False, "--force", help="Overwrite managed shared templates while changing the default"), +): + """Set the default integration without uninstalling other integrations.""" + from .integrations import get_integration + + project_root = Path.cwd() + + specify_dir = project_root / ".specify" + if not specify_dir.exists(): + console.print("[red]Error:[/red] Not a spec-kit project (no .specify/ directory)") + console.print("Run this command from a spec-kit project root") + raise typer.Exit(1) + + current = _read_integration_json(project_root) + installed_keys = _installed_integration_keys(current) + if key not in installed_keys: + console.print(f"[red]Error:[/red] Integration '{key}' is not installed.") + if installed_keys: + console.print(f"[yellow]Installed integrations:[/yellow] {', '.join(installed_keys)}") + else: + console.print("Install one with: [cyan]specify integration install [/cyan]") + raise typer.Exit(1) + + integration = get_integration(key) + if integration is None: + console.print(f"[red]Error:[/red] Unknown integration '{key}'") + raise typer.Exit(1) + + raw_options, parsed_options = _resolve_integration_options(integration, 
current, key, None) + _set_default_integration( + project_root, + current, + key, + integration, + installed_keys, + raw_options=raw_options, + parsed_options=parsed_options, + refresh_templates_force=force, + ) + console.print(f"[green]✓[/green] Default integration set to [bold]{key}[/bold].") + + @integration_app.command("uninstall") def integration_uninstall( key: str = typer.Argument(None, help="Integration key to uninstall (default: current integration)"), @@ -2218,16 +2400,17 @@ def integration_uninstall( raise typer.Exit(1) current = _read_integration_json(project_root) - installed_key = current.get("integration") + default_key = _default_integration_key(current) + installed_keys = _installed_integration_keys(current) if key is None: - if not installed_key: + if not default_key: console.print("[yellow]No integration is currently installed.[/yellow]") raise typer.Exit(0) - key = installed_key + key = default_key - if installed_key and installed_key != key: - console.print(f"[red]Error:[/red] Integration '{key}' is not the currently installed integration ('{installed_key}').") + if key not in installed_keys: + console.print(f"[red]Error:[/red] Integration '{key}' is not installed.") raise typer.Exit(1) integration = get_integration(key) @@ -2235,20 +2418,35 @@ def integration_uninstall( manifest_path = project_root / ".specify" / "integrations" / f"{key}.manifest.json" if not manifest_path.exists(): console.print(f"[yellow]No manifest found for integration '{key}'. 
Nothing to uninstall.[/yellow]") - _remove_integration_json(project_root) - # Clear integration-related keys from init-options.json - opts = load_init_options(project_root) - if opts.get("integration") == key or opts.get("ai") == key: - opts.pop("integration", None) - opts.pop("ai", None) - opts.pop("ai_skills", None) - opts.pop("context_file", None) - save_init_options(project_root, opts) + remaining = [installed for installed in installed_keys if installed != key] + new_default = default_key if default_key != key else (remaining[0] if remaining else None) + if remaining: + if default_key == key and new_default and (new_integration := get_integration(new_default)): + raw_options, parsed_options = _resolve_integration_options( + new_integration, current, new_default, None + ) + _set_default_integration( + project_root, + current, + new_default, + new_integration, + remaining, + raw_options=raw_options, + parsed_options=parsed_options, + ) + else: + _write_integration_json( + project_root, new_default, remaining, _integration_settings(current) + ) + else: + _remove_integration_json(project_root) + if default_key == key: + _clear_init_options_for_integration(project_root, key) raise typer.Exit(0) try: manifest = IntegrationManifest.load(key, project_root) - except (ValueError, FileNotFoundError) as exc: + except _MANIFEST_READ_ERRORS as exc: console.print(f"[red]Error:[/red] Integration manifest for '{key}' is unreadable.") console.print(f"Manifest: {manifest_path}") console.print( @@ -2265,16 +2463,31 @@ def integration_uninstall( if integration: integration.remove_context_section(project_root) - _remove_integration_json(project_root) + remaining = [installed for installed in installed_keys if installed != key] + new_default = default_key if default_key != key else (remaining[0] if remaining else None) + if remaining: + if default_key == key and new_default and (new_integration := get_integration(new_default)): + raw_options, parsed_options = 
_resolve_integration_options( + new_integration, current, new_default, None + ) + _set_default_integration( + project_root, + current, + new_default, + new_integration, + remaining, + raw_options=raw_options, + parsed_options=parsed_options, + ) + else: + _write_integration_json( + project_root, new_default, remaining, _integration_settings(current) + ) + else: + _remove_integration_json(project_root) - # Update init-options.json to clear the integration - opts = load_init_options(project_root) - if opts.get("integration") == key or opts.get("ai") == key: - opts.pop("integration", None) - opts.pop("ai", None) - opts.pop("ai_skills", None) - opts.pop("context_file", None) - save_init_options(project_root, opts) + if default_key == key: + _clear_init_options_for_integration(project_root, key) name = (integration.config or {}).get("name", key) if integration else key console.print(f"\n[green]✓[/green] Integration '{name}' uninstalled") @@ -2314,10 +2527,67 @@ def integration_switch( raise typer.Exit(1) current = _read_integration_json(project_root) - installed_key = current.get("integration") + installed_keys = _installed_integration_keys(current) + installed_key = _default_integration_key(current) if installed_key == target: - console.print(f"[yellow]Integration '{target}' is already installed. Nothing to switch.[/yellow]") + if integration_options is not None: + console.print( + "[red]Error:[/red] --integration-options cannot be used when switching " + "to an already installed integration." + ) + console.print( + f"Run [cyan]specify integration upgrade {target} --integration-options ...[/cyan] " + "to update managed files/options." 
+ ) + raise typer.Exit(1) + if force: + raw_options, parsed_options = _resolve_integration_options( + target_integration, current, target, None + ) + _set_default_integration( + project_root, + current, + target, + target_integration, + installed_keys, + raw_options=raw_options, + parsed_options=parsed_options, + refresh_templates_force=True, + ) + console.print( + f"\n[green]✓[/green] Default integration remains [bold]{target}[/bold]; " + "managed shared templates refreshed." + ) + raise typer.Exit(0) + console.print(f"[yellow]Integration '{target}' is already the default integration. Nothing to switch.[/yellow]") + raise typer.Exit(0) + + if target in installed_keys: + if integration_options is not None: + console.print( + "[red]Error:[/red] --integration-options cannot be used when switching " + "to an already installed integration." + ) + console.print( + f"Run [cyan]specify integration upgrade {target} --integration-options ...[/cyan] " + f"to update managed files/options, then [cyan]specify integration use {target}[/cyan]." 
+ ) + raise typer.Exit(1) + raw_options, parsed_options = _resolve_integration_options( + target_integration, current, target, None + ) + _set_default_integration( + project_root, + current, + target, + target_integration, + installed_keys, + raw_options=raw_options, + parsed_options=parsed_options, + refresh_templates_force=force, + ) + console.print(f"\n[green]✓[/green] Default integration set to [bold]{target}[/bold].") raise typer.Exit(0) selected_script = _resolve_script_type(project_root, script) @@ -2331,7 +2601,7 @@ def integration_switch( console.print(f"Uninstalling current integration: [cyan]{installed_key}[/cyan]") try: old_manifest = IntegrationManifest.load(installed_key, project_root) - except (ValueError, FileNotFoundError) as exc: + except _MANIFEST_READ_ERRORS as exc: console.print(f"[red]Error:[/red] Could not read integration manifest for '{installed_key}': {manifest_path}") console.print(f"[dim]{exc}[/dim]") console.print( @@ -2355,7 +2625,7 @@ def integration_switch( console.print(f" Removed {len(removed)} file(s)") if skipped: console.print(f" [yellow]⚠[/yellow] {len(skipped)} modified file(s) preserved") - except (ValueError, FileNotFoundError) as exc: + except _MANIFEST_READ_ERRORS as exc: console.print(f"[yellow]Warning:[/yellow] Could not read manifest for '{installed_key}': {exc}") else: console.print(f"[red]Error:[/red] Integration '{installed_key}' is installed but has no manifest.") @@ -2366,24 +2636,48 @@ def integration_switch( raise typer.Exit(1) # Clear metadata so a failed Phase 2 doesn't leave stale references - _remove_integration_json(project_root) - opts = load_init_options(project_root) - opts.pop("integration", None) - opts.pop("ai", None) - opts.pop("ai_skills", None) - opts.pop("context_file", None) - save_init_options(project_root, opts) + installed_keys = [installed for installed in installed_keys if installed != installed_key] + _clear_init_options_for_integration(project_root, installed_key) + if installed_keys: + 
fallback_key = installed_keys[0] + fallback_integration = get_integration(fallback_key) + if fallback_integration is not None: + raw_options, parsed_options = _resolve_integration_options( + fallback_integration, current, fallback_key, None + ) + _set_default_integration( + project_root, + current, + fallback_key, + fallback_integration, + installed_keys, + raw_options=raw_options, + parsed_options=parsed_options, + ) + else: + _write_integration_json( + project_root, fallback_key, installed_keys, _integration_settings(current) + ) + else: + _remove_integration_json(project_root) + current = _read_integration_json(project_root) # Build parsed options from --integration-options so the integration # can determine its effective invoke separator before shared infra # is installed. - parsed_options: dict[str, Any] | None = None - if integration_options: - parsed_options = _parse_integration_options(target_integration, integration_options) + raw_options, parsed_options = _resolve_integration_options( + target_integration, current, target, integration_options + ) # Ensure shared infrastructure is present (safe to run unconditionally; # _install_shared_infra merges missing files without overwriting). 
- _install_shared_infra(project_root, selected_script, invoke_separator=target_integration.effective_invoke_separator(parsed_options)) + _install_shared_infra( + project_root, + selected_script, + invoke_separator=_invoke_separator_for_integration( + target_integration, current, target, parsed_options + ), + ) if os.name != "nt": ensure_executable_scripts(project_root) @@ -2398,11 +2692,19 @@ def integration_switch( project_root, manifest, parsed_options=parsed_options, script_type=selected_script, - raw_options=integration_options, + raw_options=raw_options, ) manifest.save() - _write_integration_json(project_root, target_integration.key) - _update_init_options_for_integration(project_root, target_integration, script_type=selected_script) + _set_default_integration( + project_root, + current, + target_integration.key, + target_integration, + _dedupe_integration_keys([*installed_keys, target_integration.key]), + script_type=selected_script, + raw_options=raw_options, + parsed_options=parsed_options, + ) except Exception as e: # Attempt rollback of any files written by setup @@ -2411,7 +2713,28 @@ def integration_switch( except Exception as rollback_err: # Suppress so the original setup error remains the primary failure console.print(f"[yellow]Warning:[/yellow] Failed to roll back integration '{target}': {rollback_err}") - _remove_integration_json(project_root) + if installed_keys: + fallback_key = installed_keys[0] + fallback_integration = get_integration(fallback_key) + if fallback_integration is not None: + raw_options, parsed_options = _resolve_integration_options( + fallback_integration, current, fallback_key, None + ) + _set_default_integration( + project_root, + current, + fallback_key, + fallback_integration, + installed_keys, + raw_options=raw_options, + parsed_options=parsed_options, + ) + else: + _write_integration_json( + project_root, fallback_key, installed_keys, _integration_settings(current) + ) + else: + _remove_integration_json(project_root) 
console.print(f"[red]Error:[/red] Failed to install integration '{target}': {e}") raise typer.Exit(1) @@ -2443,7 +2766,8 @@ def integration_upgrade( raise typer.Exit(1) current = _read_integration_json(project_root) - installed_key = current.get("integration") + installed_key = _default_integration_key(current) + installed_keys = _installed_integration_keys(current) if key is None: if not installed_key: @@ -2451,11 +2775,8 @@ def integration_upgrade( raise typer.Exit(0) key = installed_key - if installed_key and installed_key != key: - console.print( - f"[red]Error:[/red] Integration '{key}' is not the currently installed integration ('{installed_key}')." - ) - console.print(f"Use [cyan]specify integration switch {key}[/cyan] instead.") + if key not in installed_keys: + console.print(f"[red]Error:[/red] Integration '{key}' is not installed.") raise typer.Exit(1) integration = get_integration(key) @@ -2471,7 +2792,7 @@ def integration_upgrade( try: old_manifest = IntegrationManifest.load(key, project_root) - except (ValueError, FileNotFoundError) as exc: + except _MANIFEST_READ_ERRORS as exc: console.print(f"[red]Error:[/red] Integration manifest for '{key}' is unreadable: {exc}") raise typer.Exit(1) @@ -2484,17 +2805,35 @@ def integration_upgrade( console.print("\nUse [cyan]--force[/cyan] to overwrite modified files, or resolve manually.") raise typer.Exit(1) - selected_script = _resolve_script_type(project_root, script) + selected_script = _resolve_integration_script_type(project_root, current, key, script) # Build parsed options from --integration-options so the integration # can determine its effective invoke separator before shared infra # is installed. 
- parsed_options: dict[str, Any] | None = None - if integration_options: - parsed_options = _parse_integration_options(integration, integration_options) + raw_options, parsed_options = _resolve_integration_options( + integration, current, key, integration_options + ) # Ensure shared infrastructure is up to date; --force overwrites existing files. - _install_shared_infra(project_root, selected_script, force=force, invoke_separator=integration.effective_invoke_separator(parsed_options)) + infra_integration = integration + infra_key = key + infra_parsed = parsed_options + if installed_key and installed_key != key: + default_integration = get_integration(installed_key) + if default_integration is not None: + infra_integration = default_integration + infra_key = installed_key + _, infra_parsed = _resolve_integration_options( + default_integration, current, installed_key, None + ) + _install_shared_infra( + project_root, + selected_script, + force=force, + invoke_separator=_invoke_separator_for_integration( + infra_integration, current, infra_key, infra_parsed + ), + ) if os.name != "nt": ensure_executable_scripts(project_root) @@ -2508,11 +2847,27 @@ def integration_upgrade( new_manifest, parsed_options=parsed_options, script_type=selected_script, - raw_options=integration_options, + raw_options=raw_options, ) new_manifest.save() - _write_integration_json(project_root, key) - _update_init_options_for_integration(project_root, integration, script_type=selected_script) + settings = _with_integration_setting( + current, + key, + integration, + script_type=selected_script, + raw_options=raw_options, + parsed_options=parsed_options, + ) + _write_integration_json(project_root, installed_key, installed_keys, settings) + if installed_key == key: + _update_init_options_for_integration(project_root, integration, script_type=selected_script) + _refresh_shared_templates( + project_root, + invoke_separator=_invoke_separator_for_integration( + integration, {"integration_settings": 
settings}, key, parsed_options + ), + force=force, + ) except Exception as exc: # Don't teardown — setup overwrites in-place, so teardown would # delete files that were working before the upgrade. Just report. diff --git a/src/specify_cli/integration_runtime.py b/src/specify_cli/integration_runtime.py new file mode 100644 index 0000000000..a36dcc672c --- /dev/null +++ b/src/specify_cli/integration_runtime.py @@ -0,0 +1,90 @@ +"""Runtime helpers for integration commands.""" + +from __future__ import annotations + +from collections.abc import Callable +from typing import Any + +from .integration_state import integration_setting, integration_settings + + +ParseOptions = Callable[[Any, str], dict[str, Any] | None] + + +def resolve_integration_options( + integration: Any, + state: dict[str, Any], + key: str, + raw_options: str | None, + *, + parse_options: ParseOptions, +) -> tuple[str | None, dict[str, Any] | None]: + """Resolve raw and parsed options for an integration operation.""" + if raw_options is not None: + return raw_options, parse_options(integration, raw_options) + + setting = integration_setting(state, key) + stored_raw = setting.get("raw_options") + if not isinstance(stored_raw, str): + stored_raw = None + + stored_parsed = setting.get("parsed_options") + if isinstance(stored_parsed, dict): + return stored_raw, stored_parsed or None + + if stored_raw: + return stored_raw, parse_options(integration, stored_raw) + + return None, None + + +def with_integration_setting( + state: dict[str, Any], + key: str, + integration: Any, + *, + script_type: str | None = None, + raw_options: str | None = None, + parsed_options: dict[str, Any] | None = None, +) -> dict[str, dict[str, Any]]: + """Return integration settings with *key* updated.""" + settings = integration_settings(state) + current = dict(settings.get(key, {})) + + if script_type: + current["script"] = script_type + if raw_options is not None: + current["raw_options"] = raw_options + elif "raw_options" in 
current and not current.get("raw_options"): + current.pop("raw_options", None) + + if parsed_options is not None: + current["parsed_options"] = parsed_options + elif raw_options is not None: + current.pop("parsed_options", None) + + current["invoke_separator"] = integration.effective_invoke_separator(parsed_options) + settings[key] = current + return settings + + +def invoke_separator_for_integration( + integration: Any, + state: dict[str, Any], + key: str, + parsed_options: dict[str, Any] | None = None, +) -> str: + """Resolve the invocation separator for stored/default integration state.""" + if parsed_options is not None: + return integration.effective_invoke_separator(parsed_options) + + setting = integration_setting(state, key) + stored_separator = setting.get("invoke_separator") + if isinstance(stored_separator, str) and stored_separator: + return stored_separator + + stored_parsed = setting.get("parsed_options") + if isinstance(stored_parsed, dict): + return integration.effective_invoke_separator(stored_parsed) + + return integration.effective_invoke_separator(None) diff --git a/src/specify_cli/integration_state.py b/src/specify_cli/integration_state.py new file mode 100644 index 0000000000..e697d7e540 --- /dev/null +++ b/src/specify_cli/integration_state.py @@ -0,0 +1,153 @@ +"""State helpers for installed AI agent integrations.""" + +from __future__ import annotations + +import json +from pathlib import Path +from typing import Any + + +INTEGRATION_JSON = ".specify/integration.json" +INTEGRATION_STATE_SCHEMA = 1 + + +def clean_integration_key(key: Any) -> str | None: + """Return a stripped integration key, or None for empty/non-string values.""" + if not isinstance(key, str) or not key.strip(): + return None + return key.strip() + + +def dedupe_integration_keys(keys: list[Any]) -> list[str]: + """Return a de-duplicated list of non-empty integration keys.""" + seen: set[str] = set() + deduped: list[str] = [] + for key in keys: + clean = 
clean_integration_key(key) + if clean is None: + continue + if clean in seen: + continue + seen.add(clean) + deduped.append(clean) + return deduped + + +def normalize_integration_settings(settings: Any) -> dict[str, dict[str, Any]]: + """Return JSON-safe per-integration runtime settings.""" + if not isinstance(settings, dict): + return {} + + normalized: dict[str, dict[str, Any]] = {} + for key, value in settings.items(): + if not isinstance(key, str) or not key.strip() or not isinstance(value, dict): + continue + + clean: dict[str, Any] = {} + script = value.get("script") + if isinstance(script, str) and script.strip(): + clean["script"] = script.strip() + + raw_options = value.get("raw_options") + if isinstance(raw_options, str): + clean["raw_options"] = raw_options + + parsed_options = value.get("parsed_options") + if isinstance(parsed_options, dict): + clean["parsed_options"] = parsed_options + + invoke_separator = value.get("invoke_separator") + if isinstance(invoke_separator, str) and invoke_separator.strip(): + clean["invoke_separator"] = invoke_separator.strip() + + if clean: + normalized[key.strip()] = clean + + return normalized + + +def normalize_integration_state(data: dict[str, Any]) -> dict[str, Any]: + """Normalize legacy and multi-install integration metadata.""" + legacy_key = clean_integration_key(data.get("integration")) + default_key = clean_integration_key(data.get("default_integration")) or legacy_key + + installed = data.get("installed_integrations") + installed_keys = dedupe_integration_keys(installed if isinstance(installed, list) else []) + if not default_key and installed_keys: + default_key = installed_keys[0] + if default_key and default_key not in installed_keys: + installed_keys.insert(0, default_key) + + settings = normalize_integration_settings(data.get("integration_settings")) + + normalized = dict(data) + normalized["integration_state_schema"] = INTEGRATION_STATE_SCHEMA + if default_key: + normalized["integration"] = default_key + 
normalized["default_integration"] = default_key + else: + normalized.pop("integration", None) + normalized.pop("default_integration", None) + normalized["installed_integrations"] = installed_keys + normalized["integration_settings"] = { + key: settings[key] for key in installed_keys if key in settings + } + return normalized + + +def default_integration_key(state: dict[str, Any]) -> str | None: + """Return the default integration key from normalized state.""" + key = state.get("default_integration") or state.get("integration") + return clean_integration_key(key) + + +def installed_integration_keys(state: dict[str, Any]) -> list[str]: + """Return installed integration keys from normalized state.""" + return dedupe_integration_keys(state.get("installed_integrations", [])) + + +def integration_settings(state: dict[str, Any]) -> dict[str, dict[str, Any]]: + """Return normalized per-integration settings from state.""" + return normalize_integration_settings(state.get("integration_settings")) + + +def integration_setting(state: dict[str, Any], key: str) -> dict[str, Any]: + """Return stored runtime settings for *key*.""" + return dict(integration_settings(state).get(key, {})) + + +def write_integration_json( + project_root: Path, + *, + version: str, + integration_key: str | None, + installed_integrations: list[str] | None = None, + settings: dict[str, dict[str, Any]] | None = None, +) -> None: + """Write ``.specify/integration.json`` with legacy-compatible state.""" + dest = project_root / INTEGRATION_JSON + dest.parent.mkdir(parents=True, exist_ok=True) + + integration_key = clean_integration_key(integration_key) + installed = dedupe_integration_keys(installed_integrations or []) + if integration_key and integration_key not in installed: + installed.insert(0, integration_key) + if not integration_key and installed: + integration_key = installed[0] + + normalized_settings = normalize_integration_settings(settings or {}) + normalized_settings = { + key: 
normalized_settings[key] for key in installed if key in normalized_settings + } + + data: dict[str, Any] = { + "version": version, + "integration_state_schema": INTEGRATION_STATE_SCHEMA, + "installed_integrations": installed, + "integration_settings": normalized_settings, + } + if integration_key: + data["integration"] = integration_key + data["default_integration"] = integration_key + + dest.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8") diff --git a/src/specify_cli/integrations/auggie/__init__.py b/src/specify_cli/integrations/auggie/__init__.py index 9715e936ef..08e20fbc25 100644 --- a/src/specify_cli/integrations/auggie/__init__.py +++ b/src/specify_cli/integrations/auggie/__init__.py @@ -19,3 +19,4 @@ class AuggieIntegration(MarkdownIntegration): "extension": ".md", } context_file = ".augment/rules/specify-rules.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/base.py b/src/specify_cli/integrations/base.py index f3b74b0c05..c46340ddff 100644 --- a/src/specify_cli/integrations/base.py +++ b/src/specify_cli/integrations/base.py @@ -87,6 +87,14 @@ class IntegrationBase(ABC): invoke_separator: str = "." """Separator used in slash-command invocations (``"."`` → ``/speckit.plan``).""" + multi_install_safe: bool = False + """Whether this integration is declared safe to install alongside others. + + Safe integrations must use a static, unique agent root, command directory, + and context file. Registry tests enforce those invariants for every + integration that sets this flag. 
+ """ + # -- Markers for managed context section ------------------------------ CONTEXT_MARKER_START = "" diff --git a/src/specify_cli/integrations/claude/__init__.py b/src/specify_cli/integrations/claude/__init__.py index 3e39db717e..88aef85285 100644 --- a/src/specify_cli/integrations/claude/__init__.py +++ b/src/specify_cli/integrations/claude/__init__.py @@ -53,6 +53,7 @@ class ClaudeIntegration(SkillsIntegration): "extension": "/SKILL.md", } context_file = "CLAUDE.md" + multi_install_safe = True @staticmethod def inject_argument_hint(content: str, hint: str) -> str: diff --git a/src/specify_cli/integrations/codebuddy/__init__.py b/src/specify_cli/integrations/codebuddy/__init__.py index 061ac7641f..980ac7fed7 100644 --- a/src/specify_cli/integrations/codebuddy/__init__.py +++ b/src/specify_cli/integrations/codebuddy/__init__.py @@ -19,3 +19,4 @@ class CodebuddyIntegration(MarkdownIntegration): "extension": ".md", } context_file = "CODEBUDDY.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/codex/__init__.py b/src/specify_cli/integrations/codex/__init__.py index b3b509b654..1c24a84bd2 100644 --- a/src/specify_cli/integrations/codex/__init__.py +++ b/src/specify_cli/integrations/codex/__init__.py @@ -27,6 +27,7 @@ class CodexIntegration(SkillsIntegration): "extension": "/SKILL.md", } context_file = "AGENTS.md" + multi_install_safe = True def build_exec_args( self, diff --git a/src/specify_cli/integrations/cursor_agent/__init__.py b/src/specify_cli/integrations/cursor_agent/__init__.py index a5472654fa..70af454ce9 100644 --- a/src/specify_cli/integrations/cursor_agent/__init__.py +++ b/src/specify_cli/integrations/cursor_agent/__init__.py @@ -26,6 +26,7 @@ class CursorAgentIntegration(SkillsIntegration): } context_file = ".cursor/rules/specify-rules.mdc" + multi_install_safe = True @classmethod def options(cls) -> list[IntegrationOption]: diff --git a/src/specify_cli/integrations/gemini/__init__.py 
b/src/specify_cli/integrations/gemini/__init__.py index d66f0b80bc..7c6fe159c7 100644 --- a/src/specify_cli/integrations/gemini/__init__.py +++ b/src/specify_cli/integrations/gemini/__init__.py @@ -19,3 +19,4 @@ class GeminiIntegration(TomlIntegration): "extension": ".toml", } context_file = "GEMINI.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/iflow/__init__.py b/src/specify_cli/integrations/iflow/__init__.py index 4acc2cf372..65d4d21c63 100644 --- a/src/specify_cli/integrations/iflow/__init__.py +++ b/src/specify_cli/integrations/iflow/__init__.py @@ -19,3 +19,4 @@ class IflowIntegration(MarkdownIntegration): "extension": ".md", } context_file = "IFLOW.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/junie/__init__.py b/src/specify_cli/integrations/junie/__init__.py index 0cc3b3f0ff..98d0494a8a 100644 --- a/src/specify_cli/integrations/junie/__init__.py +++ b/src/specify_cli/integrations/junie/__init__.py @@ -19,3 +19,4 @@ class JunieIntegration(MarkdownIntegration): "extension": ".md", } context_file = ".junie/AGENTS.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/kilocode/__init__.py b/src/specify_cli/integrations/kilocode/__init__.py index ffd38f741a..11674dd9f1 100644 --- a/src/specify_cli/integrations/kilocode/__init__.py +++ b/src/specify_cli/integrations/kilocode/__init__.py @@ -19,3 +19,4 @@ class KilocodeIntegration(MarkdownIntegration): "extension": ".md", } context_file = ".kilocode/rules/specify-rules.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/kimi/__init__.py b/src/specify_cli/integrations/kimi/__init__.py index 5421d48012..3b257768e2 100644 --- a/src/specify_cli/integrations/kimi/__init__.py +++ b/src/specify_cli/integrations/kimi/__init__.py @@ -36,6 +36,7 @@ class KimiIntegration(SkillsIntegration): "extension": "/SKILL.md", } context_file = "KIMI.md" + multi_install_safe = True @classmethod def options(cls) -> list[IntegrationOption]: 
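The `multi_install_safe = True` declarations above feed the install-time guard described in the reference docs: installing alongside existing integrations proceeds automatically only when the target and every already-installed integration declare themselves multi-install safe, and `--force` is the explicit opt-out. A minimal standalone sketch of that decision rule — the `Integration` dataclass and `can_coinstall` name are illustrative stand-ins, not the CLI's actual helpers:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Integration:
    """Toy stand-in for IntegrationBase: a key plus the safety flag."""
    key: str
    multi_install_safe: bool = False  # default mirrors base.py


def can_coinstall(target: Integration, installed: list[Integration], force: bool = False) -> bool:
    """Return True if *target* may be installed alongside *installed*.

    Mirrors the documented rule: the first install is always allowed;
    otherwise proceed automatically only when the target and all
    already-installed integrations are declared multi-install safe,
    unless --force explicitly opts in.
    """
    if not installed:
        return True  # first install into the project is always allowed
    if force:
        return True  # explicit opt-in bypasses the safety check
    return target.multi_install_safe and all(i.multi_install_safe for i in installed)


claude = Integration("claude", multi_install_safe=True)
codex = Integration("codex", multi_install_safe=True)
legacy = Integration("legacy")  # does not declare multi-install safety

print(can_coinstall(codex, [claude]))               # True: all involved are safe
print(can_coinstall(legacy, [claude]))              # False: target is not declared safe
print(can_coinstall(legacy, [claude], force=True))  # True: --force opts in
```

Note the rule is symmetric: an unsafe integration blocks co-install whether it is the target or already installed, which is why the docs direct users to `switch` (replace) or `--force` (opt in) rather than silently layering files.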
diff --git a/src/specify_cli/integrations/qodercli/__init__.py b/src/specify_cli/integrations/qodercli/__init__.py index 541001be17..ee2d4b6255 100644 --- a/src/specify_cli/integrations/qodercli/__init__.py +++ b/src/specify_cli/integrations/qodercli/__init__.py @@ -19,3 +19,4 @@ class QodercliIntegration(MarkdownIntegration): "extension": ".md", } context_file = "QODER.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/qwen/__init__.py b/src/specify_cli/integrations/qwen/__init__.py index d9d930152c..2506a57681 100644 --- a/src/specify_cli/integrations/qwen/__init__.py +++ b/src/specify_cli/integrations/qwen/__init__.py @@ -19,3 +19,4 @@ class QwenIntegration(MarkdownIntegration): "extension": ".md", } context_file = "QWEN.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/roo/__init__.py b/src/specify_cli/integrations/roo/__init__.py index 3c680e7e35..f610a3cc63 100644 --- a/src/specify_cli/integrations/roo/__init__.py +++ b/src/specify_cli/integrations/roo/__init__.py @@ -19,3 +19,4 @@ class RooIntegration(MarkdownIntegration): "extension": ".md", } context_file = ".roo/rules/specify-rules.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/shai/__init__.py b/src/specify_cli/integrations/shai/__init__.py index 7a9d1deb02..123953da72 100644 --- a/src/specify_cli/integrations/shai/__init__.py +++ b/src/specify_cli/integrations/shai/__init__.py @@ -19,3 +19,4 @@ class ShaiIntegration(MarkdownIntegration): "extension": ".md", } context_file = "SHAI.md" + multi_install_safe = True diff --git a/src/specify_cli/integrations/tabnine/__init__.py b/src/specify_cli/integrations/tabnine/__init__.py index 2928a214a7..0d0076bc56 100644 --- a/src/specify_cli/integrations/tabnine/__init__.py +++ b/src/specify_cli/integrations/tabnine/__init__.py @@ -19,3 +19,4 @@ class TabnineIntegration(TomlIntegration): "extension": ".toml", } context_file = "TABNINE.md" + multi_install_safe = True diff --git 
a/src/specify_cli/integrations/trae/__init__.py b/src/specify_cli/integrations/trae/__init__.py index 343a7527f8..4556487d07 100644 --- a/src/specify_cli/integrations/trae/__init__.py +++ b/src/specify_cli/integrations/trae/__init__.py @@ -27,6 +27,7 @@ class TraeIntegration(SkillsIntegration): "extension": "/SKILL.md", } context_file = ".trae/rules/project_rules.md" + multi_install_safe = True @classmethod def options(cls) -> list[IntegrationOption]: diff --git a/src/specify_cli/integrations/windsurf/__init__.py b/src/specify_cli/integrations/windsurf/__init__.py index f0f77d318e..ae5c3301f4 100644 --- a/src/specify_cli/integrations/windsurf/__init__.py +++ b/src/specify_cli/integrations/windsurf/__init__.py @@ -19,3 +19,4 @@ class WindsurfIntegration(MarkdownIntegration): "extension": ".md", } context_file = ".windsurf/rules/specify-rules.md" + multi_install_safe = True diff --git a/src/specify_cli/shared_infra.py b/src/specify_cli/shared_infra.py new file mode 100644 index 0000000000..ca0e69a0f7 --- /dev/null +++ b/src/specify_cli/shared_infra.py @@ -0,0 +1,220 @@ +"""Shared Spec Kit infrastructure installation helpers.""" + +from __future__ import annotations + +import shutil +from pathlib import Path +from typing import Any + +from .integrations.base import IntegrationBase +from .integrations.manifest import IntegrationManifest + + +def load_speckit_manifest( + project_path: Path, + *, + version: str, + console: Any | None = None, +) -> IntegrationManifest: + """Load the shared infrastructure manifest, preserving existing entries.""" + manifest_path = project_path / ".specify" / "integrations" / "speckit.manifest.json" + if manifest_path.exists(): + try: + manifest = IntegrationManifest.load("speckit", project_path) + manifest.version = version + return manifest + except (ValueError, FileNotFoundError, OSError, UnicodeDecodeError) as exc: + if console is not None: + console.print( + f"[yellow]Warning:[/yellow] Could not read shared infrastructure " + 
+            f"manifest at {manifest_path}: {exc}"
+        )
+        console.print(
+            "A new shared manifest will be created; previously tracked "
+            "shared files may be treated as untracked."
+        )
+        return IntegrationManifest("speckit", project_path, version=version)
+
+
+def shared_templates_source(
+    *,
+    core_pack: Path | None,
+    repo_root: Path,
+) -> Path:
+    """Return the bundled/source shared templates directory."""
+    if core_pack and (core_pack / "templates").is_dir():
+        return core_pack / "templates"
+    return repo_root / "templates"
+
+
+def shared_scripts_source(
+    *,
+    core_pack: Path | None,
+    repo_root: Path,
+) -> Path:
+    """Return the bundled/source shared scripts directory."""
+    if core_pack and (core_pack / "scripts").is_dir():
+        return core_pack / "scripts"
+    return repo_root / "scripts"
+
+
+def _shared_destination_label(project_path: Path, dest: Path) -> str:
+    try:
+        return dest.relative_to(project_path).as_posix()
+    except ValueError:
+        return str(dest)
+
+
+def _ensure_safe_shared_destination(project_path: Path, dest: Path) -> None:
+    """Refuse shared infra writes that would escape or follow symlinks."""
+    root = project_path.resolve()
+    try:
+        dest.parent.resolve().relative_to(root)
+    except (OSError, ValueError):
+        label = _shared_destination_label(project_path, dest)
+        raise ValueError(f"Shared infrastructure destination escapes project root: {label}") from None
+
+    label = _shared_destination_label(project_path, dest)
+    if dest.is_symlink():
+        raise ValueError(f"Refusing to overwrite symlinked shared infrastructure path: {label}")
+
+    if dest.exists():
+        try:
+            dest.resolve().relative_to(root)
+        except (OSError, ValueError):
+            raise ValueError(f"Shared infrastructure destination escapes project root: {label}") from None
+
+
+def _write_shared_text(project_path: Path, dest: Path, content: str) -> None:
+    _ensure_safe_shared_destination(project_path, dest)
+    dest.write_text(content, encoding="utf-8")
+
+
+def _copy_shared_file(project_path: Path, src: Path, dest: Path) -> None:
+    _ensure_safe_shared_destination(project_path, dest)
+    shutil.copy2(src, dest)
+
+
+def refresh_shared_templates(
+    project_path: Path,
+    *,
+    version: str,
+    core_pack: Path | None,
+    repo_root: Path,
+    console: Any,
+    invoke_separator: str,
+    force: bool = False,
+) -> None:
+    """Refresh default-sensitive shared templates without touching scripts."""
+    templates_src = shared_templates_source(core_pack=core_pack, repo_root=repo_root)
+    if not templates_src.is_dir():
+        return
+
+    manifest = load_speckit_manifest(project_path, version=version, console=console)
+    tracked_files = manifest.files
+    modified = set(manifest.check_modified())
+    skipped_files: list[str] = []
+
+    dest_templates = project_path / ".specify" / "templates"
+    dest_templates.mkdir(parents=True, exist_ok=True)
+    for src in templates_src.iterdir():
+        if not src.is_file() or src.name == "vscode-settings.json" or src.name.startswith("."):
+            continue
+
+        dst = dest_templates / src.name
+        _ensure_safe_shared_destination(project_path, dst)
+        rel = dst.relative_to(project_path).as_posix()
+        if dst.exists() and not force:
+            if rel not in tracked_files or rel in modified:
+                skipped_files.append(rel)
+                continue
+
+        content = src.read_text(encoding="utf-8")
+        content = IntegrationBase.resolve_command_refs(content, invoke_separator)
+        _write_shared_text(project_path, dst, content)
+        manifest.record_existing(rel)
+
+    manifest.save()
+
+    if skipped_files:
+        console.print(
+            f"[yellow]⚠[/yellow] {len(skipped_files)} modified or untracked shared template file(s) were not updated:"
+        )
+        for rel in skipped_files:
+            console.print(f"  {rel}")
+
+
+def install_shared_infra(
+    project_path: Path,
+    script_type: str,
+    *,
+    version: str,
+    core_pack: Path | None,
+    repo_root: Path,
+    console: Any,
+    force: bool = False,
+    invoke_separator: str = ".",
+) -> bool:
+    """Install shared scripts and templates into *project_path*."""
+    manifest = load_speckit_manifest(project_path, version=version, console=console)
+    skipped_files: list[str] = []
+
+    scripts_src = shared_scripts_source(core_pack=core_pack, repo_root=repo_root)
+    if scripts_src.is_dir():
+        dest_scripts = project_path / ".specify" / "scripts"
+        dest_scripts.mkdir(parents=True, exist_ok=True)
+        variant_dir = "bash" if script_type == "sh" else "powershell"
+        variant_src = scripts_src / variant_dir
+        if variant_src.is_dir():
+            dest_variant = dest_scripts / variant_dir
+            dest_variant.mkdir(parents=True, exist_ok=True)
+            for src_path in variant_src.rglob("*"):
+                if not src_path.is_file():
+                    continue
+
+                rel_path = src_path.relative_to(variant_src)
+                dst_path = dest_variant / rel_path
+                _ensure_safe_shared_destination(project_path, dst_path)
+                if dst_path.exists() and not force:
+                    skipped_files.append(str(dst_path.relative_to(project_path)))
+                    continue
+
+                dst_path.parent.mkdir(parents=True, exist_ok=True)
+                _copy_shared_file(project_path, src_path, dst_path)
+                rel = dst_path.relative_to(project_path).as_posix()
+                manifest.record_existing(rel)
+
+    templates_src = shared_templates_source(core_pack=core_pack, repo_root=repo_root)
+    if templates_src.is_dir():
+        dest_templates = project_path / ".specify" / "templates"
+        dest_templates.mkdir(parents=True, exist_ok=True)
+        for src in templates_src.iterdir():
+            if not src.is_file() or src.name == "vscode-settings.json" or src.name.startswith("."):
+                continue
+
+            dst = dest_templates / src.name
+            _ensure_safe_shared_destination(project_path, dst)
+            if dst.exists() and not force:
+                skipped_files.append(str(dst.relative_to(project_path)))
+                continue
+
+            content = src.read_text(encoding="utf-8")
+            content = IntegrationBase.resolve_command_refs(content, invoke_separator)
+            _write_shared_text(project_path, dst, content)
+            rel = dst.relative_to(project_path).as_posix()
+            manifest.record_existing(rel)
+
+    if skipped_files:
+        console.print(
+            f"[yellow]⚠[/yellow] {len(skipped_files)} shared infrastructure file(s) already exist and were not updated:"
+        )
+        for path in skipped_files:
+            console.print(f"  {path}")
+        console.print(
+            "To refresh shared infrastructure, run "
+            "[cyan]specify init --here --force[/cyan] or "
+            "[cyan]specify integration upgrade --force[/cyan]."
+        )
+
+    manifest.save()
+    return True
diff --git a/tests/integrations/test_cli.py b/tests/integrations/test_cli.py
index df48323ed2..bb71e7cda1 100644
--- a/tests/integrations/test_cli.py
+++ b/tests/integrations/test_cli.py
@@ -3,6 +3,7 @@
 import json
 import os
+import pytest
 import yaml
 
 from tests.conftest import strip_ansi
@@ -254,6 +255,100 @@ def test_shared_infra_skip_warning_displayed(self, tmp_path, capsys):
         normalized = " ".join(captured.out.split())
         assert "specify integration upgrade --force" in normalized
 
+    def test_shared_infra_warns_when_manifest_cannot_be_loaded(self, tmp_path, capsys):
+        """Invalid shared manifests warn before falling back to a new manifest."""
+        from specify_cli import _install_shared_infra
+
+        project = tmp_path / "bad-shared-manifest-test"
+        project.mkdir()
+        integrations_dir = project / ".specify" / "integrations"
+        integrations_dir.mkdir(parents=True)
+        manifest_path = integrations_dir / "speckit.manifest.json"
+        manifest_path.write_text("{not json", encoding="utf-8")
+
+        _install_shared_infra(project, "sh")
+
+        captured = capsys.readouterr()
+        assert "Could not read shared infrastructure manifest" in captured.out
+        assert "A new shared manifest will be created" in captured.out
+
+    def test_shared_infra_warns_when_manifest_cannot_be_decoded(self, tmp_path, capsys):
+        """Non-UTF-8 shared manifests warn before falling back to a new manifest."""
+        from specify_cli import _install_shared_infra
+
+        project = tmp_path / "bad-shared-manifest-encoding-test"
+        project.mkdir()
+        integrations_dir = project / ".specify" / "integrations"
+        integrations_dir.mkdir(parents=True)
+        manifest_path = integrations_dir / "speckit.manifest.json"
+        manifest_path.write_bytes(b"\xff\xfe\x00")
+
+        _install_shared_infra(project, "sh")
+
+        captured = capsys.readouterr()
+        assert "Could not read shared infrastructure manifest" in captured.out
+        assert "A new shared manifest will be created" in captured.out
+
+    @pytest.mark.skipif(not hasattr(os, "symlink"), reason="symlinks are unavailable")
+    def test_shared_infra_refuses_symlinked_script_destination(self, tmp_path):
+        """Shared script refreshes must not follow destination symlinks."""
+        from specify_cli import _install_shared_infra
+
+        project = tmp_path / "symlink-script-test"
+        project.mkdir()
+        (project / ".specify").mkdir()
+
+        outside = tmp_path / "outside-script.sh"
+        outside.write_text("# outside\n", encoding="utf-8")
+        scripts_dir = project / ".specify" / "scripts" / "bash"
+        scripts_dir.mkdir(parents=True)
+        os.symlink(outside, scripts_dir / "common.sh")
+
+        with pytest.raises(ValueError, match="Refusing to overwrite symlinked"):
+            _install_shared_infra(project, "sh", force=True)
+
+        assert outside.read_text(encoding="utf-8") == "# outside\n"
+
+    @pytest.mark.skipif(not hasattr(os, "symlink"), reason="symlinks are unavailable")
+    def test_shared_infra_refuses_symlinked_template_destination(self, tmp_path):
+        """Shared template installs must not follow destination symlinks."""
+        from specify_cli import _install_shared_infra
+
+        project = tmp_path / "symlink-template-test"
+        project.mkdir()
+        (project / ".specify").mkdir()
+
+        outside = tmp_path / "outside-template.md"
+        outside.write_text("# outside\n", encoding="utf-8")
+        templates_dir = project / ".specify" / "templates"
+        templates_dir.mkdir(parents=True)
+        os.symlink(outside, templates_dir / "plan-template.md")
+
+        with pytest.raises(ValueError, match="Refusing to overwrite symlinked"):
+            _install_shared_infra(project, "sh", force=True)
+
+        assert outside.read_text(encoding="utf-8") == "# outside\n"
+
+    @pytest.mark.skipif(not hasattr(os, "symlink"), reason="symlinks are unavailable")
+    def test_shared_template_refresh_refuses_symlinked_destination(self, tmp_path):
+        """Template-only refreshes must not follow destination symlinks."""
+        from specify_cli import _refresh_shared_templates
+
+        project = tmp_path / "symlink-refresh-test"
+        project.mkdir()
+        (project / ".specify").mkdir()
+
+        outside = tmp_path / "outside-refresh.md"
+        outside.write_text("# outside\n", encoding="utf-8")
+        templates_dir = project / ".specify" / "templates"
+        templates_dir.mkdir(parents=True)
+        os.symlink(outside, templates_dir / "plan-template.md")
+
+        with pytest.raises(ValueError, match="Refusing to overwrite symlinked"):
+            _refresh_shared_templates(project, invoke_separator=".", force=True)
+
+        assert outside.read_text(encoding="utf-8") == "# outside\n"
+
     def test_shared_infra_no_warning_when_forced(self, tmp_path, capsys):
         """No skip warning when force=True (all files overwritten)."""
         from specify_cli import _install_shared_infra
diff --git a/tests/integrations/test_integration_catalog.py b/tests/integrations/test_integration_catalog.py
index 6d82a6c390..a43933ae00 100644
--- a/tests/integrations/test_integration_catalog.py
+++ b/tests/integrations/test_integration_catalog.py
@@ -632,7 +632,7 @@ def test_upgrade_wrong_integration_key(self, tmp_path):
         finally:
             os.chdir(old)
         assert result.exit_code != 0
-        assert "not the currently installed integration" in result.output
+        assert "not installed" in result.output
 
     def test_upgrade_no_manifest(self, tmp_path):
         """Upgrade with missing manifest suggests fresh install."""
diff --git a/tests/integrations/test_integration_state.py b/tests/integrations/test_integration_state.py
new file mode 100644
index 0000000000..35c2f0c809
--- /dev/null
+++ b/tests/integrations/test_integration_state.py
@@ -0,0 +1,72 @@
+"""Tests for integration state normalization helpers."""
+
+import json
+
+from specify_cli.integration_state import (
+    INTEGRATION_JSON,
+    default_integration_key,
+    integration_setting,
+    normalize_integration_state,
+    write_integration_json,
+)
+
+
+def test_normalize_integration_state_strips_default_key_without_duplicates():
+    state = normalize_integration_state(
+        {
+            "default_integration": " claude ",
+            "integration": " claude ",
+            "installed_integrations": ["claude"],
+        }
+    )
+
+    assert state["integration"] == "claude"
+    assert state["default_integration"] == "claude"
+    assert state["installed_integrations"] == ["claude"]
+
+
+def test_normalize_integration_state_strips_legacy_key_fallback():
+    state = normalize_integration_state(
+        {
+            "integration": " codex ",
+            "installed_integrations": [],
+        }
+    )
+
+    assert state["integration"] == "codex"
+    assert state["default_integration"] == "codex"
+    assert state["installed_integrations"] == ["codex"]
+
+
+def test_default_integration_key_strips_raw_state_values():
+    assert default_integration_key({"default_integration": " claude "}) == "claude"
+    assert default_integration_key({"integration": " codex "}) == "codex"
+
+
+def test_integration_settings_strip_invoke_separator():
+    setting = integration_setting(
+        {
+            "integration_settings": {
+                "claude": {
+                    "invoke_separator": " - ",
+                }
+            }
+        },
+        "claude",
+    )
+
+    assert setting["invoke_separator"] == "-"
+
+
+def test_write_integration_json_strips_integration_key(tmp_path):
+    write_integration_json(
+        tmp_path,
+        version="1.2.3",
+        integration_key=" claude ",
+        installed_integrations=["claude"],
+    )
+
+    state = json.loads((tmp_path / INTEGRATION_JSON).read_text(encoding="utf-8"))
+    assert state["integration"] == "claude"
+    assert state["default_integration"] == "claude"
+    assert state["installed_integrations"] == ["claude"]
diff --git a/tests/integrations/test_integration_subcommand.py b/tests/integrations/test_integration_subcommand.py
index f5322bdf5e..ae6a615a57 100644
--- a/tests/integrations/test_integration_subcommand.py
+++ b/tests/integrations/test_integration_subcommand.py
@@ -31,6 +31,17 @@ def _init_project(tmp_path, integration="copilot"):
     return project
 
 
+def _write_invalid_manifest(project, key):
+    manifest = project / ".specify" / "integrations" / f"{key}.manifest.json"
+    manifest.write_bytes(b"\xff\xfe\x00")
+    return manifest
+
+
+def _integration_list_row_cells(output: str, key: str) -> list[str]:
+    row = next(line for line in output.splitlines() if line.startswith(f"│ {key}"))
+    return [cell.strip() for cell in row.split("│")[1:-1]]
+
+
 # ── list ─────────────────────────────────────────────────────────────
 
 
@@ -70,6 +81,20 @@ def test_list_shows_available_integrations(self, tmp_path):
         assert "claude" in result.output
         assert "gemini" in result.output
 
+    def test_list_shows_multi_install_safe_status(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, ["integration", "list"])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0
+        assert "Multi-install" in result.output
+        assert "Safe" in result.output
+        assert _integration_list_row_cells(result.output, "claude")[-1] == "yes"
+        assert _integration_list_row_cells(result.output, "copilot")[-1] == "no"
+
 
 # ── install ──────────────────────────────────────────────────────────
 
 
@@ -106,7 +131,9 @@ def test_install_already_installed(self, tmp_path):
             os.chdir(old_cwd)
         assert result.exit_code == 0
         assert "already installed" in result.output
-        assert "uninstall" in result.output
+        normalized = " ".join(result.output.split())
+        assert "specify integration upgrade copilot" in normalized
+        assert "specify integration uninstall copilot" in normalized
 
     def test_install_different_when_one_exists(self, tmp_path):
         project = _init_project(tmp_path, "copilot")
@@ -117,8 +144,112 @@
         finally:
             os.chdir(old_cwd)
         assert result.exit_code != 0
-        assert "already installed" in result.output
-        assert "uninstall" in result.output
+        assert "Installed integrations: copilot" in result.output
+        assert "Default integration: copilot" in result.output
+        assert "--force" in result.output
+
+    def test_install_multi_safe_integration(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+        assert "installed successfully" in result.output
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "claude"
+        assert data["default_integration"] == "claude"
+        assert data["integration_state_schema"] == 1
+        assert data["installed_integrations"] == ["claude", "codex"]
+        assert data["integration_settings"]["claude"]["invoke_separator"] == "-"
+        assert data["integration_settings"]["codex"]["invoke_separator"] == "-"
+
+        assert (project / ".claude" / "skills" / "speckit-plan" / "SKILL.md").exists()
+        assert (project / ".agents" / "skills" / "speckit-plan" / "SKILL.md").exists()
+
+    def test_install_additional_preserves_shared_manifest(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        shared_manifest = project / ".specify" / "integrations" / "speckit.manifest.json"
+        before = set(json.loads(shared_manifest.read_text(encoding="utf-8"))["files"])
+        assert before
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+
+        after = set(json.loads(shared_manifest.read_text(encoding="utf-8"))["files"])
+        assert before <= after
+
+    def test_install_multi_safe_migrates_legacy_state(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        int_json = project / ".specify" / "integration.json"
+        int_json.write_text(json.dumps({
+            "integration": "claude",
+            "version": "0.0.0",
+        }), encoding="utf-8")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+
+        data = json.loads(int_json.read_text(encoding="utf-8"))
+        assert data["integration"] == "claude"
+        assert data["default_integration"] == "claude"
+        assert data["installed_integrations"] == ["claude", "codex"]
+
+    def test_install_multi_unsafe_requires_force(self, tmp_path):
+        project = _init_project(tmp_path, "copilot")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "install", "claude",
+                "--script", "sh",
+            ])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+        assert "Installed integrations: copilot" in result.output
+        assert "multi-install safe" in result.output
+        assert "--force" in result.output
+
+    def test_install_multi_unsafe_allowed_with_force(self, tmp_path):
+        project = _init_project(tmp_path, "copilot")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "install", "claude",
+                "--script", "sh",
+                "--force",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "copilot"
+        assert data["installed_integrations"] == ["copilot", "claude"]
 
     def test_install_into_bare_project(self, tmp_path):
         """Install into a project with .specify/ but no integration."""
@@ -250,7 +381,68 @@ def test_uninstall_wrong_key(self, tmp_path):
         finally:
             os.chdir(old_cwd)
         assert result.exit_code != 0
-        assert "not the currently installed" in result.output
+        assert "not installed" in result.output
+
+    def test_uninstall_invalid_manifest_reports_cli_error(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        _write_invalid_manifest(project, "claude")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, ["integration", "uninstall", "claude"])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+        assert "manifest" in result.output
+        assert "unreadable" in result.output
+
+    def test_uninstall_non_default_preserves_default(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            result = runner.invoke(app, [
+                "integration", "uninstall", "codex",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+        assert not (project / ".agents" / "skills" / "speckit-plan" / "SKILL.md").exists()
+        assert (project / ".claude" / "skills" / "speckit-plan" / "SKILL.md").exists()
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "claude"
+        assert data["installed_integrations"] == ["claude"]
+
+    def test_uninstall_default_refreshes_templates_for_fallback(self, tmp_path):
+        project = _init_project(tmp_path, "gemini")
+        template = project / ".specify" / "templates" / "plan-template.md"
+        assert "/speckit.plan" in template.read_text(encoding="utf-8")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "claude",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            result = runner.invoke(app, ["integration", "uninstall", "gemini"], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "claude"
+        assert "/speckit-plan" in template.read_text(encoding="utf-8")
 
     def test_uninstall_preserves_shared_infra(self, tmp_path):
         """Shared scripts and templates are not removed by integration uninstall."""
@@ -271,6 +463,98 @@
         assert (project / ".specify" / "templates").is_dir()
 
 
+class TestIntegrationUse:
+    def test_use_installed_integration_sets_default(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            result = runner.invoke(app, ["integration", "use", "codex"], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "codex"
+        assert data["default_integration"] == "codex"
+        assert data["installed_integrations"] == ["claude", "codex"]
+
+        opts = json.loads((project / ".specify" / "init-options.json").read_text(encoding="utf-8"))
+        assert opts["integration"] == "codex"
+        assert opts["ai"] == "codex"
+
+    def test_use_requires_installed_integration(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, ["integration", "use", "codex"])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+        assert "not installed" in result.output
+
+    def test_use_refreshes_shared_templates_between_command_styles(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        template = project / ".specify" / "templates" / "plan-template.md"
+        assert "/speckit-plan" in template.read_text(encoding="utf-8")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "gemini",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            use_gemini = runner.invoke(app, ["integration", "use", "gemini"], catch_exceptions=False)
+            assert use_gemini.exit_code == 0, use_gemini.output
+            assert "/speckit.plan" in template.read_text(encoding="utf-8")
+
+            use_claude = runner.invoke(app, ["integration", "use", "claude"], catch_exceptions=False)
+            assert use_claude.exit_code == 0, use_claude.output
+            assert "/speckit-plan" in template.read_text(encoding="utf-8")
+        finally:
+            os.chdir(old_cwd)
+
+    def test_use_preserves_modified_templates_unless_forced(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        template = project / ".specify" / "templates" / "plan-template.md"
+        template.write_text("custom template with /speckit-plan\n", encoding="utf-8")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "gemini",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            use_gemini = runner.invoke(app, ["integration", "use", "gemini"], catch_exceptions=False)
+            assert use_gemini.exit_code == 0, use_gemini.output
+            assert template.read_text(encoding="utf-8") == "custom template with /speckit-plan\n"
+
+            force_use = runner.invoke(app, [
+                "integration", "use", "gemini",
+                "--force",
+            ], catch_exceptions=False)
+            assert force_use.exit_code == 0, force_use.output
+        finally:
+            os.chdir(old_cwd)
+
+        updated = template.read_text(encoding="utf-8")
+        assert "/speckit.plan" in updated
+        assert "custom template" not in updated
+
+
 # ── switch ───────────────────────────────────────────────────────────
 
 
@@ -296,6 +580,22 @@ def test_switch_unknown_target(self, tmp_path):
         assert result.exit_code != 0
         assert "Unknown integration" in result.output
 
+    def test_switch_invalid_current_manifest_reports_cli_error(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        _write_invalid_manifest(project, "claude")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "switch", "codex",
+                "--script", "sh",
+            ])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+        assert "Could not read integration manifest" in result.output
+
     def test_switch_same_noop(self, tmp_path):
         project = _init_project(tmp_path, "copilot")
         old_cwd = os.getcwd()
@@ -305,7 +605,48 @@
         finally:
             os.chdir(old_cwd)
         assert result.exit_code == 0
-        assert "already installed" in result.output
+        assert "already the default integration" in result.output
+
+    def test_switch_same_force_refreshes_shared_templates(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        template = project / ".specify" / "templates" / "plan-template.md"
+        template.write_text("# custom shared template\n", encoding="utf-8")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, [
+                "integration", "switch", "claude",
+                "--force",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+        assert "managed shared templates refreshed" in result.output
+        assert "/speckit-plan" in template.read_text(encoding="utf-8")
+
+    def test_switch_installed_target_rejects_integration_options(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            result = runner.invoke(app, [
+                "integration", "switch", "codex",
+                "--integration-options", "--bogus",
+            ])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+        assert "--integration-options cannot be used" in result.output
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["default_integration"] == "claude"
 
     def test_switch_between_integrations(self, tmp_path):
         project = _init_project(tmp_path, "claude")
@@ -376,6 +717,79 @@ def test_switch_from_nothing(self, tmp_path):
         data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
         assert data["integration"] == "claude"
 
+    def test_failed_switch_keeps_fallback_metadata_consistent(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "codex",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            result = runner.invoke(app, [
+                "integration", "switch", "generic",
+                "--script", "sh",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "codex"
+        assert data["installed_integrations"] == ["codex"]
+
+        opts = json.loads((project / ".specify" / "init-options.json").read_text(encoding="utf-8"))
+        assert opts["integration"] == "codex"
+        assert opts["ai"] == "codex"
+
+        template = project / ".specify" / "templates" / "plan-template.md"
+        assert "/speckit-plan" in template.read_text(encoding="utf-8")
+
+
+class TestIntegrationUpgrade:
+    def test_upgrade_invalid_manifest_reports_cli_error(self, tmp_path):
+        project = _init_project(tmp_path, "claude")
+        _write_invalid_manifest(project, "claude")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            result = runner.invoke(app, ["integration", "upgrade", "claude"])
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code != 0
+        assert "manifest" in result.output
+        assert "unreadable" in result.output
+
+    def test_upgrade_non_default_keeps_default_template_invocations(self, tmp_path):
+        project = _init_project(tmp_path, "gemini")
+        template = project / ".specify" / "templates" / "plan-template.md"
+        assert "/speckit.plan" in template.read_text(encoding="utf-8")
+
+        old_cwd = os.getcwd()
+        try:
+            os.chdir(project)
+            install = runner.invoke(app, [
+                "integration", "install", "claude",
+                "--script", "sh",
+            ], catch_exceptions=False)
+            assert install.exit_code == 0, install.output
+
+            result = runner.invoke(app, [
+                "integration", "upgrade", "claude",
+                "--script", "sh",
+                "--force",
+            ], catch_exceptions=False)
+        finally:
+            os.chdir(old_cwd)
+        assert result.exit_code == 0, result.output
+
+        data = json.loads((project / ".specify" / "integration.json").read_text(encoding="utf-8"))
+        assert data["integration"] == "gemini"
+        assert "/speckit.plan" in template.read_text(encoding="utf-8")
+
 
 # ── Full lifecycle ───────────────────────────────────────────────────
 
diff --git a/tests/integrations/test_registry.py b/tests/integrations/test_registry.py
index 8ab1425148..1b36501056 100644
--- a/tests/integrations/test_registry.py
+++ b/tests/integrations/test_registry.py
@@ -1,7 +1,13 @@
 """Tests for INTEGRATION_REGISTRY — mechanics, completeness, and registrar alignment."""
 
+import json
+import os
+from pathlib import PurePosixPath
+
 import pytest
+from typer.testing import CliRunner
 
+from specify_cli import app
 from specify_cli.integrations import (
     INTEGRATION_REGISTRY,
     _register,
@@ -25,6 +31,72 @@
 ]
 
 
+def _multi_install_safe_keys() -> list[str]:
+    return sorted(
+        key
+        for key, integration in INTEGRATION_REGISTRY.items()
+        if integration.multi_install_safe
+    )
+
+
+def _multi_install_safe_pairs() -> list[tuple[str, str]]:
+    safe_keys = _multi_install_safe_keys()
+    return [
+        (safe_keys[left], safe_keys[right])
+        for left in range(len(safe_keys))
+        for right in range(left + 1, len(safe_keys))
+    ]
+
+
+def _posix_path(value: str | None) -> str | None:
+    if not value:
+        return None
+    return PurePosixPath(value).as_posix()
+
+
+def _integration_root_dir(key: str) -> str | None:
+    integration = INTEGRATION_REGISTRY[key]
+    cfg = integration.config if isinstance(integration.config, dict) else {}
+    return _posix_path(cfg.get("folder"))
+
+
+def _integration_commands_dir(key: str) -> str | None:
+    integration = INTEGRATION_REGISTRY[key]
+    cfg = integration.config if isinstance(integration.config, dict) else {}
+    folder = cfg.get("folder")
+    if not folder:
+        return None
+    subdir = cfg.get("commands_subdir", "commands")
+    return (PurePosixPath(folder) / subdir).as_posix()
+
+
+def _paths_overlap(first: str | None, second: str | None) -> bool:
+    if not first or not second:
+        return False
+    left = PurePosixPath(first)
+    right = PurePosixPath(second)
+    try:
+        left.relative_to(right)
+        return True
+    except ValueError:
+        pass
+    try:
+        right.relative_to(left)
+        return True
+    except ValueError:
+        return False
+
+
+def _path_is_inside(path: str | None, directory: str | None) -> bool:
+    if not path or not directory:
+        return False
+    try:
+        PurePosixPath(path).relative_to(PurePosixPath(directory))
+        return True
+    except ValueError:
+        return False
+
+
 class TestRegistry:
     def test_registry_is_dict(self):
         assert isinstance(INTEGRATION_REGISTRY, dict)
@@ -85,3 +157,134 @@ def test_no_stale_cursor_shorthand(self):
         """The old 'cursor' shorthand must not appear in AGENT_CONFIGS."""
         from specify_cli.agents import CommandRegistrar
         assert "cursor" not in CommandRegistrar.AGENT_CONFIGS
+
+
+class TestMultiInstallSafeContracts:
+    """Declared safe integrations must stay isolated from each other."""
+
+    @pytest.mark.parametrize("key", _multi_install_safe_keys())
+    def test_safe_integrations_have_static_isolated_paths(self, key):
+        integration = INTEGRATION_REGISTRY[key]
+
+        assert _integration_root_dir(key), (
+            f"{key} is declared multi-install safe but has no static root directory"
+        )
+        assert _integration_commands_dir(key), (
+            f"{key} is declared multi-install safe but has no static commands directory"
+        )
+        assert integration.context_file, (
+            f"{key} is declared multi-install safe but has no context file"
+        )
+
+    @pytest.mark.parametrize(("first", "second"), _multi_install_safe_pairs())
+    def test_safe_integrations_have_distinct_agent_roots(self, first, second):
+        assert not _paths_overlap(_integration_root_dir(first), _integration_root_dir(second)), (
+            f"{first} and {second} are declared multi-install safe but have "
+            f"overlapping agent roots {_integration_root_dir(first)!r} and "
+            f"{_integration_root_dir(second)!r}"
+        )
+
+    @pytest.mark.parametrize(("first", "second"), _multi_install_safe_pairs())
+    def test_safe_integrations_have_distinct_command_dirs(self, first, second):
+        assert not _paths_overlap(_integration_commands_dir(first), _integration_commands_dir(second)), (
+            f"{first} and {second} are declared multi-install safe but have "
+            f"overlapping command directories {_integration_commands_dir(first)!r} and "
+            f"{_integration_commands_dir(second)!r}"
+        )
+
+    @pytest.mark.parametrize(("first", "second"), _multi_install_safe_pairs())
+    def test_safe_integrations_have_distinct_context_files(self, first, second):
+        first_context = _posix_path(INTEGRATION_REGISTRY[first].context_file)
+        second_context = _posix_path(INTEGRATION_REGISTRY[second].context_file)
+
+        assert first_context != second_context, (
+            f"{first} and {second} are declared multi-install safe but share "
+            f"context file {first_context!r}"
+        )
+
+    @pytest.mark.parametrize(("first", "second"), _multi_install_safe_pairs())
+    def test_safe_context_files_do_not_overlap_other_agent_roots(self, first, second):
+        first_context = _posix_path(INTEGRATION_REGISTRY[first].context_file)
+        second_context = _posix_path(INTEGRATION_REGISTRY[second].context_file)
+
+        assert not _path_is_inside(first_context, _integration_root_dir(second)), (
+            f"{first} context file {first_context!r} lives under {second} "
+            f"agent root {_integration_root_dir(second)!r}"
+        )
+        assert not _path_is_inside(second_context, _integration_root_dir(first)), (
+            f"{second} context file {second_context!r} lives under {first} "
+            f"agent root {_integration_root_dir(first)!r}"
+        )
+
+    @pytest.mark.parametrize(("first", "second"), _multi_install_safe_pairs())
+    def test_safe_context_files_do_not_overlap_other_command_dirs(self, first, second):
+        first_context = _posix_path(INTEGRATION_REGISTRY[first].context_file)
+        second_context = _posix_path(INTEGRATION_REGISTRY[second].context_file)
+
+        assert not _path_is_inside(first_context, _integration_commands_dir(second)), (
+            f"{first} context file {first_context!r} lives under {second} "
+            f"commands directory {_integration_commands_dir(second)!r}"
+        )
+        assert not _path_is_inside(second_context, _integration_commands_dir(first)), (
+            f"{second} context file {second_context!r} lives under {first} "
+            f"commands directory {_integration_commands_dir(first)!r}"
+        )
+
+    @pytest.mark.parametrize(("first", "second"), _multi_install_safe_pairs())
+    def test_safe_integrations_have_disjoint_manifests(
+        self,
+        tmp_path,
+        first,
+        second,
+    ):
+        for initial, additional in ((first, second), (second, first)):
+            project_root = tmp_path / f"project-{initial}-{additional}"
+            project_root.mkdir()
+            runner = CliRunner()
+
+            original_cwd = os.getcwd()
+            try:
+                os.chdir(project_root)
+                init_result = runner.invoke(
+                    app,
+                    [
+                        "init",
+                        "--here",
+                        "--integration",
+                        initial,
+                        "--script",
+                        "sh",
+                        "--no-git",
+                        "--ignore-agent-tools",
+                    ],
+                    catch_exceptions=False,
+                )
+                assert init_result.exit_code == 0, init_result.output
+
+                install_result = runner.invoke(
+                    app,
+                    ["integration", "install", additional, "--script", "sh"],
+                    catch_exceptions=False,
+                )
+                assert install_result.exit_code == 0, install_result.output
+            finally:
+                os.chdir(original_cwd)
+
+            initial_manifest = json.loads(
+                (
+                    project_root / ".specify" / "integrations" / f"{initial}.manifest.json"
+                ).read_text(encoding="utf-8")
+            )
+            additional_manifest = json.loads(
+                (
+                    project_root / ".specify" / "integrations" / f"{additional}.manifest.json"
+                ).read_text(encoding="utf-8")
+            )
+
+            initial_files = set(initial_manifest.get("files", {}))
+            additional_files = set(additional_manifest.get("files", {}))
+
+            assert initial_files.isdisjoint(additional_files), (
+                f"{initial} and {additional} are declared multi-install safe but both manage "
+                f"these files: {sorted(initial_files & additional_files)}"
+            )
diff --git a/tests/test_presets.py b/tests/test_presets.py
index ee4a6dddb1..d6cfcb8bdf 100644
--- a/tests/test_presets.py
+++ b/tests/test_presets.py
@@ -14,6 +14,7 @@
 import json
 import tempfile
 import shutil
+import warnings
 import zipfile
 from pathlib import Path
 from datetime import datetime, timezone
@@ -1889,6 +1890,10 @@ def test_url_cache_expired(self, project_dir):
 
 
 SELF_TEST_PRESET_DIR = Path(__file__).parent.parent / "presets" / "self-test"
+SELF_TEST_WRAP_WARNING = (
+    r"Cannot compose command 'speckit\.wrap-test': no base layer\. "
+    r"Stale command files may remain\."
+)
 
 CORE_TEMPLATE_NAMES = [
     "spec-template",
@@ -1899,6 +1904,18 @@
 ]
 
 
+def install_self_test_preset(manager: PresetManager, speckit_version: str = "0.1.5") -> PresetManifest:
+    """Install self-test while filtering its intentionally missing wrap base."""
+    with warnings.catch_warnings():
+        warnings.filterwarnings(
+            "ignore",
+            message=SELF_TEST_WRAP_WARNING,
+            category=UserWarning,
+            module=r"specify_cli\.presets",
+        )
+        return manager.install_from_directory(SELF_TEST_PRESET_DIR, speckit_version)
+
+
 class TestSelfTestPreset:
     """Tests using the self-test preset that ships with the repo."""
 
@@ -1939,7 +1956,7 @@ def test_self_test_templates_have_marker(self):
     def test_install_self_test_preset(self, project_dir):
         """Test installing the self-test preset from its directory."""
         manager = PresetManager(project_dir)
-        manifest = manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        manifest = install_self_test_preset(manager)
         assert manifest.id == "self-test"
         assert manager.registry.is_installed("self-test")
 
@@ -1952,7 +1969,7 @@ def test_self_test_overrides_all_core_templates(self, project_dir):
 
         # Install self-test preset
         manager = PresetManager(project_dir)
-        manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        install_self_test_preset(manager)
 
         # Every core template should now resolve from the preset
         resolver = PresetResolver(project_dir)
@@ -1971,7 +1988,7 @@ def test_self_test_resolve_with_source(self, project_dir):
             (templates_dir / f"{name}.md").write_text(f"# Core {name}\n")
 
         manager = PresetManager(project_dir)
-        manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        install_self_test_preset(manager)
 
         resolver = PresetResolver(project_dir)
         for name in CORE_TEMPLATE_NAMES:
@@ -1988,7 +2005,7 @@ def test_self_test_removal_restores_core(self, project_dir):
             (templates_dir / f"{name}.md").write_text(f"# Core {name}\n")
 
         manager = PresetManager(project_dir)
-        manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        install_self_test_preset(manager)
         manager.remove("self-test")
 
         resolver = PresetResolver(project_dir)
@@ -2024,7 +2041,7 @@ def test_self_test_registers_commands_for_claude(self, project_dir):
         claude_dir.mkdir(parents=True)
 
         manager = PresetManager(project_dir)
-        manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        install_self_test_preset(manager)
 
         # Check the skill was registered
         cmd_file = claude_dir / "speckit-specify" / "SKILL.md"
@@ -2040,7 +2057,7 @@ def test_self_test_registers_commands_for_gemini(self, project_dir):
         gemini_dir.mkdir(parents=True)
 
         manager = PresetManager(project_dir)
-        manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        install_self_test_preset(manager)
 
         # Check the command was registered in TOML format
         cmd_file = gemini_dir / "speckit.specify.toml"
@@ -2055,7 +2072,7 @@ def test_self_test_unregisters_commands_on_remove(self, project_dir):
         claude_dir.mkdir(parents=True)
 
         manager = PresetManager(project_dir)
-        manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5")
+        install_self_test_preset(manager)
 
         cmd_file = claude_dir / "speckit-specify" / "SKILL.md"
         assert cmd_file.exists()
@@ -2066,7 +2083,7 @@ def test_self_test_unregisters_commands_on_remove(self, project_dir):
     def test_self_test_no_commands_without_agent_dirs(self, project_dir):
         """Test that no commands are registered when no agent
dirs exist.""" manager = PresetManager(project_dir) - manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5") + install_self_test_preset(manager) metadata = manager.registry.get("self-test") assert metadata["registered_commands"] == {} @@ -2215,8 +2232,7 @@ def test_skill_overridden_on_preset_install(self, project_dir, temp_dir): # Install self-test preset (has a command override for speckit.specify) manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + install_self_test_preset(manager) skill_file = skills_dir / "speckit-specify" / "SKILL.md" assert skill_file.exists() @@ -2235,8 +2251,7 @@ def test_skill_not_updated_when_ai_skills_disabled(self, project_dir, temp_dir): self._create_skill(skills_dir, "speckit-specify", body="untouched") manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + install_self_test_preset(manager) skill_file = skills_dir / "speckit-specify" / "SKILL.md" content = skill_file.read_text() @@ -2268,8 +2283,7 @@ def test_skill_not_updated_without_init_options(self, project_dir, temp_dir): self._create_skill(skills_dir, "speckit-specify", body="untouched") manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + install_self_test_preset(manager) skill_file = skills_dir / "speckit-specify" / "SKILL.md" file_content = skill_file.read_text() @@ -2289,8 +2303,7 @@ def test_skill_restored_on_preset_remove(self, project_dir, temp_dir): (core_cmds / "specify.md").write_text("---\ndescription: Core specify command\n---\n\nCore specify body\n") manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + 
install_self_test_preset(manager) # Verify preset content is in the skill skill_file = skills_dir / "speckit-specify" / "SKILL.md" @@ -2326,8 +2339,7 @@ def test_skill_restored_on_remove_resolves_script_placeholders(self, project_dir ) manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + install_self_test_preset(manager) manager.remove("self-test") content = (skills_dir / "speckit-specify" / "SKILL.md").read_text() @@ -2343,8 +2355,7 @@ def test_skill_not_overridden_when_skill_path_is_file(self, project_dir): (skills_dir / "speckit-specify").write_text("not-a-directory") manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + install_self_test_preset(manager) assert (skills_dir / "speckit-specify").is_file() metadata = manager.registry.get("self-test") @@ -2356,8 +2367,7 @@ def test_no_skills_registered_when_no_skill_dir_exists(self, project_dir, temp_d # Don't create skills dir — simulate --ai-skills never created them manager = PresetManager(project_dir) - SELF_TEST_DIR = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(SELF_TEST_DIR, "0.1.5") + install_self_test_preset(manager) metadata = manager.registry.get("self-test") assert metadata.get("registered_skills", []) == [] @@ -2558,8 +2568,7 @@ def test_kimi_legacy_dotted_skill_override_still_applies(self, project_dir, temp (project_dir / ".kimi" / "commands").mkdir(parents=True, exist_ok=True) manager = PresetManager(project_dir) - self_test_dir = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(self_test_dir, "0.1.5") + install_self_test_preset(manager) skill_file = skills_dir / "speckit.specify" / "SKILL.md" assert skill_file.exists() @@ -2579,8 +2588,7 @@ def 
test_kimi_skill_updated_even_when_ai_skills_disabled(self, project_dir, temp (project_dir / ".kimi" / "commands").mkdir(parents=True, exist_ok=True) manager = PresetManager(project_dir) - self_test_dir = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(self_test_dir, "0.1.5") + install_self_test_preset(manager) skill_file = skills_dir / "speckit-specify" / "SKILL.md" assert skill_file.exists() @@ -2759,8 +2767,7 @@ def test_preset_skill_registration_handles_non_dict_init_options(self, project_d self._create_skill(skills_dir, "speckit-specify", body="untouched") manager = PresetManager(project_dir) - self_test_dir = Path(__file__).parent.parent / "presets" / "self-test" - manager.install_from_directory(self_test_dir, "0.1.5") + install_self_test_preset(manager) skill_content = (skills_dir / "speckit-specify" / "SKILL.md").read_text() assert "untouched" in skill_content @@ -3419,7 +3426,7 @@ def test_end_to_end_wrap_via_self_test_preset(self, project_dir): ) manager = PresetManager(project_dir) - manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5") + install_self_test_preset(manager) written = (skill_subdir / "SKILL.md").read_text() assert "{CORE_TEMPLATE}" not in written @@ -3471,7 +3478,7 @@ def test_register_skills_inherits_scripts_from_core_when_preset_omits_them(self, ) manager = PresetManager(project_dir) - manager.install_from_directory(SELF_TEST_PRESET_DIR, "0.1.5") + install_self_test_preset(manager) written = (skill_subdir / "SKILL.md").read_text() # {SCRIPT} should have been resolved (not left as a literal placeholder)