feat(#31): lend harness provider connection to akm via LLM proxy shim #52

Draft

Copilot wants to merge 3 commits into main from copilot/lend-provider-connection-to-akm

Conversation

Contributor

Copilot AI commented May 4, 2026

When akm.llm is not configured, akm's index inference and graph passes are silently skipped. This PR wires both plugins to detect that state and install a stdio-contract proxy shim so akm can borrow the harness's existing provider connection for those passes.

Contract

The shim reads JSON {"prompt":"...","system":"...","model":"..."} from stdin, writes the completion text to stdout, and exits 0 on success. A non-zero exit degrades the index pass to a no-op (never fatal).
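The request side of that contract can be sketched as a small validator (a hypothetical helper; the interface and function names here are illustrative, not taken from the PR):

```typescript
// Shape of a shim request per the stdio contract above.
interface ShimRequest {
  prompt: string;
  system?: string;
  model?: string;
}

// Parse and validate one JSON request read from stdin.
// A malformed request throws, which the shim would turn into a
// non-zero exit (and the index pass degrades to a no-op).
function parseShimRequest(raw: string): ShimRequest {
  const data = JSON.parse(raw);
  if (typeof data.prompt !== "string" || data.prompt.length === 0) {
    throw new Error("shim request must include a non-empty 'prompt'");
  }
  return {
    prompt: data.prompt,
    system: typeof data.system === "string" ? data.system : undefined,
    model: typeof data.model === "string" ? data.model : undefined,
  };
}

const req = parseShimRequest('{"prompt":"summarize","model":"claude-sonnet"}');
console.log(req.prompt, req.model); // → summarize claude-sonnet
```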

OpenCode plugin

  • hasAkmLlmConfigured(cache?) — calls akm config get llm, treats empty/null/{} as unconfigured. Per-instance cache (closure) avoids repeated CLI calls in shell.env without leaking state across test runs.
  • installLlmProxyShim(stateDir) — writes the shim to $AKM_PLUGIN_STATE_DIR (falls back to XDG_STATE_HOME); catches all errors and returns null (non-fatal).
  • shell.env now sets AKM_LLM_PROXY_CMD when unconfigured, plus AKM_LLM_PROXY_MODEL sourced from the session model captured in experimental.chat.system.transform.
  • Shim supports Anthropic and OpenAI REST APIs; picks provider from env keys already present.

Claude plugin

  • has_akm_llm_configured() / install_llm_proxy_shim() shell functions added.
  • session_start() calls them when akm.llm is unset; exports AKM_LLM_PROXY_CMD for the hook process lifetime.
  • Claude shim tries claude -p first (reuses session auth) before falling back to REST API keys.
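The fallback order the Claude shim uses can be sketched as follows (in TypeScript purely for illustration; the actual Claude plugin shim is a shell script, and the claudeCliPresent flag and return labels are hypothetical names):

```typescript
// Pick a completion backend: prefer `claude -p` (reusing the session's
// auth), then REST APIs keyed off env vars, else signal "none" so the
// index pass degrades to a no-op.
function pickBackend(
  claudeCliPresent: boolean,
  env: Record<string, string | undefined>,
): string {
  if (claudeCliPresent) return "claude -p";
  if (env.ANTHROPIC_API_KEY) return "anthropic-rest";
  if (env.OPENAI_API_KEY) return "openai-rest";
  return "none";
}

console.log(pickBackend(false, { ANTHROPIC_API_KEY: "sk-ant" })); // → anthropic-rest
```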

Security

Provider credentials are never written to disk — the shim inherits ANTHROPIC_API_KEY / OPENAI_API_KEY from the harness process environment.

Pending

AKM CLI must still land AKM_LLM_PROXY_CMD support in resolveIndexPassLLM() (tracked in itlackey/akm). This PR covers the full plugin side so the integration is ready to wire up once the CLI contract ships.

Copilot AI and others added 2 commits May 4, 2026 06:31
- OpenCode plugin: add hasAkmLlmConfigured(), installLlmProxyShim(),
  buildLlmProxyShimContent(); update shell.env to detect missing akm.llm
  and set AKM_LLM_PROXY_CMD pointing to the written shim
- OpenCode plugin: capture session model in experimental.chat.system.transform
  and pass AKM_LLM_PROXY_MODEL hint to the shim via shell.env
- Claude plugin: add has_akm_llm_configured(), install_llm_proxy_shim();
  update session_start() to install the shim and export AKM_LLM_PROXY_CMD
- Shim supports Anthropic and OpenAI REST APIs; Claude plugin shim prefers
  the claude CLI when available; credentials stay in env, never written to disk
- Add 8 new tests covering both plugins (5 OpenCode, 3 Claude)

Agent-Logs-Url: https://github.com/itlackey/akm-plugins/sessions/4fa609f1-291b-4d18-a289-0b15511e910d

Co-authored-by: itlackey <6414031+itlackey@users.noreply.github.com>
Copilot AI changed the title [WIP] Lend the harness's provider connection to akm when no akm.llm is configured feat(#31): lend harness provider connection to akm via LLM proxy shim May 4, 2026
Copilot AI requested a review from itlackey May 4, 2026 06:34

Development

Successfully merging this pull request may close these issues.

Lend the harness's provider connection to akm when no akm.llm is configured
