Add native LLM request adapter #26941

Draft

kitlangton wants to merge 3 commits into llm-native-event-adapter from llm-native-request-adapter

Conversation

@kitlangton kitlangton commented May 11, 2026

Summary

  • add an offline adapter that converts normalized session LLM inputs into native @opencode-ai/llm requests (see the sketch after this list)
  • map provider package metadata to native route ids, model refs, messages, tools, generation options, provider options, and headers
  • cover the adapter with focused request-shape tests, leaving the live AI SDK stream path unchanged
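
For orientation, here is a minimal sketch of the request-adapter shape, assuming normalized inputs that carry provider/model identifiers plus messages, tools, and generation options. Every type and name below (NormalizedInput, NativeRequest, toNativeRequest) is illustrative, not the actual @opencode-ai/llm API:

```ts
// Hypothetical types standing in for the real normalized session inputs
// and the native request shape from @opencode-ai/llm.
type NormalizedMessage = { role: "system" | "user" | "assistant"; content: string }

type NormalizedInput = {
  providerID: string
  modelID: string
  messages: NormalizedMessage[]
  tools?: { name: string; description: string; parameters: unknown }[]
  temperature?: number
  topP?: number
  headers?: Record<string, string>
  providerOptions?: Record<string, unknown>
}

type NativeRequest = {
  route: string
  model: { providerID: string; modelID: string }
  messages: NormalizedMessage[]
  tools: { name: string; description: string; parameters: unknown }[]
  options: { temperature?: number; topP?: number }
  providerOptions: Record<string, unknown>
  headers: Record<string, string>
}

// Pure mapping with no network calls, which is what makes the
// offline request-shape tests possible.
export function toNativeRequest(input: NormalizedInput): NativeRequest {
  return {
    route: `${input.providerID}/${input.modelID}`,
    model: { providerID: input.providerID, modelID: input.modelID },
    messages: input.messages,
    tools: input.tools ?? [],
    options: { temperature: input.temperature, topP: input.topP },
    providerOptions: input.providerOptions ?? {},
    headers: input.headers ?? {},
  }
}
```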

Test Plan

  • bun test --timeout 5000 test/session/llm-native.test.ts (a sketch of a request-shape test in this style follows the list)
  • bun run test -- test/session/llm-native.test.ts
  • bun typecheck
  • bunx prettier --write src/session/llm-native.ts test/session/llm-native.test.ts
  • bunx oxlint packages/opencode/src/session/llm-native.ts packages/opencode/test/session/llm-native.test.ts (run from the repo root)
  • git diff --check
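
As an illustration of what a focused request-shape test can look like, here is a sketch in bun:test style; the import path, the toNativeRequest export, and the assertions are assumptions, not the actual contents of test/session/llm-native.test.ts:

```ts
import { describe, expect, test } from "bun:test"
// Hypothetical export; the real module lives at src/session/llm-native.ts.
import { toNativeRequest } from "../../src/session/llm-native"

describe("llm-native request adapter", () => {
  test("maps provider and model ids onto the native route", () => {
    const request = toNativeRequest({
      providerID: "openai",
      modelID: "gpt-4o-mini",
      messages: [{ role: "user", content: "hello" }],
    })
    expect(request.route).toBe("openai/gpt-4o-mini")
    expect(request.model).toEqual({ providerID: "openai", modelID: "gpt-4o-mini" })
    // Offline by construction: the adapter never touches a provider client.
    expect(request.tools).toEqual([])
  })
})
```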

Stack

  1. Consume native LLM events in session processing #26639
  2. Add native LLM request adapter #26941 👈 current
  3. Compile native LLM requests in session tests #26946
  4. Add native OpenAI runtime opt-in #26947

@kitlangton force-pushed the llm-native-event-adapter branch from 1ec6529 to 48bce2b on May 12, 2026 at 01:32
@kitlangton force-pushed the llm-native-request-adapter branch from ec718a5 to 93e1043 on May 12, 2026 at 01:32