
Define explicit LLM stream event seam #26626

Closed
kitlangton wants to merge 1 commit into dev from llm-service-event-seam

Conversation

@kitlangton
Contributor

Summary

  • Defines an opencode-owned LLM.Event union for the existing LLM.Service.stream seam.
  • Keeps the current AI SDK runtime path intact, with a single cast at the adapter boundary.
  • Adds a processor test that consumes explicit LLM.Event values directly.
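To make the seam concrete, here is a minimal sketch of what an opencode-owned event union and its adapter cast might look like. All names below (`LLMEvent`, `fromSdkPart`, `render`, and the event variants) are illustrative assumptions, not the PR's actual types.

```typescript
// Hypothetical owned event union for the stream seam.
// The real PR defines LLM.Event; these variants are illustrative only.
type LLMEvent =
  | { type: "text-delta"; text: string }
  | { type: "tool-call"; name: string; args: unknown }
  | { type: "finish"; reason: "stop" | "length" | "tool-calls" }

// Adapter boundary: the AI SDK runtime path stays intact, so a single
// cast here stands in for a full field-by-field mapping.
function fromSdkPart(part: unknown): LLMEvent {
  return part as LLMEvent
}

// A processor can then consume explicit events without importing SDK types.
function render(events: LLMEvent[]): string {
  let out = ""
  for (const e of events) {
    if (e.type === "text-delta") out += e.text
  }
  return out
}
```

The point of the single cast is that the owned union can be introduced (and tested against, as in the processor test above) before any runtime migration happens.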

Tests

  • cd packages/opencode && bun typecheck
  • cd packages/opencode && bun run test test/session/processor-effect.test.ts
  • cd packages/opencode && bun run test test/session/message-v2.test.ts
  • cd packages/opencode && bun run test test/session/retry.test.ts
  • cd packages/opencode && bun run test test/session/prompt.test.ts
  • cd packages/opencode && bun run test test/provider/transform.test.ts
  • git diff --check

Notes

  • No native @opencode-ai/llm runtime integration yet.
  • No AI SDK dependency removal in this slice.
  • test/session/compaction.test.ts still fails at baseline on a clean origin/dev checkout, with AsyncFiberError: An asynchronous Effect was executed with Effect.runSync.
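For context on that baseline failure, the sketch below is a minimal analog in plain TypeScript (not the effect library itself) of why a synchronous runner rejects async work: a sync runner has no way to produce a value from work that is still pending, so it must throw instead.

```typescript
// Illustrative analog of the failure mode; AsyncFiberError and runSync
// here are stand-ins, not the effect library's actual implementation.
class AsyncFiberError extends Error {}

function runSync<A>(thunk: () => A | Promise<A>): A {
  const result = thunk()
  if (result instanceof Promise) {
    // A synchronous runner cannot await; it fails fast, analogous to
    // Effect.runSync encountering an effect that suspends on async work.
    throw new AsyncFiberError(
      "An asynchronous Effect was executed with Effect.runSync",
    )
  }
  return result
}
```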

@kitlangton
Contributor Author

Closing in favor of #26639, which tests the migration path using @opencode-ai/llm LLMEvent as the session stream seam instead of preserving an AI SDK-shaped wrapper type.

@kitlangton kitlangton closed this May 10, 2026
