
feat(ai-client): support TanStack Start server functions in stream() adapter #508

Draft

tombeckenham wants to merge 11 commits into TanStack:main from tombeckenham:claude/add-usechat-server-functions-pBsj5

Conversation


tombeckenham (Contributor) commented Apr 27, 2026

🎯 Changes

Closes #509.

Wiring useChat to a TanStack Start server function currently fails to typecheck because stream()'s factory is typed as () => AsyncIterable<StreamChunk>, but a server function call returns Promise<AsyncIterable<StreamChunk>> (handler returns the chat stream) or Promise<Response> (handler returns toServerSentEventsResponse(stream)):

Property '[Symbol.asyncIterator]' is missing in type 'Promise<AsyncIterable<StreamChunk>>' but required in type 'AsyncIterable<StreamChunk>'. ts(2741)

This PR widens stream() (and rpcStream()) to accept all three shapes — awaiting the result and parsing SSE if a Response is returned — so a server function can be wired directly into useChat with end-to-end type safety from the call site to the handler:

const chatFn = createServerFn({ method: 'POST' })
  .inputValidator((data: { messages: Array<UIMessage> }) => data)
  .handler(({ data }) =>
    toServerSentEventsResponse(chat({ adapter, messages: data.messages })),
  )

useChat({ connection: stream((messages) => chatFn({ data: { messages } })) })
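For reference, the widened factory result is roughly the following union (exported by this PR as StreamFactoryResult; the import path for StreamChunk and the exact published declaration are assumptions here, not the shipped code):

import type { StreamChunk } from '@tanstack/ai-client' // assumed export path

// Sketch only: the three shapes the widened stream()/rpcStream() factory may return.
type StreamFactoryResult =
  | AsyncIterable<StreamChunk>
  | Promise<AsyncIterable<StreamChunk>>
  | Promise<Response>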

Reported by @vfshera in Discord.

Files touched

  • packages/typescript/ai-client/src/connection-adapters.ts — extend stream() and rpcStream(); factor shared SSE-from-Response helper.
  • packages/typescript/ai-client/src/index.ts — export new StreamFactoryResult type.
  • packages/typescript/ai-client/tests/connection-adapters.test.ts — 4 new tests (Promise<AsyncIterable<StreamChunk>>, Promise<Response>, Response error, rpcStream Promise).
  • docs/chat/connection-adapters.md — new "TanStack Start Server Functions" section.
  • examples/ts-react-chat — new /server-fn-chat route + chatFn server function demonstrating the pattern.
  • .changeset/stream-adapter-server-functions.md — @tanstack/ai-client minor.

✅ Checklist

  • I have followed the steps in the Contributing guide.
  • I have tested this code locally with pnpm run test:pr (test:lib, test:types, test:eslint all pass; manual /server-fn-chat smoke-test was not run — sandbox couldn't launch a browser).

🚀 Release Impact

  • This change affects published code, and I have generated a changeset (@tanstack/ai-client minor).

cc @vfshera

claude added 2 commits April 26, 2026 21:09
…adapter

The stream() factory now accepts Promise<AsyncIterable<StreamChunk>> and
Promise<Response> in addition to the existing AsyncIterable shape, so a
TanStack Start server function (which is just an async API endpoint) can
be wired directly into useChat without a route handler:

  useChat({
    connection: stream((messages) => chatFn({ data: { messages } })),
  })

When the factory returns a Response (e.g. via toServerSentEventsResponse),
the adapter parses the SSE body into chunks. rpcStream() likewise accepts
a Promise-returning RPC call.

Adds unit tests for both new shapes and a docs section in
chat/connection-adapters.md.
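In rough terms, the normalization described above looks like this — a sketch only, reusing the StreamFactoryResult/StreamChunk names from the PR description; parseSSEChunks is the helper this PR factors out, but its signature here is an assumption:

  // Sketch: collapse the three factory result shapes into one chunk stream.
  declare function parseSSEChunks(response: Response): AsyncIterable<StreamChunk>

  async function* normalizeFactoryResult(
    result: StreamFactoryResult,
  ): AsyncIterable<StreamChunk> {
    // Awaiting a plain AsyncIterable is a no-op, so one await covers both
    // the synchronous and the Promise-wrapped cases.
    const resolved = await result
    if (resolved instanceof Response) {
      // toServerSentEventsResponse(...) case: parse the SSE body into chunks.
      yield* parseSSEChunks(resolved)
    } else {
      yield* resolved
    }
  }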
Adds a working /server-fn-chat route that wires useChat to a TanStack
Start server function via the stream() connection adapter:

  useChat({
    connection: stream((messages) =>
      chatFn({ data: { messages: messages as UIMessage[] } }),
    ),
  })

The new chatFn handler in lib/server-fns.ts returns
toServerSentEventsResponse(chat({ ... })) — the stream() adapter awaits
the server function and parses the SSE response into chunks.

Sits alongside the existing index.tsx pattern (fetchServerSentEvents
to a route handler) so users can compare the two invocation styles.

coderabbitai Bot commented Apr 27, 2026

Important

Review skipped

Draft detected.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.



tombeckenham and others added 3 commits April 27, 2026 18:05
Lead with what stream() does (typed RPC into useChat), instead of
calling a server function "just a fancy/async API endpoint." Same
edits applied to the changeset and the stream() JSDoc.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
…server-functions-pBsj5

# Conflicts:
#	examples/ts-react-chat/src/components/Header.tsx
#	examples/ts-react-chat/src/lib/server-fns.ts
#	packages/typescript/ai-client/src/connection-adapters.ts

nx-cloud Bot commented Apr 27, 2026

View your CI Pipeline Execution ↗ for commit cb39b84

Command: nx run-many --targets=build --exclude=examples/**
Status: ✅ Succeeded
Duration: 1m 8s
Result: View ↗

☁️ Nx Cloud last updated this comment at 2026-04-28 00:59:24 UTC

The previous merge commit accidentally included stale references in
@tanstack/ai-fal that this PR shouldn't have touched.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>

pkg-pr-new Bot commented Apr 27, 2026

Open in StackBlitz

@tanstack/ai

npm i https://pkg.pr.new/@tanstack/ai@508

@tanstack/ai-anthropic

npm i https://pkg.pr.new/@tanstack/ai-anthropic@508

@tanstack/ai-client

npm i https://pkg.pr.new/@tanstack/ai-client@508

@tanstack/ai-code-mode

npm i https://pkg.pr.new/@tanstack/ai-code-mode@508

@tanstack/ai-code-mode-skills

npm i https://pkg.pr.new/@tanstack/ai-code-mode-skills@508

@tanstack/ai-devtools-core

npm i https://pkg.pr.new/@tanstack/ai-devtools-core@508

@tanstack/ai-elevenlabs

npm i https://pkg.pr.new/@tanstack/ai-elevenlabs@508

@tanstack/ai-event-client

npm i https://pkg.pr.new/@tanstack/ai-event-client@508

@tanstack/ai-fal

npm i https://pkg.pr.new/@tanstack/ai-fal@508

@tanstack/ai-gemini

npm i https://pkg.pr.new/@tanstack/ai-gemini@508

@tanstack/ai-grok

npm i https://pkg.pr.new/@tanstack/ai-grok@508

@tanstack/ai-groq

npm i https://pkg.pr.new/@tanstack/ai-groq@508

@tanstack/ai-isolate-cloudflare

npm i https://pkg.pr.new/@tanstack/ai-isolate-cloudflare@508

@tanstack/ai-isolate-node

npm i https://pkg.pr.new/@tanstack/ai-isolate-node@508

@tanstack/ai-isolate-quickjs

npm i https://pkg.pr.new/@tanstack/ai-isolate-quickjs@508

@tanstack/ai-ollama

npm i https://pkg.pr.new/@tanstack/ai-ollama@508

@tanstack/ai-openai

npm i https://pkg.pr.new/@tanstack/ai-openai@508

@tanstack/ai-openrouter

npm i https://pkg.pr.new/@tanstack/ai-openrouter@508

@tanstack/ai-preact

npm i https://pkg.pr.new/@tanstack/ai-preact@508

@tanstack/ai-react

npm i https://pkg.pr.new/@tanstack/ai-react@508

@tanstack/ai-react-ui

npm i https://pkg.pr.new/@tanstack/ai-react-ui@508

@tanstack/ai-solid

npm i https://pkg.pr.new/@tanstack/ai-solid@508

@tanstack/ai-solid-ui

npm i https://pkg.pr.new/@tanstack/ai-solid-ui@508

@tanstack/ai-svelte

npm i https://pkg.pr.new/@tanstack/ai-svelte@508

@tanstack/ai-vue

npm i https://pkg.pr.new/@tanstack/ai-vue@508

@tanstack/ai-vue-ui

npm i https://pkg.pr.new/@tanstack/ai-vue-ui@508

@tanstack/preact-ai-devtools

npm i https://pkg.pr.new/@tanstack/preact-ai-devtools@508

@tanstack/react-ai-devtools

npm i https://pkg.pr.new/@tanstack/react-ai-devtools@508

@tanstack/solid-ai-devtools

npm i https://pkg.pr.new/@tanstack/solid-ai-devtools@508

commit: e773e88

tombeckenham and others added 5 commits April 27, 2026 19:09
Type-check was failing on CI because the new server-function and RPC
async-iterable test fixtures yielded raw string literals for chunk
type, which don't satisfy the EventType enum required by StreamChunk.
Switch to the enum and add the required threadId to RUN_FINISHED.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
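For illustration only, a fixture in the shape this commit describes might look like the following — the import path, the extra runId field, and the trailing cast (needed because the sketch omits the event's remaining required fields) are all assumptions, not the actual test code:

  import { EventType } from '@tanstack/ai-client' // assumed import path
  import type { StreamChunk } from '@tanstack/ai-client'

  async function* fixtureStream(): AsyncIterable<StreamChunk> {
    // A raw string literal like { type: 'RUN_FINISHED' } no longer typechecks;
    // use the EventType enum and carry the required threadId.
    yield {
      type: EventType.RUN_FINISHED,
      threadId: 'thread-1',
      runId: 'run-1',
    } as StreamChunk
  }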
…abortSignal

Address review findings on the server-function stream() adapter PR:

- Synthesized RUN_FINISHED/RUN_ERROR events in normalizeConnectionAdapter
  no longer use `as unknown as StreamChunk`. Track threadId/runId from
  upstream chunks during iteration and reuse them; fall back to synthesized
  IDs only when no upstream chunk carried them. Use the EventType enum and
  typed RunFinishedEvent/RunErrorEvent so missing required fields are caught
  by the compiler instead of papered over.
- Stop swallowing JSON.parse failures in parseSSEChunks and fetchHttpStream.
  A malformed mid-stream chunk is a protocol error; let it throw so it
  surfaces as RUN_ERROR via the connect-wrapper's catch path instead of
  silently dropping data behind a console.warn the user never sees.
- Widen stream() and rpcStream() factory signatures with an optional
  abortSignal third arg and pass it through. Backwards-compatible — callers
  that ignore the third parameter are unaffected. Lets long-running server
  functions cancel in-flight work when useChat aborts.

Tests updated to assert SyntaxError propagation rather than silent dropping
on malformed JSON, and to expect the new third call argument on factory
mocks.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
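A condensed sketch of the id-tracking rule in the first bullet — EventType and RunFinishedEvent are the names used above, imports are omitted, and the cast is a sketch shortcut (the shipped normalizeConnectionAdapter reads these from the typed events instead):

  async function* trackAndFinish(
    source: AsyncIterable<StreamChunk>,
  ): AsyncIterable<StreamChunk> {
    let threadId: string | undefined
    let runId: string | undefined
    let finished = false
    for await (const chunk of source) {
      // Remember ids carried by upstream chunks so a synthesized event can
      // reuse them instead of fabricating fresh ones.
      const c = chunk as { type?: unknown; threadId?: string; runId?: string }
      if (c.threadId) threadId = c.threadId
      if (c.runId) runId = c.runId
      if (c.type === EventType.RUN_FINISHED) finished = true
      yield chunk
    }
    if (!finished) {
      // Fall back to generated ids only when no upstream chunk carried them.
      const event: RunFinishedEvent = {
        type: EventType.RUN_FINISHED,
        threadId: threadId ?? crypto.randomUUID(),
        runId: runId ?? crypto.randomUUID(),
      }
      yield event
    }
  }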
…railing buffer

Refine the SSE parser so the throw-on-parse-failure behavior doesn't regress
on legitimate SSE traffic:

- parseSSEChunks now skips SSE comment lines (`:`) and non-data fields
  (`event:`, `id:`, `retry:`) which proxies and CDNs commonly inject as
  keepalives. Previously these would have flowed into JSON.parse and
  thrown, killing otherwise-healthy streams behind any infrastructure
  that injects SSE control lines.
- readStreamLines no longer yields the unterminated trailing buffer at
  stream end. A non-empty buffer means the connection was cut mid-line,
  so the content is partial by definition — yielding it would feed
  truncated JSON to the parser and surface a misleading RUN_ERROR for
  what is really a transport-layer issue. Warn and discard instead.

Bare-line JSON (legacy/raw mode) is still accepted to preserve the
existing public behavior covered by the `should handle SSE format
without data: prefix` test.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
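Sketched out, the two rules above look roughly like this — simplified stand-ins for parseSSEChunks and readStreamLines, not the shipped implementations:

  // Rule 1: skip SSE control lines so keepalives never reach JSON.parse;
  // `data:` payloads and bare JSON lines (legacy/raw mode) still parse, and a
  // malformed payload throws so it surfaces as RUN_ERROR upstream.
  async function* parseLines(lines: AsyncIterable<string>): AsyncIterable<unknown> {
    for await (const raw of lines) {
      const line = raw.trimEnd()
      if (line === '' || line.startsWith(':')) continue
      if (/^(event|id|retry):/.test(line)) continue
      const payload = line.startsWith('data:') ? line.slice(5).trim() : line
      yield JSON.parse(payload)
    }
  }

  // Rule 2: a non-empty buffer at stream end means the connection was cut
  // mid-line, so warn and discard it rather than yield truncated JSON.
  async function* readLines(body: ReadableStream<Uint8Array>): AsyncIterable<string> {
    const reader = body.getReader()
    const decoder = new TextDecoder()
    let buffer = ''
    while (true) {
      const { done, value } = await reader.read()
      if (done) break
      buffer += decoder.decode(value, { stream: true })
      let newline: number
      while ((newline = buffer.indexOf('\n')) !== -1) {
        yield buffer.slice(0, newline)
        buffer = buffer.slice(newline + 1)
      }
    }
    if (buffer.length > 0) console.warn('Discarding partial trailing SSE line')
  }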
…t UIMessage[] in chat()

Drops the `as any` / `as Array<UIMessage>` casts previously needed when wiring
useChat through a TanStack Start server function into chat(). The stream()
factory now declares Array<UIMessage> (with a runtime assert matching the
ChatClient invariant), and chat()'s messages option accepts UIMessage[]
directly alongside ConstrainedModelMessage[] — the runtime already normalised
both via convertMessagesToModelMessages.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Pulls the @tanstack/ai changes back out — the chat() messages-option
widening to accept UIMessage[] is a separate concern from the stream()
server-function feature this PR is about. Restores the example's `as any`
cast with a comment, drops the @tanstack/ai minor bump from the changeset,
and reverts chat/index.ts to its pre-PR state. Also bumps the example
adapter to gpt-5.2.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
tombeckenham added a commit to tombeckenham/ai-tom that referenced this pull request Apr 28, 2026
…ranch

The fetcher path uses the same SSE parsing and connect-wrapper plumbing
as the stream() path on TanStack#508, so the polish that landed during TanStack#508's
review applies directly here. Carry it over so this branch has the same
robustness.

- Skip SSE control lines (`:` comments, `event:` / `id:` / `retry:`) in
  responseToSSEChunks. Proxies and CDNs inject these as keepalives;
  letting them through would feed JSON.parse a non-payload line.
- Drop unterminated trailing buffer in readStreamLines. A non-empty
  buffer at stream end means the connection was cut mid-line, so the
  data is partial — yielding it would surface a misleading RUN_ERROR
  for what is really a transport-layer issue.
- Surface JSON.parse failures in responseToSSEChunks and fetchHttpStream.
  Stop swallowing them behind console.warn; let SyntaxError propagate so
  the connect-wrapper turns it into a visible RUN_ERROR.
- Drop unsafe `as unknown as StreamChunk` casts in
  normalizeConnectionAdapter's synthesized RUN_FINISHED / RUN_ERROR
  events. Use EventType + RunFinishedEvent / RunErrorEvent so missing
  required fields are caught by the compiler. Track upstream
  threadId/runId from chunks and reuse them in the synthesis instead of
  fabricating both ids unconditionally.
- Forward optional abortSignal third arg through stream() and rpcStream()
  factory signatures. Backwards-compatible for existing callers; lets
  long-running factories cancel when useChat aborts. Mirrors what
  fetcherToConnectionAdapter already does.

Tests:
- Update the two `should handle malformed JSON gracefully` tests to
  assert SyntaxError throws instead of silent drop.
- Update stream() / rpcStream() factory mock assertions to expect the new
  third arg.
- Add chat-fetcher test asserting a fetcher returning a malformed-SSE
  Response surfaces as a RUN_ERROR via onError.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
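For completeness, the abortSignal pass-through called out in both commit messages is used from the caller's side roughly like this — a sketch: the third factory parameter is the new optional AbortSignal, the second parameter is just a placeholder here, and the /api/chat endpoint is made up for illustration:

useChat({
  connection: stream((messages, _data, abortSignal) =>
    // Returning Promise<Response> is one of the shapes the widened stream()
    // accepts; the signal lets the request cancel when useChat aborts.
    fetch('/api/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ messages }),
      signal: abortSignal,
    }),
  ),
})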
