25 changes: 25 additions & 0 deletions .changeset/stream-adapter-server-functions.md
@@ -0,0 +1,25 @@
---
'@tanstack/ai-client': minor
---

feat(ai-client): support TanStack Start server functions in `stream()` connection adapter

The `stream()` factory now accepts any of three return shapes, so a TanStack Start server function can be wired directly into `useChat`:

- `AsyncIterable<StreamChunk>` — direct in-process stream (existing behavior)
- `Promise<AsyncIterable<StreamChunk>>` — server function returning the chat stream
- `Promise<Response>` — server function returning `toServerSentEventsResponse(stream)`

`rpcStream()` likewise accepts a `Promise<AsyncIterable<StreamChunk>>`.

```ts
const chatFn = createServerFn({ method: 'POST' })
.inputValidator((data: { messages: Array<UIMessage> }) => data)
.handler(({ data }) =>
toServerSentEventsResponse(chat({ adapter, messages: data.messages })),
)

useChat({ connection: stream((messages) => chatFn({ data: { messages } })) })
```

The `stream()` callback's `messages` parameter is now typed as `Array<UIMessage>` (was `Array<UIMessage> | Array<ModelMessage>`) — matching what `useChat`/`ChatClient` actually sends. A runtime assertion guards against misuse. Existing callbacks typed against the union remain assignable (wider declared input satisfies narrower expected input).
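The unwrapping described above can be sketched as follows. This is illustrative only: `normalize` and `chunksFromSseResponse` are hypothetical names, not the actual `@tanstack/ai-client` internals, and the `data:`-framed JSON with a `[DONE]` sentinel is an assumption about the wire format, not the library's documented protocol:

```ts
// Sketch: normalize the three accepted return shapes into one AsyncIterable.
type StreamChunk = { type: string; content?: string }

// Decode an SSE body into chunks. Simplified: assumes one `data:` line per
// event, terminated by a blank line.
async function* chunksFromSseResponse(res: Response): AsyncGenerator<StreamChunk> {
  const reader = res.body!.getReader()
  const decoder = new TextDecoder()
  let buffer = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    buffer += decoder.decode(value, { stream: true })
    let sep: number
    // Each SSE event ends with a blank line ("\n\n").
    while ((sep = buffer.indexOf('\n\n')) !== -1) {
      const event = buffer.slice(0, sep)
      buffer = buffer.slice(sep + 2)
      const data = event.replace(/^data:\s*/, '')
      if (data && data !== '[DONE]') yield JSON.parse(data) as StreamChunk
    }
  }
}

async function* normalize(
  result:
    | AsyncIterable<StreamChunk>
    | Promise<AsyncIterable<StreamChunk>>
    | Promise<Response>,
): AsyncGenerator<StreamChunk> {
  // Awaiting a non-promise value just returns it, so this covers all
  // three shapes in one step.
  const resolved = await result
  if (resolved instanceof Response) {
    yield* chunksFromSseResponse(resolved)
  } else {
    yield* resolved
  }
}
```

Awaiting the factory result is what makes all three shapes interchangeable: `await` on a plain `AsyncIterable` is a no-op, and a resolved `Response` is detected with `instanceof` before its body is decoded.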
56 changes: 56 additions & 0 deletions docs/chat/connection-adapters.md
@@ -81,6 +81,62 @@ const { messages } = useChat({
});
```

### TanStack Start Server Functions

`stream()` adapts a TanStack Start server function into a `useChat` connection so you get end-to-end type safety from the call site to the handler. The factory you pass to `stream()` may return either the chat `AsyncIterable` directly, or an SSE `Response` produced by `toServerSentEventsResponse()` — `stream()` awaits the result and unwraps a `Response` if it sees one.

#### Returning an SSE Response (recommended)

Wrap the chat stream in `toServerSentEventsResponse()` so only encoded bytes flow over the wire. The client parses the SSE automatically:

```typescript
// server-fns.ts
import { createServerFn } from "@tanstack/react-start";
import { chat, toServerSentEventsResponse } from "@tanstack/ai";
import { openaiText } from "@tanstack/ai-openai";
import type { UIMessage } from "@tanstack/ai";

export const chatFn = createServerFn({ method: "POST" })
.inputValidator((data: { messages: Array<UIMessage> }) => data)
.handler(({ data }) =>
toServerSentEventsResponse(
chat({
adapter: openaiText("gpt-4o"),
messages: data.messages,
}),
),
);
```

```tsx
// client
import { useChat, stream } from "@tanstack/ai-react";
import { chatFn } from "./server-fns";

const { messages, sendMessage } = useChat({
connection: stream((messages) => chatFn({ data: { messages } })),
});
```

#### Returning the AsyncIterable directly

If you don't want to encode an HTTP response, return the chat stream itself. `stream()` awaits the server function and yields chunks straight through:

```typescript
// server-fns.ts
export const chatFn = createServerFn({ method: "POST" })
.inputValidator((data: { messages: Array<UIMessage> }) => data)
.handler(({ data }) =>
chat({ adapter: openaiText("gpt-4o"), messages: data.messages }),
);
```

```tsx
const { messages, sendMessage } = useChat({
connection: stream((messages) => chatFn({ data: { messages } })),
});
```

## Custom Adapters

For specialized use cases, you can create custom adapters to meet specific protocols or requirements:
14 changes: 14 additions & 0 deletions examples/ts-react-chat/src/components/Header.tsx
@@ -11,6 +11,7 @@ import {
Menu,
Mic,
Music,
Server,
Video,
X,
} from 'lucide-react'
@@ -197,6 +198,19 @@ export default function Header() {
<Mic size={20} />
<span className="font-medium">Voice Chat (Realtime)</span>
</Link>

<Link
to="/server-fn-chat"
onClick={() => setIsOpen(false)}
className="flex items-center gap-3 p-3 rounded-lg hover:bg-gray-800 transition-colors mb-2"
activeProps={{
className:
'flex items-center gap-3 p-3 rounded-lg bg-cyan-600 hover:bg-cyan-700 transition-colors mb-2',
}}
>
<Server size={20} />
<span className="font-medium">Server Function Chat</span>
</Link>
</nav>
</aside>
</>
33 changes: 32 additions & 1 deletion examples/ts-react-chat/src/lib/server-fns.ts
@@ -1,6 +1,7 @@
import { createServerFn } from '@tanstack/react-start'
import { z } from 'zod'
import {
chat,
generateAudio,
generateImage,
generateSpeech,
@@ -10,14 +11,20 @@ import {
summarize,
toServerSentEventsResponse,
} from '@tanstack/ai'
import { openaiImage, openaiSummarize, openaiVideo } from '@tanstack/ai-openai'
import {
openaiImage,
openaiSummarize,
openaiText,
openaiVideo,
} from '@tanstack/ai-openai'
import {
InvalidModelOverrideError,
UnknownProviderError,
buildAudioAdapter,
buildSpeechAdapter,
buildTranscriptionAdapter,
} from './server-audio-adapters'
import type { UIMessage } from '@tanstack/ai'

/**
* Server-fn error with a stable `code` property clients can switch on.
@@ -365,3 +372,27 @@ export const generateVideoStreamFn = createServerFn({ method: 'POST' })
}),
)
})

// =============================================================================
// Chat server function (streams via SSE Response)
// Used with: stream((messages) => chatFn({ data: { messages } }))
// =============================================================================

export const chatFn = createServerFn({ method: 'POST' })
.inputValidator(
(data: { messages: Array<UIMessage>; data?: Record<string, any> }) => data,
)
.handler(({ data }) =>
toServerSentEventsResponse(
chat({
adapter: openaiText('gpt-5.2'),
// chat()'s messages option is typed as ConstrainedModelMessage[], but the
// runtime accepts UIMessage[] too (normalised via convertMessagesToModelMessages).
// Cast to bridge the gap until the public type is widened in a separate PR.
messages: data.messages as any,
systemPrompts: [
'You are a helpful assistant. Keep replies short and friendly.',
],
}),
),
)
21 changes: 21 additions & 0 deletions examples/ts-react-chat/src/routeTree.gen.ts
@@ -9,6 +9,7 @@
// Additionally, you should also exclude this file from your linter and/or formatter to prevent it from being checked or modified.

import { Route as rootRouteImport } from './routes/__root'
import { Route as ServerFnChatRouteImport } from './routes/server-fn-chat'
import { Route as RealtimeRouteImport } from './routes/realtime'
import { Route as ImageGenRouteImport } from './routes/image-gen'
import { Route as IndexRouteImport } from './routes/index'
@@ -31,6 +32,11 @@ import { Route as ApiGenerateSpeechRouteImport } from './routes/api.generate.spe
import { Route as ApiGenerateImageRouteImport } from './routes/api.generate.image'
import { Route as ApiGenerateAudioRouteImport } from './routes/api.generate.audio'

const ServerFnChatRoute = ServerFnChatRouteImport.update({
id: '/server-fn-chat',
path: '/server-fn-chat',
getParentRoute: () => rootRouteImport,
} as any)
const RealtimeRoute = RealtimeRouteImport.update({
id: '/realtime',
path: '/realtime',
@@ -143,6 +149,7 @@ export interface FileRoutesByFullPath {
'/': typeof IndexRoute
'/image-gen': typeof ImageGenRoute
'/realtime': typeof RealtimeRoute
'/server-fn-chat': typeof ServerFnChatRoute
'/api/image-gen': typeof ApiImageGenRoute
'/api/structured-output': typeof ApiStructuredOutputRoute
'/api/summarize': typeof ApiSummarizeRoute
@@ -166,6 +173,7 @@ export interface FileRoutesByTo {
'/': typeof IndexRoute
'/image-gen': typeof ImageGenRoute
'/realtime': typeof RealtimeRoute
'/server-fn-chat': typeof ServerFnChatRoute
'/api/image-gen': typeof ApiImageGenRoute
'/api/structured-output': typeof ApiStructuredOutputRoute
'/api/summarize': typeof ApiSummarizeRoute
@@ -190,6 +198,7 @@ export interface FileRoutesById {
'/': typeof IndexRoute
'/image-gen': typeof ImageGenRoute
'/realtime': typeof RealtimeRoute
'/server-fn-chat': typeof ServerFnChatRoute
'/api/image-gen': typeof ApiImageGenRoute
'/api/structured-output': typeof ApiStructuredOutputRoute
'/api/summarize': typeof ApiSummarizeRoute
@@ -215,6 +224,7 @@ export interface FileRouteTypes {
| '/'
| '/image-gen'
| '/realtime'
| '/server-fn-chat'
| '/api/image-gen'
| '/api/structured-output'
| '/api/summarize'
@@ -238,6 +248,7 @@ export interface FileRouteTypes {
| '/'
| '/image-gen'
| '/realtime'
| '/server-fn-chat'
| '/api/image-gen'
| '/api/structured-output'
| '/api/summarize'
@@ -261,6 +272,7 @@ export interface FileRouteTypes {
| '/'
| '/image-gen'
| '/realtime'
| '/server-fn-chat'
| '/api/image-gen'
| '/api/structured-output'
| '/api/summarize'
@@ -285,6 +297,7 @@ export interface RootRouteChildren {
IndexRoute: typeof IndexRoute
ImageGenRoute: typeof ImageGenRoute
RealtimeRoute: typeof RealtimeRoute
ServerFnChatRoute: typeof ServerFnChatRoute
ApiImageGenRoute: typeof ApiImageGenRoute
ApiStructuredOutputRoute: typeof ApiStructuredOutputRoute
ApiSummarizeRoute: typeof ApiSummarizeRoute
@@ -307,6 +320,13 @@ export interface RootRouteChildren {

declare module '@tanstack/react-router' {
interface FileRoutesByPath {
'/server-fn-chat': {
id: '/server-fn-chat'
path: '/server-fn-chat'
fullPath: '/server-fn-chat'
preLoaderRoute: typeof ServerFnChatRouteImport
parentRoute: typeof rootRouteImport
}
'/realtime': {
id: '/realtime'
path: '/realtime'
@@ -461,6 +481,7 @@ const rootRouteChildren: RootRouteChildren = {
IndexRoute: IndexRoute,
ImageGenRoute: ImageGenRoute,
RealtimeRoute: RealtimeRoute,
ServerFnChatRoute: ServerFnChatRoute,
ApiImageGenRoute: ApiImageGenRoute,
ApiStructuredOutputRoute: ApiStructuredOutputRoute,
ApiSummarizeRoute: ApiSummarizeRoute,
110 changes: 110 additions & 0 deletions examples/ts-react-chat/src/routes/server-fn-chat.tsx
@@ -0,0 +1,110 @@
import { useState } from 'react'
import { createFileRoute } from '@tanstack/react-router'
import { stream, useChat } from '@tanstack/ai-react'
import { Send, Square } from 'lucide-react'
import { chatFn } from '@/lib/server-fns'

export const Route = createFileRoute('/server-fn-chat')({
component: ServerFnChat,
})

/**
* Demonstrates wiring `useChat` to a TanStack Start server function.
*
* The server function (`chatFn` in `lib/server-fns.ts`) returns
 * `toServerSentEventsResponse(chat({ ... }))` — an SSE Response. The
* `stream()` connection adapter awaits the server function, detects the
* Response, and parses SSE chunks into the chat client.
*
* Compare to `routes/index.tsx`, which uses `fetchServerSentEvents('/api/...')`
* against an HTTP route handler. Same wire format; different invocation style.
*/
function ServerFnChat() {
const { messages, sendMessage, isLoading, error, stop } = useChat({
connection: stream((messages) => chatFn({ data: { messages } })),
})
const [input, setInput] = useState('')

const handleSubmit = (e: React.FormEvent) => {
e.preventDefault()
if (!input.trim() || isLoading) return
void sendMessage(input)
setInput('')
}

return (
<div className="flex flex-col h-[calc(100vh-72px)] bg-gray-950 text-gray-100">
<div className="border-b border-gray-800 bg-gray-900/60 px-4 py-3">
<h2 className="text-lg font-semibold">Chat via server function</h2>
<p className="text-xs text-gray-400 mt-1">
<code className="text-cyan-400">
stream(() =&gt; chatFn(&#123; data &#125;))
</code>{' '}
— the server function returns an SSE{' '}
<code className="text-cyan-400">Response</code>; the adapter parses
it.
</p>
</div>

<div className="flex-1 overflow-y-auto p-4 space-y-3">
{messages.length === 0 && (
<p className="text-gray-500 text-sm">
Say something to start the chat.
</p>
)}
{messages.map((m) => (
<div
key={m.id}
className={`max-w-2xl rounded-lg px-3 py-2 ${
m.role === 'user'
? 'ml-auto bg-cyan-700/40 border border-cyan-600/40'
: 'mr-auto bg-gray-800 border border-gray-700'
}`}
>
{m.parts.map((part, i) =>
part.type === 'text' ? <span key={i}>{part.content}</span> : null,
)}
</div>
))}
{error && (
<div className="rounded-lg border border-red-700/60 bg-red-900/30 px-3 py-2 text-sm text-red-200">
{error.message}
</div>
)}
</div>

<form
onSubmit={handleSubmit}
className="border-t border-gray-800 bg-gray-900/80 p-3 flex gap-2"
>
<input
type="text"
value={input}
onChange={(e) => setInput(e.target.value)}
placeholder="Message..."
disabled={isLoading}
className="flex-1 rounded-lg bg-gray-800 border border-gray-700 px-3 py-2 text-sm focus:outline-none focus:border-cyan-500"
/>
{isLoading ? (
<button
type="button"
onClick={stop}
className="px-3 py-2 rounded-lg bg-red-600 hover:bg-red-700 text-white"
aria-label="Stop"
>
<Square size={18} />
</button>
) : (
<button
type="submit"
disabled={!input.trim()}
className="px-3 py-2 rounded-lg bg-cyan-600 hover:bg-cyan-700 disabled:opacity-50 text-white"
aria-label="Send"
>
<Send size={18} />
</button>
)}
</form>
</div>
)
}