diff --git a/.changeset/ai-metadata-public-api.md b/.changeset/ai-metadata-public-api.md new file mode 100644 index 00000000..f0bf74ec --- /dev/null +++ b/.changeset/ai-metadata-public-api.md @@ -0,0 +1,5 @@ +--- +'evlog': minor +--- + +Expose AI SDK execution metadata as a public API on `AILogger`. Three new methods let app code read the same data that gets attached to wide events: `getMetadata()` returns an immutable snapshot of the run (model, provider, tokens, calls, steps, tool calls, cost, finish reason, embeddings), `getEstimatedCost()` returns the dollar cost computed from the configured pricing map, and `onUpdate(cb)` subscribes to incremental snapshots emitted on every step, embedding, error, and integration finish (returns an unsubscribe function). New types `AIMetadata` (alias for `AIEventData`) and `AIMetadataListener` are exported. `model` and `provider` on `AIMetadata` are now optional to reflect early-snapshot reality (e.g. embedding-only runs). diff --git a/apps/docs/content/2.logging/5.ai-sdk.md b/apps/docs/content/2.logging/5.ai-sdk.md index f4b66687..f98a9b86 100644 --- a/apps/docs/content/2.logging/5.ai-sdk.md +++ b/apps/docs/content/2.logging/5.ai-sdk.md @@ -115,12 +115,15 @@ Your wide event now includes: ## How It Works -`createAILogger(log, options?)` returns an `AILogger` with two methods: +`createAILogger(log, options?)` returns an `AILogger` with the following methods: | Method | Description | |--------|-------------| | `wrap(model)` | Wraps a language model with middleware. Accepts a model string (e.g. `'anthropic/claude-sonnet-4.6'`) or a `LanguageModelV3` object. Works with `generateText`, `streamText`, and `ToolLoopAgent`. Also works with pre-wrapped models (e.g. from supermemory). | | `captureEmbed(result)` | Manually captures token usage, model info, and dimensions from `embed()` or `embedMany()` results (embedding models use a different type). 
| +| `getMetadata()` | Returns a snapshot of the current execution metadata (`AIMetadata`) — same shape as the `ai` field on the wide event. Safe to call inside `onFinish`, after `await generateText()`, or while a stream is in progress. | +| `getEstimatedCost()` | Returns the current estimated cost in dollars, or `undefined` if no `cost` map was provided or the model is not in it. Convenience for `getMetadata().estimatedCost`. | +| `onUpdate(callback)` | Subscribes to metadata updates. Fires on every step, every `captureEmbed` call, on errors, and on `createEvlogIntegration`'s `onFinish`. Returns an unsubscribe function. | The middleware intercepts calls at the provider level. It does not touch your callbacks, prompts, or responses. Captured data flows through the normal evlog pipeline (sampling, enrichers, drains) and ends up in Axiom, Better Stack, or wherever you drain to. @@ -360,6 +363,128 @@ import { anthropic } from '@ai-sdk/anthropic' const model = ai.wrap(anthropic('claude-sonnet-4.6')) ``` +## Accessing Metadata in Your Code + +The wide event already contains the full metadata object, but you often want the same data inside your handler — to persist it, surface it to end-users, bill against it, or stream incremental progress to the client. + +`AILogger` exposes three methods for this, without reaching into internal state: + +### `getMetadata()` — current snapshot + +Returns a structured `AIMetadata` object that mirrors the `ai` field on the wide event. 
Safe to call at any point, including after the run completes or inside the AI SDK's `onFinish`: + +```typescript [server/api/chat.post.ts] +import { useLogger } from 'evlog' +import { createAILogger } from 'evlog/ai' +import { generateText } from 'ai' + +export default defineEventHandler(async (event) => { + const log = useLogger(event) + const ai = createAILogger(log, { + cost: { 'claude-sonnet-4.6': { input: 3, output: 15 } }, + }) + + await generateText({ + model: ai.wrap('anthropic/claude-sonnet-4.6'), + prompt: 'Summarize this document', + }) + + const metadata = ai.getMetadata() + + await db.aiRuns.insert({ + userId: event.context.userId, + model: metadata.model, + inputTokens: metadata.inputTokens, + outputTokens: metadata.outputTokens, + estimatedCost: metadata.estimatedCost, + finishReason: metadata.finishReason, + responseId: metadata.responseId, + }) + + return { ok: true } +}) +``` + +The snapshot is a fresh copy: mutating it never affects the underlying state or subsequent calls. + +### `getEstimatedCost()` — quick cost check + +Convenience for `getMetadata().estimatedCost`. Returns the cost in dollars, or `undefined` if no `cost` map was provided or the model is not in the map. + +```typescript +const ai = createAILogger(log, { + cost: { 'claude-sonnet-4.6': { input: 3, output: 15 } }, +}) + +await generateText({ model: ai.wrap('anthropic/claude-sonnet-4.6'), prompt }) + +const cost = ai.getEstimatedCost() +console.log(`This call cost $${cost?.toFixed(4)}`) +``` + +### `onUpdate(callback)` — incremental updates + +Subscribe to metadata updates. The callback fires every time the underlying state flushes: + +- Once per step in multi-step agent runs +- Once per `captureEmbed` call +- On model errors +- On `createEvlogIntegration`'s `onFinish` + +Each invocation receives a fresh snapshot. Returns an unsubscribe function. Subscriber errors are isolated and never break the AI flow. 
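Those three guarantees (a fresh snapshot per invocation, an unsubscribe function, isolated subscriber errors) can be sketched as a tiny listener registry. This is an illustrative model only, not evlog's implementation; every name in it is hypothetical:

```typescript
// A simplified model of the documented onUpdate semantics, not evlog's
// actual source. Names (createEmitter, Snapshot, emitStep) are made up;
// only the guarantees mirror the docs above.
type Snapshot = { steps: number, totalTokens: number }
type Listener = (snapshot: Snapshot) => void

function createEmitter() {
  const listeners = new Set<Listener>()
  const state: Snapshot = { steps: 0, totalTokens: 0 }

  return {
    // Subscribe; the returned function unsubscribes (mirrors onUpdate).
    onUpdate(cb: Listener): () => void {
      listeners.add(cb)
      return () => { listeners.delete(cb) }
    },
    // Simulates one agent step flushing metadata to subscribers.
    emitStep(tokens: number): void {
      state.steps += 1
      state.totalTokens += tokens
      for (const cb of listeners) {
        try {
          cb({ ...state }) // fresh copy: listener mutations cannot leak back
        }
        catch {
          // subscriber errors are swallowed so they never break the AI flow
        }
      }
    },
  }
}
```

A listener that throws never stops delivery to the other listeners, and calling the returned function halts further updates to that subscriber.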
+ +```typescript [server/api/agent.post.ts] +import { ToolLoopAgent, createAgentUIStreamResponse, stepCountIs } from 'ai' +import { useLogger } from 'evlog' +import { createAILogger } from 'evlog/ai' + +export default defineEventHandler(async (event) => { + const log = useLogger(event) + const { messages } = await readBody(event) + const ai = createAILogger(log) + + ai.onUpdate((metadata) => { + pushToClient(event, { + type: 'ai-progress', + step: metadata.steps, + tokens: metadata.totalTokens, + cost: metadata.estimatedCost, + }) + }) + + const agent = new ToolLoopAgent({ + model: ai.wrap('anthropic/claude-sonnet-4.6'), + tools: { searchWeb, queryDatabase }, + stopWhen: stepCountIs(5), + }) + + return createAgentUIStreamResponse({ agent, uiMessages: messages }) +}) +``` + +For one-off cleanup: + +```typescript +const off = ai.onUpdate((metadata) => { /* ... */ }) +// later +off() +``` + +### `AIMetadata` shape + +`AIMetadata` is a public type alias for the snapshot returned by `getMetadata()` and passed to `onUpdate` listeners. It has the same shape as the `ai` field on the wide event — see [Captured Data](#captured-data) for the full field reference. + +```typescript +import type { AIMetadata, AIMetadataListener } from 'evlog/ai' + +function handleProgress(metadata: AIMetadata) { + console.log(`${metadata.calls} calls, $${metadata.estimatedCost ?? 0}`) +} + +const listener: AIMetadataListener = handleProgress +ai.onUpdate(listener) +``` + ## Telemetry Integration For deeper observability — tool execution timing, success/failure tracking, and total generation wall time — use `createEvlogIntegration()`. It implements the AI SDK's `TelemetryIntegration` interface and captures data that middleware alone cannot see. 
diff --git a/apps/nuxthub-playground/app/components/AiChat.vue b/apps/nuxthub-playground/app/components/AiChat.vue index d9326b10..25fe848a 100644 --- a/apps/nuxthub-playground/app/components/AiChat.vue +++ b/apps/nuxthub-playground/app/components/AiChat.vue @@ -48,6 +48,25 @@ function getToolInput(part: any): { query?: string } { function getToolOutput(part: any): { count?: number, error?: string } | undefined { return part.output } + +interface AiMessageMetadata { + calls?: number + totalTokens?: number + estimatedCost?: number + finishReason?: string +} + +function getAiMetadata(message: any): AiMessageMetadata | undefined { + const meta = message?.metadata as AiMessageMetadata | undefined + if (!meta || (meta.calls === undefined && meta.totalTokens === undefined)) return undefined + return meta +} + +function formatCost(cost: number | undefined): string { + if (cost === undefined) return '—' + if (cost === 0) return '$0' + return `$${cost.toFixed(6)}` +}
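The two helpers added to `AiChat.vue` are pure functions, so their behavior can be checked outside the component. Below is a standalone TypeScript copy with illustrative inputs; the `message` shapes are made up for the example:

```typescript
interface AiMessageMetadata {
  calls?: number
  totalTokens?: number
  estimatedCost?: number
  finishReason?: string
}

// Returns the metadata only when it carries at least one of the
// fields the UI actually renders (calls or totalTokens).
function getAiMetadata(message: any): AiMessageMetadata | undefined {
  const meta = message?.metadata as AiMessageMetadata | undefined
  if (!meta || (meta.calls === undefined && meta.totalTokens === undefined)) return undefined
  return meta
}

// Renders '—' for unknown cost, '$0' for free calls, and six
// decimal places otherwise.
function formatCost(cost: number | undefined): string {
  if (cost === undefined) return '—'
  if (cost === 0) return '$0'
  return `$${cost.toFixed(6)}`
}

formatCost(undefined) // '—'
formatCost(0)         // '$0'
formatCost(0.0105)    // '$0.010500'

getAiMetadata({})                                            // undefined
getAiMetadata({ metadata: { finishReason: 'stop' } })        // undefined
getAiMetadata({ metadata: { calls: 2, totalTokens: 1234 } }) // { calls: 2, totalTokens: 1234 }
```

Note that a message whose metadata carries only `finishReason` is treated as having no AI metadata, which keeps the badge hidden for runs that produced no countable usage.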