@@ -30,16 +30,19 @@ Multi-runtime considerations:
 ## Span Hierarchy
 
 **Two span types:**
+
 - `gen_ai.invoke_agent` - Parent/pipeline spans (chains, agents, orchestration)
 - `gen_ai.chat`, `gen_ai.generate_text`, etc. - Child spans (actual LLM calls)
 
 **Hierarchy example:**
+
 ```
 gen_ai.invoke_agent (ai.generateText)
   └── gen_ai.generate_text (ai.generateText.doGenerate)
 ```
 
 **References:**
+
 - Vercel AI: `packages/core/src/tracing/vercel-ai/constants.ts:8-23`
 - LangChain: `packages/core/src/tracing/langchain/index.ts:199-207`
 
@@ -50,6 +53,7 @@ gen_ai.invoke_agent (ai.generateText)
 **Non-streaming:** Use `startSpan()`, set attributes immediately from response
 
 **Streaming:** Use `startSpanManual()` with this pattern:
+
 ```typescript
 interface StreamingState {
   responseTexts: string[]; // Accumulate fragments
@@ -74,11 +78,13 @@ async function* instrumentStream(stream, span, recordOutputs) {
 ```
 
 **Key rules:**
+
 - Accumulate with arrays/strings, don't overwrite
 - Set `GEN_AI_RESPONSE_STREAMING_ATTRIBUTE: true`
 - Call `span.end()` in finally block
 
 **References:**
+
 - OpenAI: `packages/core/src/tracing/openai/streaming.ts`
 - Anthropic: `packages/core/src/tracing/anthropic-ai/streaming.ts`
 - Detection: `packages/core/src/tracing/openai/index.ts:183-221`
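The streaming pattern described in this hunk can be sketched end-to-end. This is a simplified illustration, not the SDK's real code: `FakeSpan` is a stand-in for Sentry's `Span` type, and the attribute key strings are assumed values standing in for the constants in `gen-ai-attributes.ts`.

```typescript
// Simplified stand-in for Sentry's Span interface (illustration only).
interface FakeSpan {
  attributes: Record<string, unknown>;
  ended: boolean;
  setAttribute(key: string, value: unknown): void;
  end(): void;
}

// Assumed attribute names - the real constants live in gen-ai-attributes.ts.
const GEN_AI_RESPONSE_STREAMING_ATTRIBUTE = 'gen_ai.response.streaming';
const GEN_AI_RESPONSE_TEXT_ATTRIBUTE = 'gen_ai.response.text';

async function* instrumentStream(
  stream: AsyncIterable<string>,
  span: FakeSpan,
  recordOutputs: boolean,
): AsyncGenerator<string> {
  const responseTexts: string[] = []; // accumulate fragments, never overwrite
  span.setAttribute(GEN_AI_RESPONSE_STREAMING_ATTRIBUTE, true);
  try {
    for await (const chunk of stream) {
      if (recordOutputs) {
        responseTexts.push(chunk);
      }
      yield chunk; // pass each chunk through to the caller untouched
    }
    if (recordOutputs) {
      span.setAttribute(GEN_AI_RESPONSE_TEXT_ATTRIBUTE, responseTexts.join(''));
    }
  } finally {
    span.end(); // runs even if the consumer aborts the stream mid-way
  }
}
```

Because `span.end()` sits in the `finally` block, the span is closed even when the consumer breaks out of the loop early or the stream throws.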
@@ -88,11 +94,13 @@ async function* instrumentStream(stream, span, recordOutputs) {
 ## Token Accumulation
 
 **Child spans (LLM calls):** Set tokens directly from API response
+
 ```typescript
 setTokenUsageAttributes(span, inputTokens, outputTokens, totalTokens);
 ```
 
 **Parent spans (invoke_agent):** Accumulate from children using event processor
+
 ```typescript
 // First pass: accumulate from children
 for (const span of event.spans) {
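The parent-span accumulation can be illustrated with plain objects in place of Sentry's event and span envelopes. The shapes and attribute keys below are assumptions for illustration; the real constants live in `gen-ai-attributes.ts`.

```typescript
// Minimal stand-in for serialized span data (illustration only).
interface SpanJSON {
  op: string;
  span_id: string;
  parent_span_id?: string;
  data: Record<string, number | string | boolean | undefined>;
}

// Assumed usage attribute keys.
const USAGE_KEYS = [
  'gen_ai.usage.input_tokens',
  'gen_ai.usage.output_tokens',
  'gen_ai.usage.total_tokens',
] as const;

// Roll child token counts up onto their gen_ai.invoke_agent parent spans.
function accumulateTokens(spans: SpanJSON[]): void {
  const byId = new Map(spans.map(span => [span.span_id, span]));
  for (const span of spans) {
    const parent = span.parent_span_id ? byId.get(span.parent_span_id) : undefined;
    if (!parent || parent.op !== 'gen_ai.invoke_agent') {
      continue; // only accumulate onto invoke_agent parents
    }
    for (const key of USAGE_KEYS) {
      const childValue = span.data[key];
      if (typeof childValue === 'number') {
        const current = parent.data[key];
        parent.data[key] = (typeof current === 'number' ? current : 0) + childValue;
      }
    }
  }
}
```

This sketch only rolls up direct children; an event processor in the SDK may need to walk deeper span hierarchies.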
@@ -120,6 +128,7 @@ Location: `packages/core/src/tracing/ai/`
 ### `gen-ai-attributes.ts`
 
 OpenTelemetry Semantic Convention attribute names. **Always use these constants!**
+
 - `GEN_AI_SYSTEM_ATTRIBUTE` - 'openai', 'anthropic', etc.
 - `GEN_AI_REQUEST_MODEL_ATTRIBUTE` - Model from request
 - `GEN_AI_RESPONSE_MODEL_ATTRIBUTE` - Model from response
@@ -205,10 +214,12 @@ OpenTelemetry Semantic Convention attribute names. **Always use these constants!
 **RULE:** AI SDKs should be auto-enabled in Node.js runtime if possible.
 
 ✅ **Auto-enable if:**
+
 - SDK works in Node.js runtime
 - OTel only patches when package imported (zero cost if unused)
 
 ❌ **Don't auto-enable if:**
+
 - SDK is niche/experimental
 - Integration has significant limitations
 
@@ -257,6 +268,7 @@ export { {provider}Integration } from './integrations/tracing/{provider}';
 ```
 
 **4. Add E2E test** in `packages/node-integration-tests/suites/{provider}/`
+
 - Verify spans created automatically (no manual setup)
 - Test `recordInputs` and `recordOutputs` options
 - Test integration can be disabled
@@ -295,7 +307,7 @@ packages/
 
 1. **Respect `sendDefaultPii`** for recordInputs/recordOutputs
 2. **Use semantic attributes** from `gen-ai-attributes.ts` (never hardcode)
-3. **Set Sentry origin**: `SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN = 'auto.ai.{provider}'`
+3. **Set Sentry origin**: `SEMANTIC_ATTRIBUTE_SENTRY_ORIGIN = 'auto.ai.openai'` (use provider name: `openai`, `anthropic`, `vercelai`, etc. - only alphanumerics, `_`, and `.` allowed)
 4. **Truncate large data**: Use helper functions from `utils.ts`
 5. **Correct span operations**: `gen_ai.invoke_agent` for parent, `gen_ai.chat` for children
 6. **Streaming**: Use `startSpanManual()`, accumulate state, call `span.end()`
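The character restriction the new rule 3 introduces can be checked with a simple pattern. This helper is illustrative only, not part of the SDK:

```typescript
// Origins may contain only alphanumerics, underscores, and dots
// (\w matches [A-Za-z0-9_]).
const VALID_ORIGIN = /^[\w.]+$/;

function isValidOrigin(origin: string): boolean {
  return VALID_ORIGIN.test(origin);
}
```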