## Additional notes
- View free traces on the OpenAI Traces dashboard.

## Long-running workers

In long-running worker processes (Celery, FastAPI background tasks, RQ, Dramatiq), traces created
with `trace()` are buffered by `BatchTraceProcessor` but **may not be exported** before the next
task begins, because the worker process never exits to trigger shutdown flushing.

Use [`flush_traces()`][agents.flush_traces] after each task completes to ensure all buffered spans
are exported:

```python
from agents import Agent, Runner, trace, flush_traces
from celery import Celery

celery_app = Celery("worker")  # broker configuration omitted for brevity
# Celery
@celery_app.task
def process_document(doc_id: str) -> str:
    with trace("process_document"):
        agent = Agent(name="Processor", instructions="Process the document.")
        result = Runner.run_sync(agent, f"Process document {doc_id}")
    flush_traces()  # export before the worker picks up the next task
    return result.final_output
```

```python
# FastAPI background task
from fastapi import BackgroundTasks, FastAPI

app = FastAPI()

async def run_agent_task(task_id: str):
    with trace("background_task"):
        agent = Agent(name="Worker", instructions="Complete the task.")
        result = await Runner.run(agent, f"Task {task_id}")
    flush_traces()

@app.post("/tasks")
async def create_task(background_tasks: BackgroundTasks):
    background_tasks.add_task(run_agent_task, "task-123")
    return {"status": "queued"}
```

Without calling `flush_traces()`, traces may be dropped or merged across task boundaries in
worker environments. See [issue #2135](https://github.com/openai/openai-agents-python/issues/2135)
for background.
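
Rather than repeating the flush call in every task body, the pattern can be factored into a small decorator so the flush runs even when a task raises. A minimal sketch, not part of the SDK: `flushing_task` is a hypothetical helper, and the flush callable is injected as a parameter (you would pass `flush_traces`) so the wrapper stays framework-agnostic:

```python
import functools
from typing import Any, Callable


def flushing_task(flush_fn: Callable[[], None]):
    """Wrap a worker task so buffered traces are flushed after every run.

    flush_fn is the exporter's flush callable (e.g. flush_traces); it runs
    in a finally block, so spans are exported even if the task raises.
    """
    def decorator(fn: Callable[..., Any]) -> Callable[..., Any]:
        @functools.wraps(fn)
        def wrapper(*args: Any, **kwargs: Any) -> Any:
            try:
                return fn(*args, **kwargs)
            finally:
                flush_fn()  # always export, success or failure
        return wrapper
    return decorator
```

With Celery this would stack under the task decorator, e.g. `@celery_app.task` above `@flushing_task(flush_traces)`, keeping the flush out of the task body entirely.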

## Ecosystem integrations
