
fix: reject failed responses stream terminals#3107

Merged
seratch merged 3 commits into openai:main from Aphroq:fix/responses-stream-terminal-failures
May 4, 2026
Conversation

@Aphroq
Contributor

@Aphroq Aphroq commented May 4, 2026

Summary

Rejects Responses streaming terminal events of type response.failed or response.incomplete instead of converting their payloads into successful final model responses.

The streamed runner still exposes the raw terminal event before failing, but it no longer turns failed or incomplete payloads into ModelResponse or final_output. The websocket get_response() path now raises the same ModelBehaviorError for these terminal event types.
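A minimal sketch of the behavior this change describes, assuming simplified stand-ins: `ModelBehaviorError`, `StreamEvent`, and `handle_stream` here are illustrative, not the SDK's actual classes or signatures. The point is the ordering the PR guarantees: the raw terminal event is still yielded to consumers, and only then does the stream fail instead of fabricating a successful final response.

```python
# Hypothetical sketch of the terminal-event handling described above; the real
# openai-agents code differs. ModelBehaviorError stands in for the SDK's
# exception of the same name, and StreamEvent is a simplified event shape.
from dataclasses import dataclass, field


class ModelBehaviorError(Exception):
    """Raised when the model stream ends in a failed/incomplete state."""


@dataclass
class StreamEvent:
    type: str                      # e.g. "response.completed", "response.failed"
    payload: dict = field(default_factory=dict)


FAILURE_TERMINALS = {"response.failed", "response.incomplete"}


def handle_stream(events):
    """Yield raw events; raise instead of building a final response on failure."""
    for event in events:
        yield event  # consumers still see the raw terminal event first
        if event.type in FAILURE_TERMINALS:
            # Previously a payload like this was converted into a successful
            # ModelResponse / final_output; now the stream fails loudly.
            raise ModelBehaviorError(
                f"Stream ended with terminal event {event.type!r}"
            )
```

With this shape, a `response.completed` stream drains normally, while a `response.failed` stream delivers the terminal event and then raises.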

Test plan

  • uv run pytest tests/models/test_openai_responses.py tests/test_agent_runner_streamed.py tests/test_responses_tracing.py -k 'rejects_failed_terminal_response_payload_events or failed_or_incomplete_terminal_event_creates_trace'
  • bash .agents/skills/code-change-verification/scripts/run.sh

Issue number

Closes #3106

Checks

  • I've added new tests (if relevant)
  • I've added/updated the relevant documentation
  • I've run make lint and make format
  • I've made sure tests pass

@github-actions github-actions Bot added bug Something isn't working feature:core labels May 4, 2026
@seratch
Member

seratch commented May 4, 2026

@codex review


@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 09b871030f

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

Comment thread src/agents/run_internal/run_loop.py Outdated
@Aphroq Aphroq force-pushed the fix/responses-stream-terminal-failures branch from 09b8710 to 70ce829 Compare May 4, 2026 12:43
@Aphroq Aphroq force-pushed the fix/responses-stream-terminal-failures branch from 70ce829 to 38e62e3 Compare May 4, 2026 12:52
@Aphroq
Contributor Author

Aphroq commented May 4, 2026

I re-audited this against #3106 and the Codex review feedback.

The latest update covers a few related points:

  • RunResultStreaming.stream_events() now drains already-queued stream events before raising, but only for Responses terminal failure/error exceptions, so consumers still receive the terminal RawResponsesStreamEvent before the ModelBehaviorError surfaces.
  • Other non-MaxTurnsExceeded exceptions keep the existing behavior and do not drain stale queued events.
  • In addition to response.failed / response.incomplete, the typed HTTP error event path is covered.
  • I also found the same class of issue in AnyLLMModel Responses direct streaming: it could still treat response.failed / response.incomplete as a final_response. That path now uses the same failure semantics and has test coverage.
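The drain-then-raise behavior in the first bullet can be sketched as follows. This is an illustrative consumer-facing shape, not the SDK's actual implementation: the queue items stand in for RawResponsesStreamEvent objects, and `pending_error` stands in for the stored terminal exception.

```python
# Hypothetical sketch of drain-then-raise semantics for a streamed run result;
# names and shapes are simplified stand-ins for the openai-agents internals.
import queue


class ModelBehaviorError(Exception):
    """Stand-in for the SDK exception raised on failed/incomplete terminals."""


def stream_events(event_queue, pending_error=None):
    """Flush already-queued events to the consumer, then raise the stored error.

    Events queued before the failure (including the terminal raw event itself)
    are delivered first; only once the queue is empty does the terminal
    exception propagate.
    """
    while True:
        try:
            item = event_queue.get_nowait()
        except queue.Empty:
            break
        yield item  # the terminal raw event reaches the consumer first
    if pending_error is not None:
        raise pending_error
```

A consumer iterating this generator sees every queued event, then handles the error; with no pending error the iteration simply ends.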

@seratch
Member

seratch commented May 4, 2026

@codex review

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. More of your lovely PRs please.


@seratch seratch added this to the 0.15.x milestone May 4, 2026
@seratch seratch merged commit 9b57f05 into openai:main May 4, 2026
10 checks passed

Labels

bug Something isn't working feature:extensions

Development

Successfully merging this pull request may close these issues.

Responses streaming treats failed and incomplete terminal events as success

2 participants