
feat: unique fake responses ids #3079

Closed
bantucaravan wants to merge 4 commits into openai:main from bantucaravan:unique-fake-ids

Conversation

@bantucaravan

@bantucaravan bantucaravan commented May 1, 2026

Summary

I replaced the static fake ID used to simulate a Responses API Response ID or ResponseOutputItem ID with a unique ID. The unique ID includes a prefix that identifies it as fake. This fake-ID pattern is used for all conversions of Chat Completions API output into Responses API format.

Previously, when using the Chat Completions API, every output message had the same ID value, so output items could not be distinguished by ID. Now all output items and responses have unique IDs that remain identifiable as fake (i.e., not created by the real OpenAI Responses API).

The only major complication came up with streaming responses. Here I added an output_message_id to the streaming state, which allows all ResponseOutputMessage events in a streaming response to share the same fake ID.
My understanding of the vision for the Chat Completions conversion is that every Chat Completions streaming response is converted to produce a single Responses API ResponseOutputMessage item. Special attention from the reviewer on this point is appreciated.

Example:
Old -> '__fake_id__'
New -> '__fake_id__63456181-ca62-4520-9767-7d081aafac46'
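The old/new IDs above suggest a minimal sketch of the generation logic. The constant name `FAKE_ID_PREFIX` and the helper `new_fake_id` are hypothetical names for illustration; only the prefix value `__fake_id__` comes from the PR's example:

```python
import uuid

# Hypothetical constant name; the prefix value matches the example above.
FAKE_ID_PREFIX = "__fake_id__"


def new_fake_id() -> str:
    """Mint a unique ID that is still recognizable as fake by its prefix."""
    return f"{FAKE_ID_PREFIX}{uuid.uuid4()}"


old_id = FAKE_ID_PREFIX   # old behavior: every item shared this exact value
new_id = new_fake_id()    # new behavior: unique per item, same prefix
```

Downstream code can keep distinguishing fake IDs from real Responses API IDs with `id.startswith(FAKE_ID_PREFIX)`.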

Test Plan

Updated the tests to check the new fake ids.

For streaming, I added checks ensuring that all events associated with the same ResponseOutputMessage, function call, or reasoning item have the same ID. The same is done for litellm. The any_llm tests mock the whole ChatCmplStreamHandler.handle_stream(), so no changes were needed there.
I added a new test to make sure that multiple function calls in a single streaming response get different but consistent ids.
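The property that new test checks can be illustrated with a small sketch. The names here (`call_ids`, `id_for_call`) are hypothetical, not the SDK's actual streaming-state fields: a fake ID is minted lazily per tool-call index and reused for later chunks, giving per-call consistency and cross-call uniqueness:

```python
import uuid

FAKE_ID_PREFIX = "__fake_id__"  # assumed sentinel prefix


def new_fake_id() -> str:
    return f"{FAKE_ID_PREFIX}{uuid.uuid4()}"


# Hypothetical streaming state: one fake ID per tool-call index, minted on
# first use so that every chunk of the same call reuses it.
call_ids: dict[int, str] = {}


def id_for_call(index: int) -> str:
    return call_ids.setdefault(index, new_fake_id())
```

With this shape, `id_for_call(0)` always returns the same ID across chunks, while `id_for_call(1)` yields a different one for the second function call in the same streaming response.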

For future work, it might make sense to split the streaming tests into thorough tests for ChatCmplStreamHandler.handle_stream(), which does most of the work, and lighter tests for .stream_response() in chat_completions, litellm, and any_llm respectively.

bantucaravan and others added 4 commits May 1, 2026 01:04
replace with uid with fake id prefix

Co-authored-by: Copilot <copilot@github.com>
Co-authored-by: Copilot <copilot@github.com>
@github-actions github-actions Bot added enhancement New feature or request feature:core labels May 1, 2026
@seratch
Member

seratch commented May 1, 2026

Thanks for the suggestion. We currently don't plan to make these changes, to avoid potential breaking changes to the SDK's behavior. That said, if we see a need for this in the future, we may revisit it as an opt-in feature.

@seratch seratch closed this May 1, 2026

Labels

enhancement New feature or request feature:core


2 participants