
Incompatible with DeepSeek V4 (Pro/Flash): missing reasoning_content in context #1216

@feifeiai


What happened?

ProxyAI is incompatible with the newly released DeepSeek V4 series models (deepseek-v4-pro / deepseek-v4-flash).
Note the requirement the latest DeepSeek V4 reasoning models place on multi-turn conversations: if the model makes any tool calls between two user messages, the reasoning_content the assistant generated in between must be kept in the context and passed back to the API on every subsequent user conversation round.
If a client fails to pass reasoning_content back, the API returns a 400 error.
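For illustration, here is a minimal sketch of the request flow the API appears to expect, written directly against an OpenAI-compatible endpoint with requests. The endpoint URL, model name, tool definition, and the exact way reasoning_content is echoed back are my assumptions based on the error message and public DeepSeek docs, not ProxyAI's implementation:

```python
# Minimal sketch (not ProxyAI's actual code) of a compliant request flow against an
# OpenAI-compatible DeepSeek endpoint. Endpoint URL, model name, and tool definition
# below are illustrative assumptions.
import json
import requests

API_URL = "https://api.deepseek.com/chat/completions"  # assumed endpoint
API_KEY = "sk-..."                                      # placeholder key

TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def chat(messages):
    """Send one chat-completions request and return the assistant message dict."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "deepseek-v4-pro", "messages": messages, "tools": TOOLS},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]

messages = [{"role": "user", "content": "What's the weather in Berlin?"}]
assistant = chat(messages)  # assistant turn may contain tool_calls and reasoning_content

# The point of this issue: when the assistant turn is appended back into the context,
# its reasoning_content must be carried along; dropping it triggers the 400 below.
messages.append({
    "role": "assistant",
    "content": assistant.get("content"),
    "reasoning_content": assistant.get("reasoning_content"),  # must not be dropped
    "tool_calls": assistant.get("tool_calls"),
})
for call in assistant.get("tool_calls") or []:
    messages.append({
        "role": "tool",
        "tool_call_id": call["id"],
        "content": json.dumps({"temperature_c": 14}),  # dummy tool result
    })

final = chat(messages)
print(final.get("content"))
```

ProxyAI presumably strips reasoning_content when it rebuilds the context after a tool call, which would explain the 400 shown in the log below.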

Please fix this bug as soon as possible.

Relevant log output or stack trace

Error from client: CustomOpenAILLMClient Message: Expected status code 200 but was 400 Status code: 400

Error: Subagent 'explore' failed with HTTP 400: Error from client: CustomOpenAILLMClient Status code: 400 Error body: {"error":{"message":"The `reasoning_content` in the thinking mode must be passed back to the API.","type":"invalid_request_error","param":null,"code":"invalid_request_error"}}

Steps to reproduce

No response

CodeGPT version

3.8.0-241.1

Operating System

None
