Unable to use llama offline #1198

@baconsalad

Description

What happened?

I used to be able to get this working after each update by making a few changes, but now I cannot get it to work at all. It just keeps emitting: `Model CODE_QWEN_2_5_1_5B_Q8_0 does not support openai-endpoint-responses: Must be supported to use OpenAI responses params.` The Code Qwen 2.5 model is close enough to the qwen3.5 I use now; I don't understand why it wants anything OpenAI-related at all.

Relevant log output or stack trace

Steps to reproduce

No response

CodeGPT version

0.0

Operating System

Linux


Labels: bug (Something isn't working)
