how to show thinking using local models? #3041
I'm using a local qwen3.5 model, and opencode shows reasoning/thinking output for the same queries, while the CodeCompanion chat window does not. I also tried setting: no effect
Replies: 2 comments 3 replies
I don't think the
Found it! See https://codecompanion.olimorris.dev/configuration/adapters-http#llama-cpp-with-reasoning-format-deepseek — it applies to Qwen models too, not just DeepSeek.
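For anyone landing here later: the linked page comes down to starting the llama.cpp server with its `--reasoning-format` flag, so that `<think>…</think>` blocks are split into a separate reasoning field the chat window can render. A minimal sketch — the model path and port here are placeholders, not from the thread:

```shell
# Sketch, assuming a llama.cpp build whose llama-server supports
# --reasoning-format. Setting it to "deepseek" makes the server parse
# the model's <think>…</think> output into a separate reasoning field;
# this works for Qwen thinking models as well, not just DeepSeek.
llama-server \
  -m /path/to/qwen-model.gguf \   # placeholder model path
  --port 8080 \                   # placeholder port
  --reasoning-format deepseek
```

Point the CodeCompanion llama.cpp adapter at that server as usual; no plugin-side change should be needed beyond what the linked docs show.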