Hello!
When using `CODEGRAPH_AGENT_ARCHITECTURE=lats` with `CODEGRAPH_LLM_PROVIDER=ollama`, the agentic tools fail with:

```
LATS not yet supported for provider Ollama
```

The `build_lats()` function in `crates/codegraph-mcp-rig/src/agent/builder.rs` only has cases for the OpenAI and Anthropic providers. Ollama falls through to the error case.
Fix: Add the Ollama case following the same pattern as the other providers:

```rust
#[cfg(feature = "ollama")]
RigProvider::Ollama => {
    let client = RigLLMAdapter::ollama_client();
    let model = client.completion_model(&model_name);
    Ok(Box::new(LatsAgent {
        model,
        factory,
        max_turns: self.max_turns,
        tier: self.tier,
    }))
}
```
This compiles and allows LATS to be invoked with Ollama (confirmed via debug logs showing `"framework":"AutoAgents-LATS"`).
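For anyone reproducing the fall-through described above: a minimal, self-contained sketch of the pattern, using stand-in types rather than the real `RigProvider`/`LatsAgent` (the names `Provider` and the stub `build_lats` below are hypothetical), shows how a provider without an explicit match arm lands in the catch-all error:

```rust
// Simplified illustration of the fall-through pattern. The enum and
// return types are stand-ins, not the real codegraph-mcp-rig types.
#[derive(Debug)]
enum Provider {
    OpenAi,
    Anthropic,
    Ollama,
}

fn build_lats(provider: Provider) -> Result<String, String> {
    match provider {
        Provider::OpenAi => Ok("LATS agent (OpenAI)".to_string()),
        Provider::Anthropic => Ok("LATS agent (Anthropic)".to_string()),
        // With no explicit Ollama arm, it falls through to the error case:
        other => Err(format!("LATS not yet supported for provider {:?}", other)),
    }
}

fn main() {
    assert!(build_lats(Provider::OpenAi).is_ok());
    // Ollama hits the catch-all error arm:
    assert_eq!(
        build_lats(Provider::Ollama).unwrap_err(),
        "LATS not yet supported for provider Ollama"
    );
    println!("ok");
}
```

Adding an explicit `Provider::Ollama` arm (as in the fix above) removes the fall-through without touching the other providers.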
Do you need me to open a PR for that? Or is there a specific reason for not having LATS support for Ollama?