feat: Add proxy support and explicit model override for LLM pipeline
- Add ANTHROPIC_BASE_URL env var support in LLMManager and
AnthropicProvider — enables local proxies (e.g. localhost:3456)
- Add MODEL env var to cli.py config dict — was missing, so explicit
model selection via env var didn't reach LLMManager
- Skip model fallback chain when MODEL is explicitly set (not "auto") —
prevents hitting rate-limited Claude models when using alternative
models like gpt-5.3-codex through an Anthropic-compatible proxy
- Pass anthropic_base_url through config dict from cli.py
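The changes above can be sketched roughly as follows. This is a hypothetical illustration, not the actual cli.py/LLMManager code: the function names `build_config` and `select_model` and the `fallback_chain` parameter are assumptions; only the `MODEL` / `ANTHROPIC_BASE_URL` env vars and the config keys come from the commit message.

```python
import os

def build_config() -> dict:
    """Hypothetical sketch of the cli.py config dict described above."""
    return {
        # MODEL was previously missing from the config dict, so an
        # explicit selection never reached LLMManager; "auto" keeps
        # the normal fallback behavior.
        "model": os.environ.get("MODEL", "auto"),
        # Optional Anthropic-compatible proxy, e.g. http://localhost:3456
        "anthropic_base_url": os.environ.get("ANTHROPIC_BASE_URL"),
    }

def select_model(config: dict, fallback_chain: list[str]) -> list[str]:
    """Skip the fallback chain when a model is explicitly set."""
    model = config.get("model", "auto")
    if model != "auto":
        # Explicit override: use only this model, avoiding
        # rate-limited Claude defaults behind the proxy.
        return [model]
    return fallback_chain
```

With `MODEL=gpt-5.3-codex` set, `select_model` returns a single-element list and the fallback chain is never consulted; with `MODEL` unset or `auto`, behavior is unchanged.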
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>