mirror of
https://github.com/ultraworkers/claw-code.git
synced 2026-04-11 02:20:18 +08:00
When OPENAI_BASE_URL is set, the user has explicitly configured an OpenAI-compatible endpoint (Ollama, LM Studio, vLLM, etc.). Model names like 'qwen2.5-coder:7b' or 'llama3:latest' don't match any recognized prefix, so detect_provider_kind() fell through to Anthropic, asking for Anthropic credentials even though the user clearly intended a local provider.

Now:
- OPENAI_BASE_URL + OPENAI_API_KEY beats the Anthropic env-check in the cascade.
- OPENAI_BASE_URL alone (no API key, common for Ollama) is a last-resort fallback before the Anthropic default.

Source: MaxDerVerpeilte in #claw-code (Ollama + qwen2.5-coder:7b); traced by gaebal-gajae.
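The cascade described above could be sketched roughly as follows. This is a minimal illustration, not the actual claw-code implementation; the function name detect_provider_kind() comes from the text, but the return values and the ANTHROPIC_API_KEY check are assumptions.

```python
import os

def detect_provider_kind(env=None):
    # env defaults to the process environment; a dict can be passed for testing.
    if env is None:
        env = os.environ
    base_url = env.get("OPENAI_BASE_URL")
    openai_key = env.get("OPENAI_API_KEY")
    # 1. Explicit OpenAI-compatible endpoint *with* a key now beats the
    #    Anthropic env-check.
    if base_url and openai_key:
        return "openai-compatible"
    # 2. Anthropic credentials present (assumed env var name).
    if env.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    # 3. Base URL alone (common for Ollama, which needs no key) is the
    #    last-resort fallback before the Anthropic default.
    if base_url:
        return "openai-compatible"
    # 4. Default.
    return "anthropic"
```

With this ordering, an Ollama setup exporting only OPENAI_BASE_URL resolves to the OpenAI-compatible path instead of prompting for Anthropic credentials.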