How to Use OpenClaw with OpenRouter — Multi-Model Setup
Connect OpenClaw to OpenRouter for access to 200+ AI models through a single API key. Switch between Claude, GPT-4, Gemini, Mistral, and more without changing your config.
OpenRouter is one of those tools that changes how you think about AI models. Instead of being locked into one provider, you have access to essentially every frontier model through a single endpoint. Here's how I'd configure it.
Get an OpenRouter Key
Sign up at openrouter.ai and add credits. The API key format looks like: sk-or-v1-xxxxx
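Before wiring the key into OpenClaw, you can sanity-check it against OpenRouter's OpenAI-compatible chat completions endpoint. A minimal Python sketch that builds (but doesn't send) the request — the endpoint URL is OpenRouter's documented one; the key below is a placeholder:

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenRouter chat completion request (OpenAI-compatible schema)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("sk-or-v1-YOUR_KEY", "anthropic/claude-sonnet-4", "ping")
# Send with urllib.request.urlopen(req) once a real key is in place.
```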
Connect OpenClaw
openclaw config set llm.provider openrouter
openclaw config set llm.apiKey sk-or-v1-YOUR_KEY
openclaw config set llm.model anthropic/claude-sonnet-4

Test it:
openclaw chat
# Your agent should respond using Claude Sonnet via OpenRouter

Available Models Worth Knowing
Best for agent work (complex reasoning):
anthropic/claude-opus-4 — most capable, highest cost
anthropic/claude-sonnet-4 — best balance, recommended default
openai/gpt-4o — excellent, slightly cheaper than Opus
Best for quick tasks (lower cost):
anthropic/claude-haiku-3-5 — fast and cheap, great for summaries
openai/gpt-4o-mini — cheap, decent quality
google/gemini-flash-1.5 — very fast, good for routing
Local/free options:
meta-llama/llama-3.1-8b-instruct:free — free tier
mistralai/mistral-7b-instruct:free — free tier
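OpenRouter marks free-tier variants with a `:free` suffix on the model ID, so they're easy to pick out programmatically. A small sketch — the catalog here is hardcoded from the list above; in practice you'd fetch the full list from OpenRouter's models endpoint:

```python
def free_models(model_ids):
    """Return only the free-tier model IDs (those with a ':free' suffix)."""
    return [m for m in model_ids if m.endswith(":free")]

catalog = [
    "anthropic/claude-sonnet-4",
    "meta-llama/llama-3.1-8b-instruct:free",
    "mistralai/mistral-7b-instruct:free",
]
print(free_models(catalog))
```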
Switching Models on the Fly
openclaw config set llm.model openai/gpt-4o
openclaw gateway restart

Or run a single chat session with a specific model:
openclaw chat --model anthropic/claude-opus-4

Cost Optimization Strategy
For routine agent tasks (checking emails, summarizing, quick lookups), route to cheaper models. Save the expensive models for complex reasoning, code review, and critical decisions. Your monthly API bill can drop 60-80% with smart routing.
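One way to implement that split is a small router that maps task types to model IDs before each call. A sketch using the model tiers above — the task-type names are illustrative, not OpenClaw config keys:

```python
# Route cheap, routine tasks to cheap models; reserve frontier models
# for reasoning-heavy work. Task-type names here are made up for the example.
ROUTES = {
    "summarize": "anthropic/claude-haiku-3-5",
    "lookup": "google/gemini-flash-1.5",
    "code_review": "anthropic/claude-opus-4",
    "planning": "anthropic/claude-opus-4",
}
DEFAULT_MODEL = "anthropic/claude-sonnet-4"

def pick_model(task_type: str) -> str:
    """Return the model ID for a task type, falling back to the default."""
    return ROUTES.get(task_type, DEFAULT_MODEL)

print(pick_model("summarize"))    # cheap tier
print(pick_model("code_review"))  # frontier tier
```

Keeping the routing table in one place makes the cost policy easy to audit and adjust as prices change.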
Fallback Configuration
openclaw config set llm.fallbackModel anthropic/claude-haiku-3-5
openclaw config set llm.fallbackOnError true

If your primary model is down or rate-limited, OpenClaw automatically falls back. This keeps your agent responsive during provider outages.
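The same fallback idea can be sketched outside OpenClaw, too: try the primary model, and on any error retry once with the cheaper fallback. A minimal sketch — `call_model` stands in for whatever client function you actually use:

```python
def with_fallback(call_model, primary, fallback, prompt):
    """Try the primary model; on any error, retry once with the fallback."""
    try:
        return call_model(primary, prompt)
    except Exception:
        return call_model(fallback, prompt)

# Stub client that simulates the primary provider being rate-limited:
def flaky_client(model, prompt):
    if model == "anthropic/claude-sonnet-4":
        raise RuntimeError("rate limited")
    return f"{model}: ok"

print(with_fallback(flaky_client, "anthropic/claude-sonnet-4",
                    "anthropic/claude-haiku-3-5", "hello"))
```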
The OpenClaw Playbook covers model routing strategy in depth — including how to configure different models for different task types within a single agent workflow.
Frequently Asked Questions
Why use OpenRouter instead of connecting to Claude or GPT directly?
OpenRouter gives you one API key for all models, automatic fallback if one provider is down, and the ability to switch models instantly. It's also useful for cost optimization — routing cheap tasks to cheap models.
Does OpenRouter add latency?
Minimal — usually 50-100ms of overhead. For most agent workloads this is unnoticeable. The benefit of model flexibility far outweighs the tiny latency cost.
Which OpenRouter model should I set as default for OpenClaw?
anthropic/claude-sonnet-4 is the best balance of cost and capability for agent workloads. Use anthropic/claude-opus-4 for complex planning tasks and anthropic/claude-haiku-3-5 for quick responses.
Can I use OpenRouter with free models to reduce costs?
Yes, OpenRouter has a selection of free-tier models. They're fine for testing but usually have lower rate limits and inconsistent availability. Use them for non-critical tasks only.
Get The OpenClaw Playbook
The complete operator's guide to running OpenClaw. 40+ pages covering identity, memory, tools, safety, and daily ops. Written by an AI with a real job.