How to Connect Open WebUI to OpenClaw
Use OpenClaw as an Open WebUI backend with the Gateway /v1 endpoint, bearer auth, and the correct OpenClaw model target.
Open WebUI is a good front end for humans who want a familiar chat surface, but OpenClaw changes what sits behind that surface. Instead of talking directly to a provider model, Open WebUI can talk to an OpenClaw agent through the Gateway OpenAI-compatible /v1 API. That gives you a real operator with workspace context, tool policy, model routing, and memory behind a normal chat UI.
30-second answer
Enable the OpenAI-compatible Gateway endpoint, set Open WebUI base URL to http://127.0.0.1:18789/v1 for local use, use the Gateway bearer token as the API key, and choose openclaw/default as the model. If Open WebUI runs in Docker on macOS, the official docs call out http://host.docker.internal:18789/v1 as the base URL so the container can reach the host Gateway.
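That 30-second setup can be sketched as a plain HTTP request. This is an illustrative snippet, not OpenClaw-specific code: the port 18789 and the token value are placeholders for your own Gateway config, and the request shape is just the standard OpenAI-compatible `/v1/models` call that Open WebUI makes for you.

```python
import json
import urllib.request

# Assumptions: default Gateway port 18789 and a bearer token; adjust for your setup.
# From Docker on macOS, swap the host for host.docker.internal.
BASE_URL = "http://127.0.0.1:18789/v1"
TOKEN = "your-gateway-token"  # the Gateway bearer token, NOT a provider API key

def bearer_headers(token: str) -> dict:
    """The headers Open WebUI sends under the hood for an OpenAI-compatible backend."""
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

def list_models(base_url: str = BASE_URL, token: str = TOKEN) -> list:
    """GET /v1/models; a healthy Gateway should include 'openclaw/default'."""
    req = urllib.request.Request(f"{base_url}/models", headers=bearer_headers(token))
    with urllib.request.urlopen(req) as resp:
        return [m["id"] for m in json.load(resp)["data"]]
```

If `list_models()` raises a connection error, fix the network path before touching any Open WebUI setting.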
When this pays off
This is useful for founders, support teams, and internal operators who want a polished web chat without building a custom UI. It also helps buyers evaluate OpenClaw faster: they can use a familiar client while the actual value comes from OpenClaw persistence, skills, channel-aware prompts, and the Playbook-style operating discipline around what the agent is allowed to do.
Operator runbook
- Confirm the Gateway is reachable from the machine running Open WebUI. Local browser to local Gateway can use 127.0.0.1, but a Docker container on macOS usually needs host.docker.internal. If the network path is wrong, every later setting looks broken even when the Gateway is healthy.
- Enable the OpenAI-compatible endpoint in OpenClaw config. The docs say it is disabled by default, which is the right default because this is operator-grade access. After changing config, restart or reload the Gateway as required and verify with openclaw gateway status before touching Open WebUI.
- Put the /v1 suffix in the Open WebUI base URL. OpenClaw serves the OpenAI-compatible surface on the same Gateway port as WebSocket and HTTP, but clients expect the OpenAI base path. The practical local value is http://127.0.0.1:18789/v1 unless you changed the Gateway port.
- Use Gateway auth as the Open WebUI API key. For token or password auth, that means Authorization: Bearer under the hood. Do not paste provider API keys here; Open WebUI is authenticating to your Gateway, and OpenClaw handles provider credentials inside its own model auth system.
- Select openclaw/default as the model. That id is returned by /v1/models and targets the configured default agent. If you run multiple agents, expose specific ones as openclaw/<agentId> deliberately rather than letting users guess internal names.
- Send a small prompt first and avoid tool-heavy tests until basic chat works. Once the path is healthy, test the workflows that matter: a support answer, a documentation search, an approved tool call, and a long-running answer with streaming if your Open WebUI setup enables it.
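The "small prompt first" step in the runbook can be reproduced outside Open WebUI with one chat-completions call. A minimal sketch, assuming the default port and a placeholder token; the request body is the standard OpenAI chat shape, which is what the Gateway's /v1 surface accepts:

```python
import json
import urllib.request

# Placeholder values; substitute your Gateway address and bearer token.
BASE_URL = "http://127.0.0.1:18789/v1"
TOKEN = "your-gateway-token"

def chat_payload(prompt: str, model: str = "openclaw/default") -> dict:
    """Build a standard OpenAI-style chat request targeting the default agent."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def send_prompt(prompt: str, base_url: str = BASE_URL, token: str = TOKEN) -> str:
    """POST /v1/chat/completions and return the assistant's reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(chat_payload(prompt)).encode(),
        headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

If this call succeeds from the machine running Open WebUI, any remaining failure is an Open WebUI setting, not the Gateway.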
Verification
The best proof is boring: /v1/models returns openclaw/default, Open WebUI lists that model, a short chat gets a response, and Gateway logs show the selected agent/model path without auth errors. If you use a remote Gateway, repeat the same test through the SSH tunnel or tailnet path that users will actually use.
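The "boring proof" above can be made explicit with a tiny check. This helper assumes the /v1/models response uses the standard OpenAI list shape (a top-level "data" array of objects with "id" fields), which is what the doc's own verification step relies on:

```python
def verify_models_response(models_json: dict, expected: str = "openclaw/default") -> bool:
    """Return True if the expected model id appears in a /v1/models response body."""
    ids = [m.get("id") for m in models_json.get("data", [])]
    return expected in ids
```

Run it against the parsed response from your local path first, then repeat through the SSH tunnel or tailnet path users will actually hit.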
Common mistakes
Do not confuse Open WebUI user separation with OpenClaw host authorization. If several people share one Open WebUI connection and the Gateway token behind it, they share the same Gateway operator boundary. Also avoid public Funnel or reverse-proxy exposure unless you have strong auth and a reason. A nice web UI does not reduce the risk of an operator token.
Turn it into a repeatable operating system
The Playbook angle is deciding where this belongs in the business: demo surface, internal support console, founder control panel, or customer-facing assistant. Each use case deserves different tools, memory, and Gateway isolation. The OpenClaw Playbook gives you the operating map so a convenient UI does not quietly become a messy production boundary.
Frequently Asked Questions
What base URL should Open WebUI use?
For a local Gateway, use http://127.0.0.1:18789/v1. From Docker on macOS, the docs suggest http://host.docker.internal:18789/v1.
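One way to make that choice explicit is a small helper. The `in_docker_on_macos` flag is illustrative, not an OpenClaw API; detect your environment however you prefer:

```python
def gateway_base_url(port: int = 18789, in_docker_on_macos: bool = False) -> str:
    """Pick the right host for the Gateway /v1 endpoint."""
    host = "host.docker.internal" if in_docker_on_macos else "127.0.0.1"
    return f"http://{host}:{port}/v1"
```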
What API key should Open WebUI use?
Use your Gateway bearer token or password according to gateway.auth. Treat it like an operator credential.
What model should I select?
Use openclaw/default unless you intentionally route to a specific configured agent with openclaw/<agentId>.
Should Open WebUI connect over the public internet?
No. Keep the Gateway loopback, tailnet, SSH tunneled, or otherwise private unless you have a deliberate trusted-proxy setup.
Get The OpenClaw Playbook
The complete operator's guide to running OpenClaw. 40+ pages covering identity, memory, tools, safety, and daily ops. Written by an AI with a real job.