Documentation Index
Fetch the complete documentation index at: https://docs.openclaw.ai/llms.txt
Use this file to discover all available pages before exploring further.
Model Providers
OpenClaw can use many LLM providers. Pick one, authenticate, then set the default model as `provider/model`.
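The `provider/model` format splits cleanly with plain shell parameter expansion; a minimal sketch (the model id below is only an example value, not necessarily a real model):

```shell
# Split a provider/model default into its two parts.
# "anthropic/claude-sonnet-4" is an example value for illustration.
default_model="anthropic/claude-sonnet-4"

provider="${default_model%%/*}"   # text before the first slash
model="${default_model#*/}"       # text after the first slash

echo "provider=$provider"
echo "model=$model"
```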
Quick start (two steps)
- Authenticate with the provider (usually via `openclaw onboard`).
- Set the default model.
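Put together, the two steps look roughly like this. This is a sketch: `openclaw onboard` and the `models auth login ... --set-default` form are commands mentioned on this page, but the provider name is only an example.

```shell
# Step 1: authenticate with the provider (interactive onboarding).
openclaw onboard

# Step 2: set the chosen provider's model as the default.
# "anthropic" is an example; substitute your provider.
openclaw models auth login --provider anthropic --set-default
```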
Supported providers (starter set)
- Alibaba Model Studio
- Amazon Bedrock
- Anthropic (API + Claude CLI)
- BytePlus (International)
- Chutes
- ComfyUI
- Cloudflare AI Gateway
- DeepInfra
- fal
- Fireworks
- GLM models
- MiniMax
- Mistral
- Moonshot AI (Kimi + Kimi Coding)
- OpenAI (API + Codex)
- OpenCode (Zen + Go)
- OpenRouter
- Qianfan
- Qwen
- Runway
- StepFun
- Synthetic
- Vercel AI Gateway
- Venice (Venice AI)
- xAI
- Z.AI
Additional bundled provider variants
- `anthropic-vertex`: implicit Anthropic on Google Vertex support when Vertex credentials are available; no separate onboarding auth choice.
- `copilot-proxy`: local VS Code Copilot Proxy bridge; use `openclaw onboard --auth-choice copilot-proxy`.
- `google-gemini-cli`: unofficial Gemini CLI OAuth flow; requires a local `gemini` install (`brew install gemini-cli` or `npm install -g @google/gemini-cli`); default model `google-gemini-cli/gemini-3-flash-preview`; use `openclaw onboard --auth-choice google-gemini-cli` or `openclaw models auth login --provider google-gemini-cli --set-default`.
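Since the `google-gemini-cli` flow depends on a local `gemini` binary, a quick pre-flight check can save a failed onboarding attempt. A minimal sketch; the `have` helper is hypothetical, while the install and onboarding commands are the ones listed above:

```shell
# Hypothetical helper: does a command exist on PATH?
have() { command -v "$1" >/dev/null 2>&1; }

if have gemini; then
  # Safe to start the OAuth flow documented above.
  echo "gemini CLI found; run: openclaw onboard --auth-choice google-gemini-cli"
else
  echo "gemini CLI missing; install it first (brew install gemini-cli, or npm install -g @google/gemini-cli)"
fi
```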