
Kilo Gateway

Kilo Gateway provides a unified API that routes requests to many models through a single endpoint and API key. It is OpenAI-compatible, so most OpenAI SDKs work once you switch the base URL.
| Property | Value |
| --- | --- |
| Provider | `kilocode` |
| Auth | `KILOCODE_API_KEY` |
| API | OpenAI-compatible |
| Base URL | `https://api.kilo.ai/api/gateway/` |
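Since the gateway speaks the OpenAI wire format, a standard chat-completions request can be pointed at the base URL above. The sketch below only assembles the request and sends nothing; the `/chat/completions` path and the bare (un-prefixed) on-wire model name follow the OpenAI convention and are assumptions here, not confirmed gateway details.

```python
import os
import json

# OpenAI-compatible base URL from the table above.
BASE_URL = "https://api.kilo.ai/api/gateway"

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble the URL, headers, and JSON body for one chat-completions call."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            # The gateway authenticates with a Bearer token carrying the API key.
            "Authorization": f"Bearer {os.environ.get('KILOCODE_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,  # e.g. "kilo/auto"; the kilocode/ prefix is an OpenClaw-side concept
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("kilo/auto", "Hello!")
print(req["url"])  # → https://api.kilo.ai/api/gateway/chat/completions
```

From here, any HTTP client (or an OpenAI SDK configured with this base URL) can send the request.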

Getting started

1. **Create an account**

   Go to app.kilo.ai, sign in or create an account, then navigate to API Keys and generate a new key.

2. **Run onboarding**

   ```
   openclaw onboard --auth-choice kilocode-api-key
   ```

   Or set the environment variable directly:

   ```
   export KILOCODE_API_KEY="<your-kilocode-api-key>" # pragma: allowlist secret
   ```

3. **Verify the model is available**

   ```
   openclaw models list --provider kilocode
   ```

Default model

The default model is kilocode/kilo/auto, a provider-owned smart-routing model managed by Kilo Gateway. OpenClaw treats kilocode/kilo/auto as the stable default ref, but the exact upstream routing behind it is owned by Kilo Gateway; OpenClaw neither hard-codes it nor publishes a source-backed task-to-upstream-model mapping for that route.

Available models

OpenClaw dynamically discovers available models from the Kilo Gateway at startup. Use /models kilocode to see the full list of models available with your account. Any model available on the gateway can be used with the kilocode/ prefix:
| Model ref | Notes |
| --- | --- |
| `kilocode/kilo/auto` | Default — smart routing |
| `kilocode/anthropic/claude-sonnet-4` | Anthropic via Kilo |
| `kilocode/openai/gpt-5.4` | OpenAI via Kilo |
| `kilocode/google/gemini-3-pro-preview` | Google via Kilo |
| …and many more | Use /models kilocode to list all |
At startup, OpenClaw queries GET https://api.kilo.ai/api/gateway/models and merges discovered models ahead of the static fallback catalog. The bundled fallback always includes kilocode/kilo/auto (Kilo Auto) with input: ["text", "image"], reasoning: true, contextWindow: 1000000, and maxTokens: 128000.
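The discovery-then-fallback behavior described above can be sketched as follows; the function names are illustrative, not OpenClaw internals.

```python
from typing import Callable

# Bundled static fallback: always contains the default ref, with the
# capabilities documented above.
STATIC_FALLBACK = {
    "kilocode/kilo/auto": {
        "name": "Kilo Auto",
        "input": ["text", "image"],
        "reasoning": True,
        "contextWindow": 1_000_000,
        "maxTokens": 128_000,
    },
}

def merge_models(discover: Callable[[], dict]) -> dict:
    """Query the gateway's models endpoint; on failure, fall back to the
    bundled static catalog. Discovered entries take precedence."""
    try:
        discovered = discover()
    except Exception:
        # Discovery failed at startup: serve the static catalog only.
        return dict(STATIC_FALLBACK)
    # Discovered models are merged ahead of the static fallback.
    return {**STATIC_FALLBACK, **discovered}

# Example: discovery succeeds and contributes a concrete model ref.
catalog = merge_models(
    lambda: {"kilocode/anthropic/claude-sonnet-4": {"name": "Claude Sonnet 4"}}
)
print(sorted(catalog))
```

In the real client, `discover` would be the `GET https://api.kilo.ai/api/gateway/models` call; here it is stubbed so the merge logic is visible on its own.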

Config example

```json5
{
  env: { KILOCODE_API_KEY: "<your-kilocode-api-key>" }, // pragma: allowlist secret
  agents: {
    defaults: {
      model: { primary: "kilocode/kilo/auto" },
    },
  },
}
```
- Kilo Gateway is documented in source as OpenRouter-compatible, so it stays on the proxy-style OpenAI-compatible path rather than native OpenAI request shaping.
- Gemini-backed Kilo refs stay on the proxy-Gemini path, so OpenClaw keeps Gemini thought-signature sanitation there without enabling native Gemini replay validation or bootstrap rewrites.
- Kilo Gateway uses a Bearer token with your API key under the hood.
- Kilo's shared stream wrapper adds the provider app header and normalizes proxy reasoning payloads for supported concrete model refs. kilocode/kilo/auto and other proxy-reasoning-unsupported hints skip reasoning injection; if you need reasoning support, use a concrete model ref such as kilocode/anthropic/claude-sonnet-4.
- If model discovery fails at startup, OpenClaw falls back to the bundled static catalog containing kilocode/kilo/auto.
- Confirm your API key is valid and that your Kilo account has the desired models enabled.
- When the Gateway runs as a daemon, ensure KILOCODE_API_KEY is available to that process (for example in ~/.openclaw/.env or via env.shellEnv).
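The reasoning-injection rule above reduces to a small predicate. A minimal sketch, with the helper name and hint set chosen for illustration rather than taken from OpenClaw's source:

```python
# Proxy reasoning payloads are only normalized for concrete model refs;
# smart-routing hints such as kilo/auto skip reasoning injection.
REASONING_UNSUPPORTED_HINTS = {"kilocode/kilo/auto"}

def should_inject_reasoning(model_ref: str) -> bool:
    """Return True when the ref is a concrete model eligible for
    proxy reasoning normalization."""
    return model_ref not in REASONING_UNSUPPORTED_HINTS

print(should_inject_reasoning("kilocode/kilo/auto"))                  # → False
print(should_inject_reasoning("kilocode/anthropic/claude-sonnet-4"))  # → True
```

Practically: if your workflow depends on reasoning output, pin a concrete ref in config instead of relying on the auto route.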

- **Model selection**: Choosing providers, model refs, and failover behavior.
- **Configuration reference**: Full OpenClaw configuration reference.
- **Kilo Gateway**: Kilo Gateway dashboard, API keys, and account management.