Minimax with OpenClaw

Configure Minimax M2.5 — API key or Ollama cloud

OpenClaw supports Minimax models, which are often praised for responsive, local-like performance and strong reasoning. You can call the Minimax API directly or access the models through Ollama cloud (e.g. minimax-m2.5).

Models

  • M2.5 — Flagship model, strong across tasks
  • abab7 — Smaller, faster variant

Reference models with the provider prefix, e.g. minimax/minimax-m2.5. Check docs.openclaw.ai for current model IDs.

Two Ways to Use Minimax

1. Direct API

Get an API key from Minimax. Onboard with openclaw onboard --auth-choice minimax-api-key, or add the provider manually to your config file.
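Before onboarding, it can help to confirm the key is actually exported in your shell. A minimal pre-flight sketch (the function name here is hypothetical, not part of OpenClaw):

```python
import os

def check_minimax_key(env=os.environ) -> bool:
    """Return True if a MINIMAX_API_KEY is present in the given environment."""
    return bool(env.get("MINIMAX_API_KEY"))

if __name__ == "__main__":
    if check_minimax_key():
        print("MINIMAX_API_KEY found; proceed with onboarding.")
    else:
        print("Export MINIMAX_API_KEY before running `openclaw onboard`.")
```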

2. Via Ollama Cloud

Ollama supports Minimax cloud models. Run:

Ollama cloud
ollama launch openclaw --model minimax-m2.5:cloud

No separate Minimax API key is needed when you go through Ollama. See the Ollama + OpenClaw tutorial.
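If you instead point OpenClaw at a running Ollama instance yourself, the provider entry might look like the sketch below. The provider name ollama, the baseUrl key, and the model reference are assumptions here (11434 is Ollama's default port); check the Ollama + OpenClaw tutorial for the exact schema.

```json
{
  "agent": { "model": "ollama/minimax-m2.5:cloud" },
  "models": {
    "providers": {
      "ollama": {
        "baseUrl": "http://localhost:11434"
      }
    }
  }
}
```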

Basic Configuration (Direct API)

Example
{
  "agent": { "model": "minimax/minimax-m2.5" },
  "models": {
    "providers": {
      "minimax": {
        "apiKey": "${MINIMAX_API_KEY}"
      }
    }
  }
}

Store credentials in ~/.openclaw/credentials or with openclaw secrets rather than committing keys to the config file.
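The ${MINIMAX_API_KEY} placeholder above is resolved from the environment at load time. A minimal sketch of that substitution, assuming simple ${VAR} syntax (OpenClaw's real loader may handle placeholders differently):

```python
import json
import os
import re

def expand_env(raw: str, env=os.environ) -> str:
    """Replace ${VAR} placeholders in a raw config string with env values."""
    return re.sub(r"\$\{(\w+)\}", lambda m: env.get(m.group(1), ""), raw)

if __name__ == "__main__":
    raw = '{"minimax": {"apiKey": "${MINIMAX_API_KEY}"}}'
    cfg = json.loads(expand_env(raw, {"MINIMAX_API_KEY": "sk-demo"}))
    print(cfg["minimax"]["apiKey"])
```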

Why Minimax for OpenClaw

  • Local-like performance — Often feels snappy and responsive
  • Reasoning — Strong on complex tasks
  • Ollama integration — Easy path via Ollama cloud