Quick start
- Install – If OpenClaw isn't installed, Ollama prompts to install it via npm
- Security – On first launch, a security notice explains the risks of tool access
- Model – Pick a model from the selector (local or cloud)
- Onboarding – Ollama configures the provider, installs the gateway daemon, and sets your model as the primary
- Gateway – Starts in the background and opens the OpenClaw TUI
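The steps above are driven by a single launch command. The command name `openclaw` is an assumption inferred from the alias note later in this section, and may differ in your install:

```shell
# Installs OpenClaw if needed, runs onboarding, starts the gateway
# in the background, and opens the TUI.
# Name assumed from the alias note; `ollama launch clawdbot` also works.
ollama launch openclaw
```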
OpenClaw requires a large context window. For local models, a context window of at least 64k tokens is recommended. See Context length for more information.
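If you serve local models through Ollama, one way to meet the 64k recommendation is the `OLLAMA_CONTEXT_LENGTH` environment variable on the server process (available in recent Ollama releases; whether your version supports it is an assumption here, not something this guide states):

```shell
# Serve local models with a 64k-token context window.
OLLAMA_CONTEXT_LENGTH=65536 ollama serve
```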
Previously known as Clawdbot.
`ollama launch clawdbot` still works as an alias.

Configure without launching
To change the model without starting the gateway and TUI:

Recommended models
Cloud models:
- kimi-k2.5:cloud – Multimodal reasoning with subagents
- minimax-m2.5:cloud – Fast, efficient coding and real-world productivity
- glm-5:cloud – Reasoning and code generation
Local models:
- glm-4.7-flash – Reasoning and code generation locally (~25 GB VRAM)

