A lightweight gateway server implementing the OpenClaw Gateway Protocol v3. Connects AI models (Ollama, OpenAI-compatible APIs) to clients over WebSocket and HTTP, with MCP tool-use support.
MiniClaw exists for two reasons:
- Test OpenClaw client apps without the full backend. MiniClaw implements the complete protocol v3 surface -- handshake, auth, sessions, agent streaming, presence, all 80+ RPC methods -- so you can develop and test OpenClaw clients against a real, running server without spinning up the entire OpenClaw stack. Demo mode works with zero dependencies.
- Run a micro agent compatible with OpenClaw tooling. Point MiniClaw at Ollama or any OpenAI-compatible API and you get a standalone agent that speaks the same protocol as OpenClaw. Existing clients, dashboards, and integrations built for OpenClaw just work.
- WebSocket RPC -- Full protocol v3 handshake, auth (token/password), presence, sessions, agent runs with streaming events
- HTTP API -- OpenAI-compatible /v1/chat/completions endpoint (streaming + non-streaming)
- Provider backends -- Ollama native API, any OpenAI-compatible API (OpenRouter, local vLLM, etc.)
- MCP integration -- Acts as both MCP server (expose chat/session tools via stdio) and MCP client (connect to external tool servers)
- Tool use -- Models can call tools mid-conversation with automatic result injection and multi-turn loops (sketched below)
- Session management -- Multi-session chat history, inject/reset/delete/patch, idempotency deduplication
- Demo mode -- Runs without any model backend using keyword-matched responses
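The tool-use loop mentioned above is driven by MiniClaw itself, so clients never write it by hand. Purely for illustration, here is roughly what such a loop looks like against a generic OpenAI-compatible chat API; the endpoint, the `get_time` tool, and the bounded turn count are assumptions for this sketch, not MiniClaw internals:

```ts
// Illustrative multi-turn tool loop against a generic OpenAI-compatible API (not MiniClaw's code).
type Message = { role: string; content: string | null; tool_calls?: any[]; tool_call_id?: string };

const tools = [{
  type: "function",
  function: {
    name: "get_time",                        // hypothetical example tool
    description: "Return the current time",
    parameters: { type: "object", properties: {} },
  },
}];

async function runToolLoop(baseUrl: string, model: string, messages: Message[]) {
  for (let turn = 0; turn < 5; turn++) {     // bound the loop
    const res = await fetch(`${baseUrl}/v1/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, tools }),
    });
    const msg = (await res.json()).choices[0].message as Message;
    messages.push(msg);
    if (!msg.tool_calls?.length) return msg.content;   // no tool calls: model is done
    for (const call of msg.tool_calls) {
      // Execute the tool and inject the result so the model sees it on the next turn.
      const result = call.function.name === "get_time" ? new Date().toISOString() : "unknown tool";
      messages.push({ role: "tool", tool_call_id: call.id, content: result });
    }
  }
}
```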
# Install dependencies
bun install
# Run in demo mode (no model needed)
bun run index.ts
# Run with Ollama
ollama serve &
ollama pull qwen3:4b
bun run index.ts --ollama
# Run with a specific model
bun run index.ts --ollama --model llama3.2:3b
The server starts on ws://localhost:8080 by default.
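As a quick sanity check that the server is listening, you can open a raw WebSocket and print the first frames it sends (per the handshake description further down, that should be hello plus connect.challenge). A minimal sketch, assuming the default port:

```ts
// Connectivity check only: connect, print the server's first frame, disconnect.
const ws = new WebSocket("ws://localhost:8080");
ws.onmessage = (event) => {
  console.log("server frame:", event.data);  // expect hello / connect.challenge
  ws.close();
};
ws.onerror = (err) => console.error("connection failed:", err);
```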
OLLAMA_BASE_URL=http://localhost:11434 OLLAMA_MODEL=qwen3:4b bun run index.ts --ollama
Copy the example config and fill in your credentials:
cp openclaw.json.example openclaw.json
{
"agents": {
"defaults": {
"model": { "primary": "openrouter/deepseek/deepseek-chat-v3-0324" }
}
},
"models": {
"providers": {
"openrouter": {
"baseUrl": "https://openrouter.ai/api/v1",
"apiKey": "${OPENROUTER_API_KEY}",
"api": "openai-completions",
"models": [
{ "id": "deepseek/deepseek-chat-v3-0324", "name": "DeepSeek V3" }
]
}
}
}
}
API keys use ${ENV_VAR} syntax -- set the env var, don't hardcode secrets.
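The substitution is plain string expansion at load time; a minimal sketch of the idea (the real logic lives in src/config.ts and may differ in detail):

```ts
// Sketch of ${ENV_VAR} expansion in config values (see src/config.ts for the actual loader).
function expandEnv(value: string): string {
  return value.replace(/\$\{([A-Z0-9_]+)\}/g, (match, name) => {
    const v = process.env[name];
    if (v === undefined) {
      console.warn(`env var ${name} is not set`);  // keep the placeholder if unset
      return match;
    }
    return v;
  });
}
// "${OPENROUTER_API_KEY}" -> value of OPENROUTER_API_KEY at startup
```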
OPENROUTER_API_KEY=sk-or-... bun run index.ts
The model ref format is provider/model-id. Pass --model to override the default:
bun run index.ts --model openrouter/google/gemini-2.5-flash-preview
MiniClaw can run as an MCP server over stdio, exposing chat, clear_session, and list_models tools. This lets Claude Desktop, Cursor, or any MCP client talk to your local models.
bun run mcp-server.ts
Add to your MCP client config:
{
"mcpServers": {
"miniclaw": {
"command": "bun",
"args": ["run", "/path/to/miniclaw/mcp-server.ts"]
}
}
}
MiniClaw can also connect to external MCP tool servers, making their tools available to the model. Create an mcp.json:
{
"mcpServers": {
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
}
}
}
bun run index.ts --ollama --mcp
The /v1/chat/completions endpoint is OpenAI-compatible:
# Non-streaming
curl http://localhost:8080/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{"messages": [{"role": "user", "content": "hello"}], "stream": false}'
# Streaming
curl http://localhost:8080/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Accept: text/event-stream" \
-d '{"messages": [{"role": "user", "content": "hello"}], "stream": true}'If authToken is configured, pass Authorization: Bearer <token>.
Connect via WebSocket and complete the handshake:
// Server sends: hello + connect.challenge
// Client sends:
{"type":"req","id":"1","method":"connect","params":{
"minProtocol":3,"maxProtocol":3,
"client":{"id":"my-app","version":"1.0.0","platform":"web","mode":"operator"}
}}
// Server responds: hello-ok with features, snapshot, policy
// Send a chat message:
{"type":"req","id":"2","method":"chat.send","params":{
"sessionKey":"main","message":"What is 2+2?","idempotencyKey":"abc123"
}}
// Server streams: agent events (lifecycle, reasoning, tool, assistant) + chat events (delta, final)
See protocol-spec.md for the full specification.
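Put together as a tiny client, using the frames shown above (a sketch only: it skips auth, and the exact shape of the server's response frame is an assumption here -- check protocol-spec.md for the authoritative format):

```ts
// Minimal protocol v3 client sketch: connect handshake, then one chat.send.
const ws = new WebSocket("ws://localhost:8080");

ws.onopen = () => {
  ws.send(JSON.stringify({
    type: "req", id: "1", method: "connect", params: {
      minProtocol: 3, maxProtocol: 3,
      client: { id: "my-app", version: "1.0.0", platform: "web", mode: "operator" },
    },
  }));
};

ws.onmessage = (event) => {
  const frame = JSON.parse(String(event.data));
  console.log("<-", frame.type, frame.method ?? frame.id ?? "");
  // Assumption: the hello-ok reply to the connect request echoes id "1".
  if (frame.id === "1" && frame.type !== "req") {
    ws.send(JSON.stringify({
      type: "req", id: "2", method: "chat.send", params: {
        sessionKey: "main", message: "What is 2+2?", idempotencyKey: "abc123",
      },
    }));
  }
  // Agent events and chat events (delta, final) then stream in as further frames.
};
```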
# Run tests (137 tests)
bun test
# Type check
bun run typecheck
# Lint
bun run lint
# Test with coverage
bun run test:coverage
This repo includes a Playwright smoke test that:
- launches MiniClaw (demo mode) on ws://127.0.0.1:18080
- launches MobileClaw
- opens MobileClaw in detached mode with ?detached=1&url=...
- sends a message
- waits for the assistant response
Prereqs:
# miniclaw deps
bun install
By default, the e2e harness will clone wende/mobileclaw into /Users/wende/projects/mobileclaw if missing, then run pnpm install.
Run:
# Install browser once
bunx playwright install chromium
# From miniclaw repo
bun run test:e2e
Optional overrides:
- MOBILECLAW_START_CMD (default: pnpm dev --port 3000)
- MINICLAW_START_CMD (default: bun run index.ts --config e2e/openclaw.e2e.json)
- MOBILECLAW_BASE_URL (default: http://127.0.0.1:3000)
- MOBILECLAW_REUSE_EXISTING (default: 1; set to 0 to force a fresh launch)
- MOBILECLAW_GIT_URL (default: https://github.com/wende/mobileclaw.git)
- MOBILECLAW_GIT_REF (optional branch/tag to clone/checkout)
- MOBILECLAW_INSTALL_CMD (default: pnpm install)
- MOBILECLAW_AUTO_PULL (default: 0; set to 1 to git pull --ff-only on existing checkout)
- MOBILECLAW_PORT / MINICLAW_PORT
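For reference, the harness presumably resolves these with env-or-default logic along these lines (a sketch using the defaults listed above; see e2e/ for the actual code):

```ts
// Sketch of env-or-default resolution for the overrides above (not the actual e2e harness code).
const env = process.env;
const e2eConfig = {
  mobileclawStartCmd: env.MOBILECLAW_START_CMD ?? "pnpm dev --port 3000",
  miniclawStartCmd: env.MINICLAW_START_CMD ?? "bun run index.ts --config e2e/openclaw.e2e.json",
  mobileclawBaseUrl: env.MOBILECLAW_BASE_URL ?? "http://127.0.0.1:3000",
  reuseExisting: (env.MOBILECLAW_REUSE_EXISTING ?? "1") !== "0",
  gitUrl: env.MOBILECLAW_GIT_URL ?? "https://github.com/wende/mobileclaw.git",
  installCmd: env.MOBILECLAW_INSTALL_CMD ?? "pnpm install",
  autoPull: (env.MOBILECLAW_AUTO_PULL ?? "0") === "1",
};
```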
index.ts # CLI entrypoint (WebSocket server)
mcp-server.ts # MCP stdio entrypoint (WebSocket + MCP server)
openclaw.json.example # Provider config template
mcp.json # MCP client tool server config
protocol-spec.md # OpenClaw Gateway Protocol v3 specification
e2e/ # Playwright e2e smoke test + e2e config
src/
server.ts # Core WebSocket server, RPC routing, session/run management
server.test.ts # Tests (137 tests covering all protocol methods)
ollama.ts # Ollama native API streaming handler + mock tools
openai-compat.ts # OpenAI-compatible API streaming handler
config.ts # openclaw.json loader and model resolution
mcp-server.ts # MCP server setup (chat/clear/list tools)
mcp-client.ts # MCP client manager (connect to external tool servers)
types.ts # Protocol frame types, config types
demo-responses.ts # Keyword-matched demo responses for no-model mode
MIT