OpenClaw is an incredible agent runtime — 800k lines of code, full Linux toolbox, MCP support, and session persistence. But it’s single-tenant by design. One user, one instance, one machine.
What if you could give every user in your app their own OpenClaw agent — sandboxed, persistent, with network isolation — without running separate instances?
That’s what Lobu does. It wraps OpenClaw in multi-tenant infrastructure so you can embed it in your Next.js app with five lines of code. No Docker, no Kubernetes.
What you’ll build
A Next.js app where:
- Each user gets their own sandboxed OpenClaw agent with persistent workspace
- Agents have the full OpenClaw toolbox — file editing, shell, grep, MCP tools
- Network isolation — agents can only reach domains you allow
- Your app controls the HTTP server; Lobu handles the multi-tenant orchestration
Prerequisites
- Node.js 18+ (or Bun)
- A Next.js project (App Router)
- A Redis instance (local, Upstash, Dragonfly — anything works)
- An LLM provider API key (OpenRouter, OpenAI, Anthropic, etc.)
Step 1: Install
```bash
npm install @lobu/gateway
# or
bun add @lobu/gateway
```
Step 2: Create the catch-all route
Create `app/api/lobu/[...path]/route.ts`:
```typescript
import { Lobu } from "@lobu/gateway";

const lobu = new Lobu({
  redis: process.env.REDIS_URL!,
  adminPassword: process.env.LOBU_ADMIN_PASSWORD!,
  agents: [
    {
      id: "assistant",
      name: "Assistant",
      identity: "You are a helpful assistant for our product.",
      soul: "Be concise. Ask clarifying questions when unsure.",
      providers: [{ id: "openrouter", key: process.env.OPENROUTER_API_KEY! }],
      skills: ["github"],
      network: { allowed: ["github.com", ".github.com"] },
    },
  ],
});

const initialized = lobu.initialize();

async function handler(req: Request) {
  await initialized;
  return lobu.getApp().fetch(req);
}

export const GET = handler;
export const POST = handler;
export const PUT = handler;
export const DELETE = handler;
```
That’s the entire integration. `lobu.initialize()` runs once; every request after that is handled by the Lobu gateway.
Step 3: Add environment variables
```
REDIS_URL=redis://localhost:6379
OPENROUTER_API_KEY=sk-or-...
LOBU_ADMIN_PASSWORD=change-me
```
Step 4: Chat with your agent
Start your Next.js app and send a message:
```bash
npm run dev
```
```bash
curl -X POST http://localhost:3000/api/lobu/api/v1/agents/assistant/messages \
  -H "Authorization: Bearer $LOBU_ADMIN_PASSWORD" \
  -H "Content-Type: application/json" \
  -d '{ "platform": "api", "content": "Hello, what can you do?" }'
```
Or use the Lobu CLI after adding a local context and logging in once:
```bash
npx @lobu/cli@latest context add local --api-url http://localhost:3000/api/lobu
npx @lobu/cli@latest login --admin-password -c local
npx @lobu/cli@latest chat "Hello" -c local
```
What’s happening under the hood
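From your own application code, the same endpoint can be called with `fetch`. A minimal sketch mirroring the curl example above — the `gatewayMessage` helper name is ours, and the path and payload shape are taken directly from that example; adjust to your deployment:

```typescript
// Build a Request for the gateway's message endpoint (path and body shape
// copied from the curl example above; this helper is illustrative, not part of Lobu).
function gatewayMessage(
  baseUrl: string,
  agentId: string,
  token: string,
  content: string
): Request {
  return new Request(`${baseUrl}/api/v1/agents/${agentId}/messages`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ platform: "api", content }),
  });
}

// Usage:
// await fetch(gatewayMessage("http://localhost:3000/api/lobu", "assistant", token, "Hello"));
```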
When a user sends a message:
- Lobu creates an OpenClaw worker for that user session — an isolated process with its own workspace directory, virtual filesystem, and bash session
- OpenClaw runs the agent — the full runtime with `read`, `write`, `edit`, `bash`, `grep`, `find`, `ls`, and MCP tools
- Network requests go through the gateway proxy — the worker can only reach domains you’ve allowed (`github.com` in our example)
- MCP tools are proxied — the worker calls GitHub’s MCP tools through the gateway. Your API keys stay in the gateway process; the worker never sees them
- The workspace persists — files the agent creates survive across messages
All at ~50MB memory per user session. We’ve tested 300 concurrent instances on a single machine.
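One plausible reading of the allowlist semantics from Step 2 is that a bare entry like `github.com` matches only that exact host, while a leading-dot entry like `.github.com` also matches its subdomains. A sketch of that interpretation — this is our assumption for illustration, not Lobu's actual matching code:

```typescript
// Hypothetical domain-allowlist matcher: bare entries match exactly,
// leading-dot entries match the apex domain and any subdomain.
function hostAllowed(host: string, allowed: string[]): boolean {
  return allowed.some((entry) =>
    entry.startsWith(".")
      ? host === entry.slice(1) || host.endsWith(entry)
      : host === entry
  );
}
```

Note that `endsWith(".github.com")` will not match `evilgithub.com`, because the dot must be present — a common pitfall in hand-rolled allowlists.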
Adding platform connections
The same agent can also serve Slack, Telegram, or WhatsApp — just add connections:
```typescript
const lobu = new Lobu({
  redis: process.env.REDIS_URL!,
  agents: [
    {
      id: "assistant",
      providers: [{ id: "openrouter", key: process.env.OPENROUTER_API_KEY! }],
      connections: [
        {
          type: "slack",
          config: {
            botToken: process.env.SLACK_BOT_TOKEN!,
            signingSecret: process.env.SLACK_SIGNING_SECRET!,
          },
        },
      ],
    },
  ],
});
```
Now every Slack channel and DM gets its own isolated agent — from the same deployment.
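The `signingSecret` is what Slack uses to sign every webhook it sends; presumably the gateway verifies this for you. For reference, Slack's v0 scheme signs the string `v0:{timestamp}:{raw body}` with HMAC-SHA256 and sends the result in the `X-Slack-Signature` header — a self-contained sketch of that check:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verify a Slack v0 request signature: HMAC-SHA256 of "v0:<timestamp>:<raw body>"
// keyed by the signing secret, compared in constant time against the header value.
function verifySlackSignature(
  signingSecret: string,
  timestamp: string,
  rawBody: string,
  signature: string
): boolean {
  const expected =
    "v0=" +
    createHmac("sha256", signingSecret)
      .update(`v0:${timestamp}:${rawBody}`)
      .digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signature);
  return a.length === b.length && timingSafeEqual(a, b);
}
```

In production you would also reject stale timestamps to prevent replay; Slack recommends a five-minute window.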
Adding evaluations
Before shipping, verify your agent works:
Create `agents/assistant/evals/basic.yaml`:
```yaml
name: basic
description: Agent responds helpfully
trials: 3
turns:
  - content: "Hello, what can you help me with?"
    assert:
      - type: llm-rubric
        value: "Response is friendly and describes available capabilities"
```
Run it:
```bash
npx @lobu/cli@latest eval --gateway http://localhost:3000/api/lobu
```
What you get for free
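Each eval runs `trials` times (3 in the YAML above), so results aggregate into a pass rate rather than a single verdict. A sketch of how you might gate a CI pipeline on that — the `TrialResult` shape and threshold logic here are hypothetical, not Lobu's actual reporter:

```typescript
// Hypothetical aggregation of per-trial eval outcomes into a CI gate.
interface TrialResult {
  passed: boolean;
}

function passRate(trials: TrialResult[]): number {
  if (trials.length === 0) return 0;
  return trials.filter((t) => t.passed).length / trials.length;
}

// Require at least `threshold` of trials to pass before shipping.
function meetsGate(trials: TrialResult[], threshold = 1.0): boolean {
  return passRate(trials) >= threshold;
}
```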
By embedding Lobu, your app inherits:
- Per-user sandboxing — isolated workspaces, no cross-user leakage
- MCP proxy — credential isolation, OAuth token refresh, audit trail
- Network policy — domain-filtered egress per agent
- Scale-to-zero — idle workers shut down automatically
- 16+ LLM providers — swap models without code changes
- Built-in evals — automated quality testing across models
- OpenAPI docs — full API reference at `/api/lobu/api/docs`
No Docker, no Kubernetes, no infrastructure to manage. Just install the package and mount it.
Next steps
- Embed in Your App — framework-specific examples for Express, Hono, Fastify, Bun
- Agent Workspace — customize prompt files and understand where skills, evals, and config live
- Skills — add built-in or local skills and custom MCP servers
- Evaluations — write eval suites for quality gates
- Comparison — how Lobu compares to DeepAgents Deploy and Claude Managed Agents