
Using prxy.monster with OpenClaw

OpenClaw is a long-running personal agent stack: channels, skills, tools, memory, and multiple agents behind one local gateway. That makes it a natural fit for prxy.monster. Put PRXY in front of OpenClaw once, and every OpenClaw agent gets the same caching, budget controls, pattern memory, and module pipeline.

Setup time: 2 minutes.

Configure

OpenClaw supports custom model providers through `models.providers`. Add a `prxy` provider in `~/.openclaw/openclaw.json`:

```json5
{
  env: {
    PRXY_API_KEY: "prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx",
  },
  agents: {
    defaults: {
      model: { primary: "prxy/claude-sonnet-4-6" },
    },
  },
  models: {
    mode: "merge",
    providers: {
      prxy: {
        baseUrl: "https://api.prxy.monster/v1",
        apiKey: "${PRXY_API_KEY}",
        api: "openai-completions",
        models: [
          {
            id: "claude-sonnet-4-6",
            name: "Claude Sonnet via PRXY",
            input: ["text"],
            reasoning: true,
            contextWindow: 200000,
            maxTokens: 8192,
          },
          {
            id: "gpt-4o",
            name: "GPT-4o via PRXY",
            input: ["text", "image"],
            reasoning: false,
            contextWindow: 128000,
            maxTokens: 4096,
          },
        ],
      },
    },
  },
}
```

Then restart OpenClaw or let its gateway hot reload pick up the config.

OpenClaw config is JSON5, so comments and trailing commas are fine, but validation is strict: if OpenClaw refuses to start, run `openclaw doctor` and it will point at the exact field that failed.

Existing config

If you already have channels, skills, or agents configured, do not replace the whole file. Merge only these pieces:

```json5
{
  env: {
    PRXY_API_KEY: "prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx",
  },
  models: {
    mode: "merge",
    providers: {
      prxy: {
        baseUrl: "https://api.prxy.monster/v1",
        apiKey: "${PRXY_API_KEY}",
        api: "openai-completions",
        models: [{ id: "claude-sonnet-4-6", name: "Claude Sonnet via PRXY" }],
      },
    },
  },
}
```

Then set whichever agent should use PRXY:

```json5
{
  agents: {
    defaults: {
      model: { primary: "prxy/claude-sonnet-4-6" },
    },
  },
}
```
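To make "merge only these pieces" concrete, here is an illustrative Python sketch of a recursive merge: nested objects from the fragment are folded into the existing config so channels, skills, and other providers survive. This is an illustration only, not OpenClaw's actual merge code, and the `existing` config shown is a made-up example.

```python
def deep_merge(base: dict, patch: dict) -> dict:
    """Recursively merge `patch` into a copy of `base` (illustration only)."""
    merged = dict(base)
    for key, value in patch.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged

# Hypothetical pre-existing config with a channel and another provider.
existing = {
    "channels": {"whatsapp": {"enabled": True}},
    "models": {"mode": "merge", "providers": {"openai": {"apiKey": "sk-..."}}},
}

# The PRXY fragment from above, as a plain dict.
prxy_fragment = {
    "env": {"PRXY_API_KEY": "prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx"},
    "models": {
        "mode": "merge",
        "providers": {
            "prxy": {
                "baseUrl": "https://api.prxy.monster/v1",
                "apiKey": "${PRXY_API_KEY}",
                "api": "openai-completions",
                "models": [{"id": "claude-sonnet-4-6", "name": "Claude Sonnet via PRXY"}],
            }
        },
    },
}

merged = deep_merge(existing, prxy_fragment)
# The whatsapp channel and openai provider are untouched; prxy sits alongside them.
```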

Verify

```shell
curl https://api.prxy.monster/health
openclaw doctor
openclaw models list --provider prxy
openclaw models set prxy/claude-sonnet-4-6
```

Send any OpenClaw message. If the agent responds, its model call is flowing through prxy.monster.
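You can also check the wire format independently of OpenClaw. A minimal Python sketch using only the standard library, assuming the endpoint follows the OpenAI chat-completions convention the doc describes and that `PRXY_API_KEY` is set in your environment (the function builds the request without sending it):

```python
import json
import os
import urllib.request

def build_request(base_url: str, api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request aimed at PRXY."""
    payload = {
        # Bare model id here; the "prxy/" prefix is only OpenClaw's provider ref.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_request(
    "https://api.prxy.monster/v1",
    os.environ.get("PRXY_API_KEY", "prxy_live_..."),
    "claude-sonnet-4-6",
    "ping",
)
# To actually send it: urllib.request.urlopen(req)
```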

For always-on OpenClaw agents:

```shell
PRXY_PIPE=mcp-optimizer,semantic-cache,patterns,ipc,guardrails,cost-guard
```

Set this on the PRXY API key in cloud mode. In local mode, set it in the prxy.monster local container environment.
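In local mode, the pipeline variable just needs to reach the container. A hypothetical docker-compose fragment, where the service name, image name, and the container listening on port 3099 are illustrative assumptions rather than the actual prxy.monster distribution:

```yaml
services:
  prxy:
    image: prxy-monster-local   # illustrative image name, not the real artifact
    ports:
      - "3099:3099"             # matches the local baseUrl below
    environment:
      PRXY_API_KEY: "prxy_live_xxxxxxxxxxxxxxxxxxxxxxxx"
      PRXY_PIPE: "mcp-optimizer,semantic-cache,patterns,ipc,guardrails,cost-guard"
```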

What OpenClaw users get

| OpenClaw workload | PRXY module |
| --- | --- |
| Lots of tools, skills, and provider schemas | `mcp-optimizer` trims irrelevant tool definitions before the model call. |
| Repeated personal workflows across channels | `patterns` learns useful fixes and reinjects them later. |
| Same questions from WhatsApp, Slack, terminal, or web chat | `semantic-cache` can return repeated work without another provider call. |
| Long-running agents that accumulate context | `ipc` compresses older turns instead of dropping the session. |
| Personal data in prompts and tool outputs | `guardrails` redacts or blocks sensitive content before it leaves the gateway. |
| Runaway loops and expensive agent runs | `cost-guard` enforces hard spend caps. |

Cloud or local

Cloud:

```json5
baseUrl: "https://api.prxy.monster/v1"
```

Local:

```json5
baseUrl: "http://localhost:3099/v1"
```

Use local mode when you want the OpenClaw host and PRXY gateway on the same machine. Use cloud mode when you want one PRXY key and module pipeline shared across multiple OpenClaw hosts.
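If you switch between the two, one option is to keep a single config and resolve the base URL from an environment variable. This assumes the `${VAR}` interpolation shown above for `apiKey` also applies to `baseUrl`, which is an assumption about OpenClaw, not something this page confirms:

```json5
{
  env: {
    // Point at http://localhost:3099/v1 for local mode instead.
    PRXY_BASE_URL: "https://api.prxy.monster/v1",
  },
  models: {
    mode: "merge",
    providers: {
      prxy: {
        baseUrl: "${PRXY_BASE_URL}", // assumes env interpolation works here too
        apiKey: "${PRXY_API_KEY}",
        api: "openai-completions",
      },
    },
  },
}
```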

Notes

  • OpenClaw model refs become `prxy/<model-id>` because the provider id is `prxy`.
  • PRXY speaks the OpenAI-compatible `/v1/chat/completions` shape here, so use `api: "openai-completions"`.
  • Native OpenClaw provider-specific behavior for direct OpenAI/OpenRouter routes does not apply to a custom PRXY provider. PRXY handles routing, caching, guardrails, and modules after the request reaches the gateway.
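The `prxy/<model-id>` convention can be pictured as a simple split: everything before the first slash selects the provider block in `models.providers`, and the remainder is the model id that goes in the request body. A sketch of the naming convention, not OpenClaw's actual parser:

```python
def split_model_ref(ref: str) -> tuple[str, str]:
    """Split an OpenClaw-style model ref into (provider_id, model_id)."""
    provider, _, model_id = ref.partition("/")
    return provider, model_id

provider, model_id = split_model_ref("prxy/claude-sonnet-4-6")
# provider selects models.providers.prxy; model_id is sent to PRXY.
```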

Full example

Copy-paste config and setup notes: github.com/Ekkos-Technologies-Inc/prxy-monster-examples/tree/main/examples/openclaw-setup 

OpenClaw’s config surface moves quickly. The stable integration point is the custom provider pattern: `models.providers.prxy.baseUrl`, `apiKey`, `api: "openai-completions"`, and a `prxy/<model-id>` model ref.
