# Quick start — Cloud
Up and running in under 90 seconds. The only things you change in your app are two env vars.
## Subscribe
Choose Pro or Team at prxy.monster. Checkout is handled by Stripe.
## Get your API key
When checkout succeeds, prxy.monster provisions your first key and emails it to the checkout address. Keys look like:
```
prxy_live_a1b2c3d4e5f6...
```

Keep this secret. Treat it like a provider API key. Revoke and rotate leaked keys through the key API.
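Because keys share the `prxy_live_` prefix shown above, they are easy to sanity-check and redact before they reach logs or error reports. A minimal sketch; the helper names and the exact character-class are ours, inferred from the sample key, not part of any official SDK:

```typescript
// Sanity-check and redact prxy.monster keys. The prefix and rough shape are
// inferred from the sample key above; the real validation rules may differ.
const KEY_PATTERN = /^prxy_live_[a-z0-9]{12,}$/i;

function looksLikePrxyKey(key: string): boolean {
  return KEY_PATTERN.test(key);
}

function redactKey(key: string): string {
  // Keep the prefix and the last 4 chars: identifiable in logs, not usable.
  if (key.length <= 14) return "prxy_***";
  return `${key.slice(0, 10)}...${key.slice(-4)}`;
}

console.log(looksLikePrxyKey("prxy_live_a1b2c3d4e5f6")); // true
console.log(redactKey("prxy_live_a1b2c3d4e5f6a7b8")); // prxy_live_...a7b8
```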
## Point your app at the gateway
Two env vars. That’s the whole integration.
### Anthropic SDK
```shell
export ANTHROPIC_BASE_URL=https://api.prxy.monster
export ANTHROPIC_API_KEY=prxy_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
```

```typescript
import Anthropic from '@anthropic-ai/sdk';

// SDK reads ANTHROPIC_BASE_URL + ANTHROPIC_API_KEY from env.
const client = new Anthropic();

const msg = await client.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 256,
  messages: [{ role: 'user', content: 'hi' }],
});
```

## Verify
```shell
curl https://api.prxy.monster/v1/pipeline \
  -H "Authorization: Bearer $ANTHROPIC_API_KEY"
```

Returns the active module pipeline for your key. Default:
```json
{
  "configured": [],
  "active": [
    { "name": "mcp-optimizer", "version": "1.0.0" },
    { "name": "semantic-cache", "version": "1.0.0" },
    { "name": "patterns", "version": "1.0.0" }
  ],
  "override": null
}
```

## What just happened?
Every request through api.prxy.monster ran through the default pipeline:
| Module | What it did |
|---|---|
| `mcp-optimizer` | Embedded each MCP tool’s description. Kept only the ones relevant to your prompt. |
| `semantic-cache` | Embedded the request, looked for similar past requests. Skipped the provider call on a hit. |
| `patterns` | Scanned the response for “the fix is X” type insights and saved them. |
You can replace any of those, add more, or strip them down to nothing.
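If you script against `/v1/pipeline`, the sample response above is enough to type against. A small sketch that lists what a key is actually running; the types are inferred from that sample (there is no published schema we know of), and we assume a non-null `override` replaces the `active` list:

```typescript
// Types inferred from the sample /v1/pipeline response; not a published schema.
interface PipelineModule {
  name: string;
  version: string;
}

interface PipelineResponse {
  configured: PipelineModule[];
  active: PipelineModule[];
  override: PipelineModule[] | null;
}

function activeModules(p: PipelineResponse): string[] {
  // Assumption: an override, when set, replaces the active list for the key.
  const modules = p.override ?? p.active;
  return modules.map((m) => `${m.name}@${m.version}`);
}

// The default pipeline from the Verify step above.
const sample: PipelineResponse = {
  configured: [],
  active: [
    { name: "mcp-optimizer", version: "1.0.0" },
    { name: "semantic-cache", version: "1.0.0" },
    { name: "patterns", version: "1.0.0" },
  ],
  override: null,
};

console.log(activeModules(sample));
// ['mcp-optimizer@1.0.0', 'semantic-cache@1.0.0', 'patterns@1.0.0']
```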
## Next steps
- Customize your pipeline — pick the modules you want.
- Browse all modules — what each one does and how to configure it.
- Recipes — pre-built pipelines for coding, support, research, etc.
**Bring Your Own Key (BYOK)** — your provider key (Anthropic / OpenAI / etc.) lives in your app, not on our servers. Tokens are billed directly by the provider. We charge for the gateway, never for tokens. See pricing.
## Provider options
prxy.monster speaks five upstream providers out of the box:
- Anthropic — `claude-*` models
- OpenAI — `gpt-*`, `o1`, `o3`, `o4` models
- Google — `gemini-*` models
- Groq — `llama-*`, `mixtral-*`, `groq/*` models
- AWS Bedrock — `bedrock/<model-id>` (Claude, Llama, Titan, Mistral, Cohere) — see Bedrock provider
The `model` field on the request is the only routing signal; there is no per-request provider configuration.
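Since routing keys off the model string alone, you can tell which upstream a request will hit from the model name. A sketch of that mapping, based only on the prefixes listed above; the gateway's real matcher may be more elaborate:

```typescript
type Provider = "anthropic" | "openai" | "google" | "groq" | "bedrock" | "unknown";

// Route purely on the model string, mirroring the provider list above.
// This is an illustrative sketch, not prxy.monster's actual matcher.
function providerFor(model: string): Provider {
  if (model.startsWith("bedrock/")) return "bedrock";
  if (model.startsWith("claude-")) return "anthropic";
  if (model.startsWith("gpt-") || /^o[134]/.test(model)) return "openai";
  if (model.startsWith("gemini-")) return "google";
  if (
    model.startsWith("llama-") ||
    model.startsWith("mixtral-") ||
    model.startsWith("groq/")
  ) {
    return "groq";
  }
  return "unknown";
}

console.log(providerFor("claude-sonnet-4-6")); // anthropic
console.log(providerFor("bedrock/meta.llama3-70b")); // bedrock
```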