OpenClaw
Configure OpenClaw to use naxxen as a proxy for all LLM providers.
OpenClaw supports custom provider base URLs. You can create naxxen-proxied providers alongside your direct ones.
Configuration
Edit your OpenClaw config file (usually ~/.openclaw/openclaw.json). Add a provider for each LLM backend you want to proxy through naxxen.
The easiest method is the URL path prefix: embed your naxxen key in the base URL, and OpenClaw sends the provider API key in headers as usual.
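The two keys travel in different places. A small sketch of the request OpenClaw ends up building (key values are placeholders):

```python
# Sketch of the URL-path-prefix method: the naxxen key is embedded in
# the base URL, while the provider key goes in the usual auth header.
NAXXEN_KEY = "nxn-sk-YOUR_NAXXEN_KEY"
OPENAI_KEY = "sk-proj-YOUR_OPENAI_KEY"

# The naxxen key lives in the URL path...
base_url = f"https://api.naxxen.ai/{NAXXEN_KEY}"
# ...while the provider key rides in the standard Authorization header.
headers = {"Authorization": f"Bearer {OPENAI_KEY}"}

# OpenClaw appends the provider's normal endpoint path:
endpoint = f"{base_url}/v1/chat/completions"
print(endpoint)
# https://api.naxxen.ai/nxn-sk-YOUR_NAXXEN_KEY/v1/chat/completions
```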
OpenAI via naxxen
```json
{
  "name": "naxxen-openai",
  "baseUrl": "https://api.naxxen.ai/nxn-sk-YOUR_NAXXEN_KEY",
  "apiKey": "sk-proj-YOUR_OPENAI_KEY",
  "api": "openai-completions",
  "models": [
    { "id": "gpt-4o", "name": "GPT-4o (naxxen)" },
    { "id": "gpt-4o-mini", "name": "GPT-4o Mini (naxxen)" },
    { "id": "gpt-5.4", "name": "GPT-5.4 (naxxen)" }
  ]
}
```
Anthropic via naxxen
```json
{
  "name": "naxxen-anthropic",
  "baseUrl": "https://api.naxxen.ai/nxn-sk-YOUR_NAXXEN_KEY",
  "apiKey": "sk-ant-YOUR_ANTHROPIC_KEY",
  "api": "anthropic-messages",
  "models": [
    { "id": "claude-sonnet-4-6", "name": "Claude Sonnet 4.6 (naxxen)" },
    { "id": "claude-opus-4-6", "name": "Claude Opus 4.6 (naxxen)" },
    { "id": "claude-haiku-4-5-20251001", "name": "Claude Haiku 4.5 (naxxen)" }
  ]
}
```
Google via naxxen
```json
{
  "name": "naxxen-google",
  "baseUrl": "https://api.naxxen.ai/nxn-sk-YOUR_NAXXEN_KEY",
  "apiKey": "YOUR_GOOGLE_API_KEY",
  "api": "google-generative-ai",
  "models": [
    { "id": "gemini-2.5-flash", "name": "Gemini 2.5 Flash (naxxen)" },
    { "id": "gemini-2.5-pro", "name": "Gemini 2.5 Pro (naxxen)" }
  ]
}
```
How it works
- OpenClaw sends requests to https://api.naxxen.ai/nxn-sk-YOUR_KEY/v1/chat/completions (or /v1/messages, etc.)
- naxxen strips the key from the path, compresses the prompt, and forwards the request to the real provider
- The response streams back through naxxen to OpenClaw
Your provider API key (the apiKey field) is sent in that provider's standard authentication header and forwarded to the upstream provider unchanged.
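The key-stripping step above can be illustrated with a toy version of the path handling (a sketch of the idea, not naxxen's actual implementation):

```python
# Illustrative only: split a proxied path like
# '/nxn-sk-KEY/v1/chat/completions' into the naxxen key and the
# provider path that gets forwarded upstream.
def split_proxied_path(path: str) -> tuple[str, str]:
    """Return (naxxen_key, provider_path) from a proxied request path."""
    _, key, provider_path = path.split("/", 2)
    return key, "/" + provider_path

key, provider_path = split_proxied_path("/nxn-sk-YOUR_KEY/v1/chat/completions")
print(key)            # nxn-sk-YOUR_KEY
print(provider_path)  # /v1/chat/completions
```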
Verify
After configuring, make a request using one of your naxxen providers. Then check your dashboard — you should see the request with compression stats.
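You can also smoke-test the proxy independently of OpenClaw by building the same request by hand. This sketch uses Python's standard library; the keys and model are placeholders, and the send is commented out so you can inspect the request first:

```python
# Build a chat-completions request against the naxxen-proxied base URL.
import json
import urllib.request

req = urllib.request.Request(
    "https://api.naxxen.ai/nxn-sk-YOUR_KEY/v1/chat/completions",
    data=json.dumps({
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "ping"}],
    }).encode(),
    headers={
        "Authorization": "Bearer sk-proj-YOUR_OPENAI_KEY",
        "Content-Type": "application/json",
    },
)
print(req.full_url)
# Uncomment to actually send the request:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```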
Set up with an AI agent
Copy this prompt and give it to your AI coding agent:
Read https://docs.naxxen.ai/llms.txt and configure naxxen as a proxy
for my OpenAI, Anthropic, and Google API calls in OpenClaw.
My naxxen key is nxn-sk-YOUR_KEY.
Add naxxen-openai, naxxen-anthropic, and naxxen-google providers
to my ~/.openclaw/openclaw.json config.
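If you would rather script the config change than hand it to an agent, a minimal sketch follows. It assumes providers live in a top-level "providers" array, which may differ from your actual openclaw.json layout:

```python
# Hypothetical helper: insert or replace a provider entry by name in an
# OpenClaw-style config dict. The "providers" key is an assumption.
def add_provider(config: dict, provider: dict) -> dict:
    providers = config.setdefault("providers", [])
    # Drop any existing entry with the same name, then append the new one.
    providers[:] = [p for p in providers if p.get("name") != provider["name"]]
    providers.append(provider)
    return config

config = {}
add_provider(config, {"name": "naxxen-openai", "api": "openai-completions"})
add_provider(config, {"name": "naxxen-anthropic", "api": "anthropic-messages"})
print([p["name"] for p in config["providers"]])
# ['naxxen-openai', 'naxxen-anthropic']
```

Load your real config with json.load, apply the helper, and write it back.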