Cursor

Configure Cursor to route LLM requests through naxxen.

Cursor supports custom OpenAI-compatible endpoints. You can route requests through naxxen by changing the base URL in Cursor's settings.

Configuration

  1. Open Cursor Settings (Cmd+, / Ctrl+,)
  2. Go to Models → OpenAI API Key
  3. Set the Base URL to your naxxen endpoint:
https://api.naxxen.ai/nxn-sk-YOUR_NAXXEN_KEY
  4. Enter your provider API key (e.g., OpenAI sk-proj-...) in the API key field
  5. Select your model (e.g., gpt-4o)

Cursor will send requests to naxxen, which compresses the prompt and forwards to OpenAI.
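Under the hood, the configuration above amounts to sending an OpenAI-style request to the naxxen base URL. A minimal sketch using Python's standard library, assuming naxxen accepts the usual /v1/chat/completions path that OpenAI-compatible clients append (the key values are placeholders, not real credentials):

```python
import json
import urllib.request

NAXXEN_KEY = "nxn-sk-YOUR_NAXXEN_KEY"     # placeholder naxxen key
PROVIDER_KEY = "sk-proj-YOUR_OPENAI_KEY"  # placeholder provider key

# The naxxen key is embedded in the base URL, as in the Cursor settings above.
base_url = f"https://api.naxxen.ai/{NAXXEN_KEY}"

payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
}

# The provider key travels in the Authorization header, where Cursor puts it.
req = urllib.request.Request(
    f"{base_url}/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {PROVIDER_KEY}",
        "Content-Type": "application/json",
    },
)

# urllib.request.urlopen(req) would send the request; naxxen compresses the
# prompt and forwards it to OpenAI.
```

Cursor builds an equivalent request itself; this sketch only illustrates where the two keys end up.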

For Anthropic models in Cursor

If Cursor supports Anthropic models natively, you can set a custom Anthropic base URL the same way:

https://api.naxxen.ai/nxn-sk-YOUR_NAXXEN_KEY

Enter your Anthropic API key in the key field. naxxen auto-detects the provider from the request format.
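naxxen's exact detection logic isn't documented here, but one observable difference between the two request formats is the path each SDK calls: OpenAI-compatible clients use /v1/chat/completions, while Anthropic clients use /v1/messages. A purely illustrative sketch of path-based detection (the function name and logic are assumptions, not naxxen's implementation):

```python
def detect_provider(path: str) -> str:
    """Guess the upstream provider from the request path.

    Hypothetical sketch: OpenAI-style clients call /v1/chat/completions,
    Anthropic-style clients call /v1/messages.
    """
    if path.endswith("/v1/messages"):
        return "anthropic"
    if path.endswith("/v1/chat/completions"):
        return "openai"
    return "unknown"
```

Either way, no extra configuration is needed on your side: the client you point at naxxen determines the format.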

Alternative: Composite Bearer

If Cursor only lets you set one API key field (no separate base URL), use the composite method:

  • API Key: nxn-sk-YOUR_NAXXEN_KEY:sk-YOUR_PROVIDER_KEY
  • Base URL: https://api.naxxen.ai

naxxen splits the key at the colon: the left side is your naxxen key, and the right side is forwarded to the provider.
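The split described above can be sketched in a few lines. This assumes the key is split at the first colon, which keeps provider keys (which normally contain no colon) intact:

```python
def split_composite(bearer: str) -> tuple[str, str]:
    """Split a composite bearer into (naxxen_key, provider_key).

    Assumption: the split happens at the first ":" only, so anything
    after it is passed through to the provider unchanged.
    """
    naxxen_key, _, provider_key = bearer.partition(":")
    return naxxen_key, provider_key
```

For example, splitting "nxn-sk-abc:sk-proj-123" yields the naxxen key "nxn-sk-abc" and the provider key "sk-proj-123".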

Verify

After configuring, use Cursor to generate some code. Then check your dashboard to see the request with compression stats.

Set up with an AI agent

Copy this prompt and give it to your AI coding agent:

Read https://docs.naxxen.ai/llms.txt and configure Cursor to use
naxxen for prompt compression. My naxxen key is nxn-sk-YOUR_KEY.
Update Cursor's model settings to use the naxxen base URL.