# Proxy (Zero Code Changes)
`@neurameter/proxy` is an OpenAI-compatible proxy server. Point your app at it; no code changes are needed.
## Installation

```bash
npm install -g @neurameter/proxy
```

## Usage

```bash
neurameter-proxy \
  --api-key nm_xxx \
  --project proj_xxx \
  --port 3100
```

Then set the base URL for your app:

```bash
OPENAI_BASE_URL=http://localhost:3100/v1 node app.js
```

## Options
| Flag | Env Var | Default | Description |
|---|---|---|---|
| `--api-key` | `NEURAMETER_API_KEY` | — | NeuraMeter API key (required) |
| `--project` | `NEURAMETER_PROJECT_ID` | — | Project ID (required) |
| `--port` | `NEURAMETER_PROXY_PORT` | `3100` | Listen port |
| `--target` | `NEURAMETER_PROXY_TARGET` | `https://api.openai.com` | Upstream API URL |
| `--agent-name` | — | `proxy-agent` | Agent name for events |
| `--endpoint` | `NEURAMETER_ENDPOINT` | `https://meter.neuria.tech` | Ingestion API URL |
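Flags that have an environment variable in the table above can be set that way instead, which is convenient for containerized deployments. For example (the key and project values are placeholders):

```shell
export NEURAMETER_API_KEY=nm_xxx
export NEURAMETER_PROJECT_ID=proj_xxx
export NEURAMETER_PROXY_PORT=3100

neurameter-proxy
```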
## How It Works

1. Proxy receives `POST /v1/chat/completions` from your app
2. Forwards the request to the upstream OpenAI API (or compatible)
3. For streaming, auto-injects `stream_options: { include_usage: true }`
4. Extracts token usage from the response
5. Calculates cost using built-in pricing tables
6. Sends a tracking event to NeuraMeter (async, never blocks the response)
7. Returns the upstream response unchanged
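Steps 3–5 above can be sketched as two pure functions. This is a minimal sketch, not the proxy's actual code, and the per-token prices below are placeholders rather than NeuraMeter's real pricing tables:

```javascript
// Step 3: for streaming requests, ensure the upstream reports token usage in
// the final SSE chunk by injecting stream_options.include_usage.
function injectUsageTracking(body) {
  if (!body.stream) return body; // non-streaming: pass through unchanged
  return {
    ...body,
    stream_options: { ...(body.stream_options ?? {}), include_usage: true },
  };
}

// Steps 4-5: turn the response's `usage` object into a dollar cost.
// Placeholder per-million-token prices (USD).
const PRICING = {
  "gpt-4o": { input: 2.5, cachedInput: 1.25, output: 10 },
  "gpt-4o-mini": { input: 0.15, cachedInput: 0.075, output: 0.6 },
};

function calculateCost(model, usage) {
  const prices = PRICING[model];
  if (!prices) return null; // unknown model: no cost attributed
  const cached = usage.prompt_tokens_details?.cached_tokens ?? 0;
  const uncached = usage.prompt_tokens - cached;
  return (
    (uncached * prices.input +
      cached * prices.cachedInput +
      usage.completion_tokens * prices.output) / 1e6
  );
}

// 1,000 prompt tokens (400 of them cached) plus 500 completion tokens:
console.log(
  calculateCost("gpt-4o", {
    prompt_tokens: 1000,
    completion_tokens: 500,
    prompt_tokens_details: { cached_tokens: 400 },
  }),
); // 0.007
```

Keeping both transforms pure makes the forwarding path easy to test without a live upstream.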
## Supported Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/v1/chat/completions` | POST | Chat completions (tracked) |
| `/v1/models` | GET | Model list (passthrough) |
| `/health` | GET | Health check |
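The table above amounts to a three-way dispatch on method and path; a sketch (the category names are illustrative, not part of the proxy's API):

```javascript
// Map (method, path) to how the proxy treats the request, per the table above.
function classifyRequest(method, path) {
  if (method === "POST" && path === "/v1/chat/completions") return "tracked";
  if (method === "GET" && path === "/v1/models") return "passthrough";
  if (method === "GET" && path === "/health") return "health";
  return "unsupported";
}

console.log(classifyRequest("POST", "/v1/chat/completions")); // → "tracked"
```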
## When to Use Proxy vs SDK

| | SDK (`withNeuraMeter`) | Proxy |
|---|---|---|
| Code changes | 2 lines | None |
| Streaming support | Yes | Yes |
| Cached token tracking | Yes | Yes |
| Custom agent names | Per-wrapper | Per-proxy instance |
| Multiple providers | OpenAI only | OpenAI-compatible only |
| Latency overhead | <1 ms | Network hop to proxy |
**Recommendation:** Use `withNeuraMeter()` when possible. Use the proxy when you can't modify application code or when the app doesn't support `OPENAI_BASE_URL`.