
Proxy (Zero Code Changes)

@neurameter/proxy is an OpenAI-compatible proxy server. Point your app at it — no code changes needed.

Installation

npm install -g @neurameter/proxy

Usage

neurameter-proxy \
  --api-key nm_xxx \
  --project proj_xxx \
  --port 3100

Then set the base URL for your app:

OPENAI_BASE_URL=http://localhost:3100/v1 node app.js
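Because the app only sees a different base URL, the same chat-completions path works against the proxy or the real API. A minimal sketch of that resolution (`buildChatUrl` is a hypothetical helper for illustration, not part of `@neurameter/proxy`):

```typescript
// Hypothetical helper showing how an app ends up talking to the proxy:
// the only thing that changes is the base URL, never the request code.
function buildChatUrl(env: Record<string, string | undefined>): string {
  // Fall back to the real OpenAI API when no proxy is configured.
  const base = env.OPENAI_BASE_URL ?? "https://api.openai.com/v1";
  return `${base.replace(/\/$/, "")}/chat/completions`;
}

console.log(buildChatUrl({ OPENAI_BASE_URL: "http://localhost:3100/v1" }));
// -> http://localhost:3100/v1/chat/completions
```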

Options

| Flag | Env Var | Default | Description |
| --- | --- | --- | --- |
| `--api-key` | `NEURAMETER_API_KEY` | | NeuraMeter API key (required) |
| `--project` | `NEURAMETER_PROJECT_ID` | | Project ID (required) |
| `--port` | `NEURAMETER_PROXY_PORT` | `3100` | Listen port |
| `--target` | `NEURAMETER_PROXY_TARGET` | `https://api.openai.com` | Upstream API URL |
| `--agent-name` | | `proxy-agent` | Agent name for events |
| `--endpoint` | `NEURAMETER_ENDPOINT` | `https://meter.neuria.tech` | Ingestion API URL |
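Each option can come from a flag, an environment variable, or a default. The sketch below assumes the usual CLI precedence (flag over env var over default); that ordering is a convention, not something verified against the proxy's source:

```typescript
// Assumed precedence: flag > env var > built-in default.
// This mirrors common CLI convention; the proxy's actual order may differ.
function resolveOption(
  flagValue: string | undefined,
  envValue: string | undefined,
  defaultValue?: string,
): string | undefined {
  return flagValue ?? envValue ?? defaultValue;
}

// No --port flag given, NEURAMETER_PROXY_PORT=4000 set: env var wins.
const port = resolveOption(undefined, "4000", "3100");
```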

How It Works

  1. Proxy receives POST /v1/chat/completions from your app
  2. Forwards the request to the upstream OpenAI API (or compatible)
  3. For streaming, auto-injects stream_options: { include_usage: true }
  4. Extracts token usage from the response
  5. Calculates cost using built-in pricing tables
  6. Sends a tracking event to NeuraMeter (async, never blocks the response)
  7. Returns the upstream response unchanged
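Steps 3–5 can be sketched as two small functions: one that injects `stream_options` into streaming request bodies, and one that prices the extracted usage. The pricing numbers below are illustrative placeholders, not NeuraMeter's actual tables:

```typescript
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
}

// Step 3: on streaming requests, ask the upstream to append a usage chunk.
function prepareBody(body: Record<string, unknown>): Record<string, unknown> {
  if (body.stream === true) {
    return { ...body, stream_options: { include_usage: true } };
  }
  return body;
}

// Steps 4-5: price the usage. Per-1M-token USD rates below are made up
// for illustration; the proxy ships its own pricing tables.
const PRICING: Record<string, { input: number; output: number }> = {
  "gpt-4o-mini": { input: 0.15, output: 0.6 },
};

function costUsd(model: string, usage: Usage): number {
  const p = PRICING[model];
  if (!p) return 0; // unknown model: report zero rather than guess
  return (
    (usage.prompt_tokens * p.input + usage.completion_tokens * p.output) /
    1_000_000
  );
}
```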

Supported Endpoints

| Endpoint | Method | Description |
| --- | --- | --- |
| `/v1/chat/completions` | POST | Chat completions (tracked) |
| `/v1/models` | GET | Model list (passthrough) |
| `/health` | GET | Health check |
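The routing table above amounts to a small dispatch on method and path. A sketch, with the assumption (not confirmed by the source) that unknown routes are rejected rather than proxied:

```typescript
type Route = "tracked" | "passthrough" | "health" | "not_found";

// Dispatch mirroring the supported-endpoints table.
function route(method: string, path: string): Route {
  if (method === "POST" && path === "/v1/chat/completions") return "tracked";
  if (method === "GET" && path === "/v1/models") return "passthrough";
  if (method === "GET" && path === "/health") return "health";
  return "not_found"; // assumption: anything else is rejected, not forwarded
}
```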

When to Use Proxy vs SDK

| | SDK (`withNeuraMeter`) | Proxy |
| --- | --- | --- |
| Code changes | 2 lines | None |
| Streaming support | Yes | Yes |
| Cached token tracking | Yes | Yes |
| Custom agent names | Per-wrapper | Per-proxy instance |
| Multiple providers | OpenAI only | OpenAI-compatible only |
| Latency overhead | <1 ms | Network hop to proxy |

Recommendation: Use withNeuraMeter() when possible. Use the proxy when you can’t modify application code or when the app doesn’t support OPENAI_BASE_URL.