!!! question "Since sing-box 1.13.0"
OCM (OpenAI Codex Multiplexer) is a multiplexing service that provides remote access to your local OpenAI Codex subscription through custom tokens.

It handles OAuth authentication with OpenAI's API on the local machine, while remote clients authenticate using custom tokens.
### Structure

```json
{
  "type": "ocm",

  ... // Listen Fields

  "credential_path": "",
  "usages_path": "",
  "users": [],
  "headers": {},
  "detour": "",
  "tls": {}
}
```
### Listen Fields

See [Listen Fields](/configuration/shared/listen/) for details.

### Fields
#### credential_path

Path to the OpenAI OAuth credentials file.

If not specified, defaults to `~/.codex/auth.json`.

Refreshed tokens are automatically written back to the same location.
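For example, to keep the credentials outside the home directory (a sketch; the path is a placeholder):

```json
{
  "services": [
    {
      "type": "ocm",
      "listen": "127.0.0.1",
      "listen_port": 8080,
      "credential_path": "/etc/sing-box/codex-auth.json"
    }
  ]
}
```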
#### usages_path

Path to the file for storing aggregated API usage statistics.

Usage tracking is disabled if not specified.

When enabled, the service tracks and saves comprehensive usage statistics, organized by model and, when authentication is enabled, by user.

The statistics file is automatically saved every minute and upon service shutdown.
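The file can be inspected while the service runs; a minimal sketch, assuming the file is JSON as the `.json` extension in the examples below suggests:

```bash
# Pretty-print the aggregated usage statistics (exact layout not documented here).
jq '.' ./codex-usages.json
```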
#### users

List of authorized users for token authentication.

If empty, no authentication is required.

Object format:

```json
{
  "name": "",
  "token": ""
}
```

Object fields:

- `name`: Username identifier for tracking purposes.
- `token`: Bearer token for authentication. Clients authenticate by setting the `Authorization: Bearer <token>` header.

#### headers

Custom HTTP headers to send to the OpenAI API.
These headers will override any existing headers with the same name.
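For example, to attach an extra header to every upstream request (a sketch; the header name and value are placeholders):

```json
{
  "services": [
    {
      "type": "ocm",
      "listen": "127.0.0.1",
      "listen_port": 8080,
      "headers": {
        "X-Example-Header": "value"
      }
    }
  ]
}
```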
#### detour

Outbound tag for connecting to the OpenAI API.
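A sketch routing upstream traffic through a specific outbound; the `openai-out` tag is a placeholder for an outbound defined elsewhere in the configuration:

```json
{
  "outbounds": [
    {
      "type": "direct",
      "tag": "openai-out"
    }
  ],
  "services": [
    {
      "type": "ocm",
      "listen": "127.0.0.1",
      "listen_port": 8080,
      "detour": "openai-out"
    }
  ]
}
```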
#### tls

TLS configuration, see [TLS](/configuration/shared/tls/#inbound).
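A sketch terminating TLS on the service itself, assuming the usual sing-box inbound TLS fields; the certificate paths are placeholders:

```json
{
  "services": [
    {
      "type": "ocm",
      "listen": "0.0.0.0",
      "listen_port": 8443,
      "tls": {
        "enabled": true,
        "certificate_path": "/path/to/fullchain.pem",
        "key_path": "/path/to/key.pem"
      }
    }
  ]
}
```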
### Examples

#### Basic

```json
{
  "services": [
    {
      "type": "ocm",
      "listen": "127.0.0.1",
      "listen_port": 8080
    }
  ]
}
```
Add to `~/.codex/config.toml`:

```toml
[model_providers.ocm]
name = "OCM Proxy"
base_url = "http://127.0.0.1:8080/v1"
wire_api = "responses"
requires_openai_auth = false
```

Then run:

```bash
codex --model-provider ocm
```
#### With user authentication

```json
{
  "services": [
    {
      "type": "ocm",
      "listen": "0.0.0.0",
      "listen_port": 8080,
      "usages_path": "./codex-usages.json",
      "users": [
        {
          "name": "alice",
          "token": "sk-alice-secret-token"
        },
        {
          "name": "bob",
          "token": "sk-bob-secret-token"
        }
      ]
    }
  ]
}
```
Add to `~/.codex/config.toml`:

```toml
[model_providers.ocm]
name = "OCM Proxy"
base_url = "http://127.0.0.1:8080/v1"
wire_api = "responses"
requires_openai_auth = false
experimental_bearer_token = "sk-alice-secret-token"
```

Then run:

```bash
codex --model-provider ocm
```
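Clients other than Codex can authenticate with the same bearer token. A sketch, assuming the service forwards OpenAI Responses API requests under `/v1` (as `wire_api = "responses"` above implies); the model name is a placeholder:

```bash
curl http://127.0.0.1:8080/v1/responses \
  -H "Authorization: Bearer sk-alice-secret-token" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-5-codex", "input": "Hello"}'
```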