# Translation Proxy
The translation proxy is the core of Claudex. It sits between Claude Code and your AI providers, transparently converting between the Anthropic Messages API and the OpenAI Chat Completions API.
## How It Works

```
Claude Code → Anthropic Messages API request
  │
  └── Claudex Proxy (127.0.0.1:13456)
        │
        ├── DirectAnthropic provider → forward with headers
        │
        ├── OpenAICompatible provider
        │     ├── Translate request: Anthropic → OpenAI Chat Completions
        │     ├── Forward to provider
        │     └── Translate response: OpenAI → Anthropic
        │
        └── OpenAIResponses provider
              ├── Translate request: Anthropic → OpenAI Responses API
              ├── Forward to provider
              └── Translate response: Responses → Anthropic
```

## What Gets Translated
### Request Translation (Anthropic → OpenAI)
Section titled “Request Translation (Anthropic → OpenAI)”| Anthropic | OpenAI |
|---|---|
system field | System message in messages array |
messages[].content blocks (text, image, tool_use) | messages[].content + tool_calls |
tools array (JSON Schema) | tools array (function format) |
tool_choice | tool_choice |
max_tokens | max_tokens |
temperature, top_p | Direct mapping |
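The request mapping above can be sketched in a few lines. This is an illustrative Python sketch, not Claudex's actual implementation (the function name and handled block types are assumptions for demonstration); it covers the `system` field, text and `tool_use` blocks, and the directly mapped sampling parameters:

```python
import json

def anthropic_to_openai(req: dict) -> dict:
    """Sketch: translate an Anthropic Messages request to OpenAI Chat
    Completions shape. Handles text and tool_use blocks only."""
    messages = []
    # Anthropic's top-level `system` becomes a system message.
    if "system" in req:
        messages.append({"role": "system", "content": req["system"]})
    for msg in req["messages"]:
        content = msg["content"]
        if isinstance(content, str):
            messages.append({"role": msg["role"], "content": content})
            continue
        text_parts, tool_calls = [], []
        for block in content:
            if block["type"] == "text":
                text_parts.append(block["text"])
            elif block["type"] == "tool_use":
                # tool_use blocks become OpenAI function tool_calls.
                tool_calls.append({
                    "id": block["id"],
                    "type": "function",
                    "function": {
                        "name": block["name"],
                        "arguments": json.dumps(block["input"]),
                    },
                })
        out = {"role": msg["role"], "content": "\n".join(text_parts) or None}
        if tool_calls:
            out["tool_calls"] = tool_calls
        messages.append(out)
    out_req = {
        "model": req["model"],
        "messages": messages,
        "max_tokens": req["max_tokens"],
    }
    for key in ("temperature", "top_p"):  # direct mapping
        if key in req:
            out_req[key] = req[key]
    return out_req
```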
### Response Translation (OpenAI → Anthropic)

| OpenAI | Anthropic |
|---|---|
| `choices[0].message.content` | `content` blocks |
| `choices[0].message.tool_calls` | `tool_use` content blocks |
| `finish_reason: stop` | `stop_reason: end_turn` |
| `finish_reason: tool_calls` | `stop_reason: tool_use` |
| `usage.prompt_tokens` / `completion_tokens` | `usage.input_tokens` / `output_tokens` |
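The reverse direction follows the same table. A minimal sketch (names and defaults are assumptions, not Claudex's internals) mapping content, tool calls, stop reason, and usage:

```python
import json

# Only the two finish_reason mappings documented above are shown.
STOP_REASONS = {"stop": "end_turn", "tool_calls": "tool_use"}

def openai_to_anthropic(resp: dict) -> dict:
    """Sketch: translate an OpenAI Chat Completions response to
    Anthropic Messages shape."""
    choice = resp["choices"][0]
    msg = choice["message"]
    content = []
    if msg.get("content"):
        content.append({"type": "text", "text": msg["content"]})
    for call in msg.get("tool_calls") or []:
        # Function tool_calls become tool_use content blocks.
        content.append({
            "type": "tool_use",
            "id": call["id"],
            "name": call["function"]["name"],
            "input": json.loads(call["function"]["arguments"]),
        })
    usage = resp.get("usage", {})
    return {
        "type": "message",
        "role": "assistant",
        "content": content,
        "stop_reason": STOP_REASONS.get(choice["finish_reason"], "end_turn"),
        "usage": {
            "input_tokens": usage.get("prompt_tokens", 0),
            "output_tokens": usage.get("completion_tokens", 0),
        },
    }
```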
### Streaming Translation

Claudex fully supports SSE (Server-Sent Events) streaming, translating OpenAI stream chunks into Anthropic stream events in real time:
| OpenAI SSE | Anthropic SSE |
|---|---|
| First chunk | `message_start` + `content_block_start` |
| `choices[0].delta.content` | `content_block_delta` (`text_delta`) |
| `choices[0].delta.tool_calls` | `content_block_delta` (`input_json_delta`) |
| `finish_reason` present | `content_block_stop` + `message_delta` + `message_stop` |
The streaming translator maintains a state machine to properly handle tool call accumulation and content block boundaries.
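The idea behind that state machine can be illustrated with a simplified generator. This is a sketch under simplifying assumptions (a single text content block, simplified event payloads, no tool-call accumulation), not Claudex's actual translator:

```python
def translate_stream(chunks):
    """Sketch: turn OpenAI stream chunks into (event_name, payload)
    pairs mirroring the Anthropic SSE event sequence."""
    started = False
    for chunk in chunks:
        choice = chunk["choices"][0]
        delta = choice.get("delta", {})
        if not started:
            # First chunk opens the message and its first content block.
            started = True
            yield ("message_start", {})
            yield ("content_block_start", {"index": 0})
        if delta.get("content"):
            yield ("content_block_delta",
                   {"index": 0,
                    "delta": {"type": "text_delta", "text": delta["content"]}})
        if choice.get("finish_reason"):
            # finish_reason closes the block and the message.
            yield ("content_block_stop", {"index": 0})
            yield ("message_delta", {"stop_reason": "end_turn"})
            yield ("message_stop", {})
```

A real translator also buffers partial `tool_calls` argument fragments across chunks, which is why per-block state is needed.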
## Supported Providers

| Provider | Type | Base URL |
|---|---|---|
| Anthropic | DirectAnthropic | https://api.anthropic.com |
| MiniMax | DirectAnthropic | https://api.minimax.io/anthropic |
| OpenRouter | OpenAICompatible | https://openrouter.ai/api/v1 |
| Grok (xAI) | OpenAICompatible | https://api.x.ai/v1 |
| OpenAI | OpenAICompatible | https://api.openai.com/v1 |
| DeepSeek | OpenAICompatible | https://api.deepseek.com |
| Kimi/Moonshot | OpenAICompatible | https://api.moonshot.cn/v1 |
| GLM (Zhipu) | OpenAICompatible | https://open.bigmodel.cn/api/paas/v4 |
| Ollama | OpenAICompatible | http://localhost:11434/v1 |
| vLLM | OpenAICompatible | http://localhost:8000/v1 |
| ChatGPT/Codex sub | OpenAIResponses | https://chatgpt.com/backend-api/codex |
## Proxy Management

```sh
# Start proxy as a daemon
claudex proxy start -d

# Check proxy status
claudex proxy status

# Stop proxy daemon
claudex proxy stop

# Start on a custom port
claudex proxy start -p 8080
```

When you run `claudex run <profile>`, the proxy is automatically started in the background if not already running.