Codex CLI provides powerful code generation and completion capabilities directly in your terminal.
If your Allowed Headers setting is already *, you can skip this note. Otherwise, if you run into issues integrating Bifrost with Codex CLI, either switch Allowed Headers to * or add the specific headers your client sends. By default, Bifrost whitelists: Content-Type, Authorization, X-Requested-With, X-Stainless-Timeout, and X-Api-Key.
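One way to confirm the gateway accepts requests carrying the default whitelisted headers is a manual smoke test. A minimal sketch, assuming Bifrost is listening on localhost:8080 and OPENAI_API_KEY holds a Bifrost virtual key (adjust both for your deployment):

```shell
# Hypothetical smoke test: send one OpenAI-format request through Bifrost
# with the Content-Type and Authorization headers Codex CLI relies on.
bifrost_ping() {
  curl -sS "http://localhost:8080/openai/v1/chat/completions" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer ${OPENAI_API_KEY}" \
    -d '{"model":"openai/gpt-5-codex","messages":[{"role":"user","content":"ping"}]}'
}
```

If this call succeeds but Codex CLI still fails, compare the headers your client actually sends against the whitelist above.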

Installing Codex CLI

npm install -g @openai/codex

Configuring Codex CLI with Bifrost

Codex CLI always prefers OAuth over custom API keys. Make sure you run /logout before configuring the Bifrost gateway with Codex.

Update config.toml

Add the Bifrost base URL and credentials to your global ~/.codex/config.toml or project-specific .codex/config.toml:
openai_base_url = "http://localhost:8080/openai/v1"
env_key = "OPENAI_API_KEY"
model = "openai/gpt-5.4"
Then export your Bifrost virtual key in the shell that will run Codex:
export OPENAI_API_KEY=<bifrost_virtual_key>
Always run codex from the same terminal session where you exported the variable, or restart the terminal after changing your shell profile. GUI-launched terminals and IDEs may not pick up shell-profile exports unless the environment is configured there as well.
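Before launching codex, you can confirm the current shell actually sees the variable. A quick sketch (any non-empty value passes; it does not validate the key itself):

```shell
# Check that OPENAI_API_KEY is visible to the shell that will launch codex.
if [ -n "${OPENAI_API_KEY:-}" ]; then
  echo "OPENAI_API_KEY is set"
else
  echo "OPENAI_API_KEY is missing: export it or re-source your profile" >&2
fi
```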

Model Configuration

Use the --model flag to start Codex with a specific model:
codex --model openai/gpt-5-codex
codex --model openai/gpt-5.4-pro
You can also switch models mid-session with the /model command:
/model openai/gpt-5.4-pro
/model openai/gpt-5-codex
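If you switch between the same few models often, shell wrappers save retyping the flags. A sketch with hypothetical function names (add them to your shell profile):

```shell
# Hypothetical convenience wrappers around the --model flag;
# extra arguments are passed through to codex unchanged.
codex_sonnet() { codex --model anthropic/claude-sonnet-4-5-20250929 "$@"; }
codex_gpt()    { codex --model openai/gpt-5-codex "$@"; }
```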

Using Non-OpenAI Models with Codex CLI

Bifrost automatically translates OpenAI API requests to other providers, so you can use Codex CLI with models from Anthropic, Google, Mistral, and more. Use the provider/model-name format to specify any Bifrost-configured model:
# Start with an Anthropic model
codex --model anthropic/claude-sonnet-4-5-20250929

# Start with a Google model
codex --model gemini/gemini-2.5-pro

# Switch mid-session
/model anthropic/claude-sonnet-4-5-20250929
/model mistral/mistral-large-latest

Supported Providers

Bifrost supports the following providers with the provider/model-name format: openai, anthropic, azure, gemini, vertex, bedrock, mistral, groq, cerebras, cohere, perplexity, xai, ollama, openrouter, huggingface, nebius, parasail, replicate, vllm, sgl
Non-OpenAI models must support tool use for Codex CLI to work properly. Codex CLI relies on tool calling for file operations, terminal commands, and code editing. Models without tool use support will fail on most operations.
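A model string with a mistyped provider prefix only fails once a request is made. A small sketch that pre-checks the prefix against the provider list on this page (prefix check only; it does not verify that the model itself exists upstream):

```shell
# Check that a model string uses a provider/model-name prefix Bifrost recognizes.
# Provider list taken from this page; extend it if your Bifrost config adds more.
is_bifrost_model() {
  case "$1" in
    openai/*|anthropic/*|azure/*|gemini/*|vertex/*|bedrock/*|mistral/*|groq/*|\
cerebras/*|cohere/*|perplexity/*|xai/*|ollama/*|openrouter/*|huggingface/*|\
nebius/*|parasail/*|replicate/*|vllm/*|sgl/*) return 0 ;;
    *) return 1 ;;
  esac
}
```

Example: `is_bifrost_model "anthropic/claude-sonnet-4-5-20250929"` succeeds, while a bare `gpt-5-codex` (no provider prefix) fails.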