Using Codex CLI with OpenRouter Instead of OpenAI
Yes, you can use OpenRouter with Codex CLI, but not through the built-in login/auth flow. You authenticate differently.
Two Separate Auth Paths in Codex CLI
1. Native Codex Login (OpenAI)
- Command: `codex --login`
- Uses your ChatGPT/OpenAI account
- Auto-generates an API key
This is the only “official” login/auth flow.
2. OpenRouter (What You’re Looking For)
- No OAuth / login flow
- Uses API key + config file
How OpenRouter Auth Actually Works
Instead of logging in, you:
1. Get an API Key from OpenRouter
   Format: `sk-or-...`
2. Set the Environment Variable

    export OPENROUTER_API_KEY="sk-or-..."
3. Configure Codex (`~/.codex/config.toml`)

    model_provider = "openrouter"
    model = "openai/gpt-5.3-codex"

    [model_providers.openrouter]
    name = "openrouter"
    base_url = "https://openrouter.ai/api/v1"
    env_key = "OPENROUTER_API_KEY"
4. Run Codex

    codex
At this point, Codex will send requests through OpenRouter.
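Before step 4, you can sanity-check the key outside Codex entirely, since OpenRouter auth is just a bearer token. A minimal shell sketch — the `sk-or-demo` fallback is a placeholder, and the commented `curl` line assumes OpenRouter's OpenAI-compatible `/models` endpoint under the `base_url` shown above:

```shell
# Confirm the key is exported and has the expected sk-or- prefix.
: "${OPENROUTER_API_KEY:=sk-or-demo}"   # demo fallback only; export your real key

case "$OPENROUTER_API_KEY" in
  sk-or-*) echo "key format OK" ;;
  *)       echo "unexpected key format" >&2; exit 1 ;;
esac

# With a real key, this should return a JSON list of models if auth succeeds:
# curl -s https://openrouter.ai/api/v1/models \
#   -H "Authorization: Bearer $OPENROUTER_API_KEY"
```

If the `curl` call returns an auth error here, Codex will fail with the same key, so this isolates key problems from config problems.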
Key Limitation
- OpenRouter = API-key-based only
- Codex CLI login = OpenAI-only
So: you cannot authenticate with OpenRouter using `codex --login`.
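In practice, the two paths leave different footprints: the OpenRouter path only needs the environment variable, while `codex --login` stores credentials on disk. A hedged sketch for checking which you have configured — treat the `auth.json` filename as an assumption about where Codex keeps its login credentials:

```shell
# Report which auth path appears to be configured.
CODEX_HOME="${CODEX_HOME:-$HOME/.codex}"   # Codex's default config directory

if [ -n "${OPENROUTER_API_KEY:-}" ]; then
  echo "OpenRouter key found in environment"
fi

if [ -f "$CODEX_HOME/auth.json" ]; then   # assumed location of OpenAI login creds
  echo "OpenAI login credentials found on disk"
fi
```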
Practical Takeaway
- If you want easy login + credits → use OpenAI auth
- If you want multi-model routing / cheaper models → use OpenRouter config