Description
What version of Codex is running?
0.1.2505172129
Which model were you using?
No response
What platform is your computer?
Darwin 24.4.0 arm64 arm
What steps can reproduce the bug?
I would expect to be able to set a default model per provider in the config file.
Something like this:
{
  "provider": "ollama",
  "approvalMode": "full-auto",
  "fullAutoErrorMode": "ask-user",
  "notify": true,
  "providers": {
    "openai": {
      "name": "OpenAI",
      "baseURL": "https://api.openai.com/v1",
      "envKey": "OPENAI_API_KEY",
      "model": "o4-mini"
    },
    "ollama": {
      "name": "Ollama",
      "baseURL": "http://localhost:11434/v1",
      "envKey": "OLLAMA_API_KEY",
      "model": "qwen2.5-coder:14b"
    }
  }
}
The expected behaviour would be that, by default, Codex uses the ollama provider with its configured model (qwen2.5-coder:14b), and that running `codex --provider openai` switches to o4-mini without any additional input.
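For illustration, a short sketch of the invocations this would enable, assuming the per-provider "model" key from the config example above were honoured (this describes the proposed behaviour, not what Codex currently does):

# Proposed behaviour under the config above (sketch, not current behaviour)
codex                      # default provider "ollama", model "qwen2.5-coder:14b" from providers.ollama.model
codex --provider openai    # switches provider, model "o4-mini" picked up from providers.openai.model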
What is the expected behavior?
No response
What do you see instead?
No response
Additional information
No response