[Bug]: O1 models don't really work
What happened?
I love fabric; it's really useful. Thank you.
I'm trying to use the o1-preview and o1-mini models from OpenAI, but they don't support system messages, nor do they support any variations of the flags.
It is possible to use these models, but only just. I can make the following work:
```shell
copy < patterns/qq/system.md
fabric -m "o1-preview" -t 1 -T 1 -P 0 -F 0 "$(paste) ... Output in markdown format only. write 3 simple bash code blocks that show how to output to the console, and explain each."
```
(this is one of my own patterns)
If I omit the flags, it breaks. If I change those values, it breaks:
```
this model has beta-limitations, temperature, top_p and n are fixed at 1, while presence_penalty and frequency_penalty are fixed at 0
```
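A client could enforce those beta limits before sending the request by checking the model name — a minimal Go sketch, assuming a simple options struct (the `ChatOptions` type and `clampForO1` helper are illustrative, not fabric's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// ChatOptions mirrors the sampling parameters the o1 beta pins:
// temperature, top_p and n must be 1; both penalties must be 0.
type ChatOptions struct {
	Temperature, TopP                 float64
	N                                 int
	PresencePenalty, FrequencyPenalty float64
}

// isO1Beta reports whether a model name falls under the o1 beta limits.
func isO1Beta(model string) bool {
	return strings.HasPrefix(model, "o1")
}

// clampForO1 forces the fixed values the beta requires, leaving other
// models' options untouched.
func clampForO1(model string, o ChatOptions) ChatOptions {
	if isO1Beta(model) {
		o.Temperature, o.TopP = 1, 1
		o.N = 1
		o.PresencePenalty, o.FrequencyPenalty = 0, 0
	}
	return o
}

func main() {
	opts := clampForO1("o1-preview", ChatOptions{Temperature: 0.7, TopP: 0.9, N: 2})
	fmt.Println(opts.Temperature, opts.TopP, opts.N) // 1 1 1
}
```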
If I try to pass in a pattern, I get the following error:
```
this model has beta-limitations, user and assistant messages only, system messages are not supported
```
This is why the working example pastes the pattern verbatim into the query string.
I'd really like to use these models, even in the beta condition they're currently in. Perhaps we need a flag that forces the contents of system into the user prompt instead?
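Such a flag could be implemented by folding system content into the first user message before the request is built — a rough Go sketch under an assumed minimal message struct (this is not fabric's internal representation):

```go
package main

import "fmt"

// Message is a minimal chat message; Role is "system", "user" or "assistant".
type Message struct {
	Role, Content string
}

// foldSystemIntoUser rewrites system messages as a preamble on the first
// user message, for models that only accept user/assistant roles.
func foldSystemIntoUser(msgs []Message) []Message {
	var system string
	out := make([]Message, 0, len(msgs))
	for _, m := range msgs {
		if m.Role == "system" {
			system += m.Content + "\n"
			continue
		}
		if m.Role == "user" && system != "" {
			m.Content = system + m.Content
			system = ""
		}
		out = append(out, m)
	}
	// No user message to attach to: emit the system text as a user message.
	if system != "" {
		out = append(out, Message{Role: "user", Content: system})
	}
	return out
}

func main() {
	msgs := foldSystemIntoUser([]Message{
		{Role: "system", Content: "Output markdown only."},
		{Role: "user", Content: "say hello"},
	})
	fmt.Printf("%+v\n", msgs)
}
```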
Version check
- [x] Yes I was.
Relevant log output
Relevant screenshots (optional)
No response
Now that I've re-read this, I'm not sure it's a bug in fabric. That it can't easily work with beta models isn't really unexpected behavior on fabric's part.
I believe there's some change in the processing connected to o1 models. I'm using Azure OpenAI, and while e.g. gpt-4o works, o1 and o1-mini accessed via the same endpoint URL do not:
```shell
❯ fabric -m o1 -p ai "have a beer?"
error, status code: 400, status: 400 Bad Request, message: Model {modelName} is enabled only for api versions 2024-12-01-preview and later
```
Looking into the code, I see that fabric uses the https://github.com/sashabaranov/go-openai project for Azure OpenAI connections. Maybe the `config.APIVersion` parameter needs to be customizable?
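For context, Azure OpenAI carries the version as an `api-version` query parameter on every request, which is what the 400 above is complaining about. A self-contained sketch of how that parameter lands on the request URL (the endpoint and deployment name are placeholders):

```go
package main

import (
	"fmt"
	"net/url"
)

// azureChatURL builds an Azure OpenAI chat-completions URL with the
// api-version query parameter set.
func azureChatURL(endpoint, deployment, apiVersion string) string {
	u, _ := url.Parse(endpoint)
	u.Path = fmt.Sprintf("/openai/deployments/%s/chat/completions", deployment)
	q := url.Values{}
	q.Set("api-version", apiVersion)
	u.RawQuery = q.Encode()
	return u.String()
}

func main() {
	// Per the error message, o1 deployments need 2024-12-01-preview or later.
	fmt.Println(azureChatURL("https://example.openai.azure.com", "o1", "2024-12-01-preview"))
}
```

With go-openai itself, the equivalent knob appears to be creating the config via `openai.DefaultAzureConfig(...)` and then overriding its `APIVersion` field before constructing the client.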
I get this error when using o1 or o3-mini:
```shell
❯ fabric -m o1 -p ai "have a beer?"
error, status code: 400, status: 400 Bad Request, message: Unsupported parameter: 'temperature' is not supported with this model.
```
I don't see o1 in the list of models that is returned by fabric.
```shell
fabric -L | grep -oP ']\s+\K.*' | sort | nl -n rz -v1 -w2 -s'. '
```
- babbage-002
- chatgpt-4o-latest
- dall-e-2
- dall-e-3
- davinci-002
- gpt-3.5-turbo
- gpt-3.5-turbo-0125
- gpt-3.5-turbo-1106
- gpt-3.5-turbo-16k
- gpt-3.5-turbo-instruct
- gpt-3.5-turbo-instruct-0914
- gpt-4
- gpt-4-0125-preview
- gpt-4-0613
- gpt-4-1106-preview
- gpt-4o
- gpt-4o-2024-05-13
- gpt-4o-2024-08-06
- gpt-4o-2024-11-20
- gpt-4o-audio-preview
- gpt-4o-audio-preview-2024-10-01
- gpt-4o-audio-preview-2024-12-17
- gpt-4o-mini
- gpt-4o-mini-2024-07-18
- gpt-4o-mini-audio-preview
- gpt-4o-mini-audio-preview-2024-12-17
- gpt-4o-mini-realtime-preview
- gpt-4o-mini-realtime-preview-2024-12-17
- gpt-4o-realtime-preview
- gpt-4o-realtime-preview-2024-10-01
- gpt-4o-realtime-preview-2024-12-17
- gpt-4-turbo
- gpt-4-turbo-2024-04-09
- gpt-4-turbo-preview
- o1-mini
- o1-mini-2024-09-12
- o1-preview
- o1-preview-2024-09-12
- omni-moderation-2024-09-26
- omni-moderation-latest
- text-embedding-3-large
- text-embedding-3-small
- text-embedding-ada-002
- tts-1
- tts-1-1106
- tts-1-hd
- tts-1-hd-1106
- whisper-1
> I don't see o1 in the list of models that is returned by fabric.
That's individual configuration. Mine is there because I labeled the model deployment that way in my Azure OpenAI service and then configured it in fabric. I suspect reasoning models need some extra handling in the integration. The Aider project explains this a bit here: https://aider.chat/docs/config/reasoning.html
For the o1 models you'll need to use `--raw`, which prevents additional model params from being set.

```shell
./fabric --model o1-mini --raw
```
> For the o1 models you'll need to use `--raw`, which prevents additional model params from being set.
>
> `./fabric --model o1-mini --raw`
I understand this would disable prompt tuning, perhaps losing some core fabric features. It also doesn't seem to help in general. It partially works with o1-mini, which is the older model version (2024-09-12):
```shell
❯ fabric -m o1-mini --raw "say hello"
Hello! 😊 How can I assist you today?
```
The same does not hold for o1 (2024-12-17) and o3-mini (2025-01-31), though:
```shell
❯ fabric -m o1 --raw "say hello"
error, status code: 400, status: 400 Bad Request, message: Model {modelName} is enabled only for api versions 2024-12-01-preview and later

❯ fabric -m o3-mini --raw "say hello"
error, status code: 400, status: 400 Bad Request, message: Model {modelName} is enabled only for api versions 2024-12-01-preview and later
```
This suggests that more handling on fabric's side might be required for the newest OpenAI models.
FWIW, using `--raw` worked fine for me with o3 on Azure today, with 2025-01-01-preview as the API version.
But agreed, it'd be great if it just worked
We've since made fabric aware of which models need raw mode, so `--raw` is now rarely needed with the most common models.
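Such detection presumably boils down to matching model-name families. A hypothetical Go sketch of the idea (the prefix list is a guess for illustration, not fabric's actual table):

```go
package main

import (
	"fmt"
	"strings"
)

// rawModePrefixes lists model-name families that reject the usual
// sampling parameters and system messages. Illustrative only.
var rawModePrefixes = []string{"o1", "o3"}

// needsRawMode reports whether a model should be called in raw mode.
func needsRawMode(model string) bool {
	for _, p := range rawModePrefixes {
		if strings.HasPrefix(model, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(needsRawMode("o3-mini"), needsRawMode("gpt-4o")) // true false
}
```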