o1-mini model does not support "system" message
Describe the bug
An error is raised stating that the o1 and o1-mini models do not support the system prompt that TerminalGPT provides by default.
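For reference, the same rejection can be reproduced with a direct OpenAI SDK call. This is a minimal sketch, not TerminalGPT's actual code; the prompt text is a placeholder, and the assumption is that the default prompt is sent with the `system` role:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Sending a prompt with the "system" role, as is done for other models;
# the API rejects this role for o1-mini and returns an error.
response = client.chat.completions.create(
    model="o1-mini",
    messages=[
        {"role": "system", "content": "You are a helpful terminal assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```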
To Reproduce
Steps to reproduce the behavior:
- Run any of the commands `new` or `one-shot` with any standard prompt for ChatGPT, using the `o1` or `o1-mini` model.
- For example: `terminal-gpt --model o1 one-shot "Hello!"` or `terminal-gpt --model o1-mini one-shot "Hello!"`
Expected behavior
The user should be able to begin a conversation with TerminalGPT as described in the documentation.
Development Environment:
- Docker Image (bash)
- Python: 3.9.16
Additional Comments and Resolution Intent
I would like to help resolve this bug.
I believe it could be resolved with additional handling for the reasoning models.
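One possible approach is sketched below. It assumes messages are built as a list of role/content dicts; the helper name and the model set are illustrative, not TerminalGPT's actual code. The idea is to rewrite the system message for reasoning models that reject the `system` role:

```python
# Hypothetical helper: adapt the message list for reasoning models that do not
# accept the "system" role (o1-mini at the time of writing).
REASONING_MODELS_WITHOUT_SYSTEM = {"o1-mini"}

def adapt_messages(model: str, messages: list[dict]) -> list[dict]:
    """Return messages, re-sending "system" entries as "user" for models that reject them."""
    if model not in REASONING_MODELS_WITHOUT_SYSTEM:
        return messages
    return [
        {"role": "user", "content": m["content"]} if m["role"] == "system" else m
        for m in messages
    ]
```

Whether to re-send the default prompt as a user message or drop it entirely is a design choice; the sketch keeps its content so the model still receives the instructions.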
Thanks @Broken-Admin, I'm definitely going to release a fix for this as soon as possible.
I checked, and this is reproducible only with the o1-mini model, not with o1.
Confirmed. The PR has received the corrective commits.
OpenAI has announced by email that o1-mini will be deprecated.
This issue was resolved by PR #160