coreutils
ptx: Implement context_regex (-S) and its default values
The ptx --sentence-regexp=regexp option is not implemented and does not use the default values defined in the GNU ptx documentation.
The possible default values for this option include the newline character, so FileContent should be changed to stop using BufReader's lines function, since \n is not present in the line strings it yields.
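To illustrate why (in Python rather than the Rust implementation; only the fact that a newline can be a default context delimiter comes from the report above, the rest is a hypothetical sketch), a delimiter regex containing \n can never match input that has already been split into lines:

```python
import re

# Per the report above, one possible default for --sentence-regexp is the
# newline character itself, i.e. each input line is its own context.
context_regex = re.compile(r"\n")

text = "alpha beta\ngamma delta\n"

# Matching against the whole file content finds both context breaks.
print(len(context_regex.findall(text)))  # 2

# Line-based reading (the analogue of BufReader::lines) strips the trailing
# '\n', so the same regex never matches on the per-line strings.
print([bool(context_regex.search(line)) for line in text.splitlines()])  # [False, False]
```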
Fixed in latest release.
Did this also fix expert model?
Error executing code: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
Yes, it should have. I refactored model_tokens into model_params and added a supports_temperature param, so it should now handle this correctly on a per-model basis.
The logic in llm.py is a bit chaotic, but all the tests pass and things seem to be working properly.
With this new change, we can also add new params to models, e.g. things like supports_reasoning_effort and whatever else we need.
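For context, a rough sketch of what the model_params approach described above could look like; aside from supports_temperature and supports_reasoning_effort, which are named in this thread, the table entries and helper below are illustrative rather than ra-aid's actual llm.py:

```python
# Hypothetical per-model parameter table; keys other than supports_temperature
# and supports_reasoning_effort (both mentioned above) are placeholders.
MODEL_PARAMS = {
    "gpt-4o": {"supports_temperature": True},
    "o1": {"supports_temperature": False},
    "o3-mini": {"supports_temperature": False, "supports_reasoning_effort": True},
}


def build_request_kwargs(model: str, temperature: float | None = None) -> dict:
    """Return request kwargs, dropping parameters the model does not support."""
    params = MODEL_PARAMS.get(model, {})
    kwargs: dict = {"model": model}
    if temperature is not None and params.get("supports_temperature", True):
        kwargs["temperature"] = temperature
    return kwargs


print(build_request_kwargs("gpt-4o", 0.7))   # {'model': 'gpt-4o', 'temperature': 0.7}
print(build_request_kwargs("o3-mini", 0.7))  # {'model': 'o3-mini'}
```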
Hey, I have one question: the reasoning models ["o3-mini", "o1", "o1-mini", "o1-preview"] do not support the temperature parameter, right?
Correct
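At the API level, the usual workaround is to omit the parameter entirely for those models. A minimal sketch using the official openai Python client (the model list comes from the question above; everything else is illustrative):

```python
from openai import OpenAI

REASONING_MODELS = {"o3-mini", "o1", "o1-mini", "o1-preview"}


def chat(client: OpenAI, model: str, messages: list[dict], temperature: float = 0.7):
    kwargs = {"model": model, "messages": messages}
    # These models reject an explicit temperature (see the 400 error above),
    # so only send it for models that accept it.
    if model not in REASONING_MODELS:
        kwargs["temperature"] = temperature
    return client.chat.completions.create(**kwargs)


# Usage:
# client = OpenAI()
# chat(client, "o3-mini", [{"role": "user", "content": "hello"}])
```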
I'm still getting the 'Sorry, OpenAI has rejected your request. Here is the error message from OpenAI: Unsupported parameter: 'temperature' is not supported with this model.' error with o3-mini.
What version/commit are you using?
ra-aid --version
iPhone, mobile browser, going to https://www.typingmind.com/
(iPhone 14 pro, iOS 18.3)
TypingMind is on the latest version; it asked me to refresh to get the updated version, which I did, but I still have the same issue.
Please let me me know if there is any other specific info you need. Thanks.
now it supports temperature=1
Thanks. Just so I know, could I have used it by bypassing temperature? If so, how? Cheers.
Hello again.
I'm still getting this error for o1 and o3 Mini -
"Sorry, OpenAI has rejected your request. Here is the error message from OpenAI: Unsupported parameter: 'temperature' is not supported with this model."
Getting this on both computer (Mac) and phone (iPhone 14).
Please help.
Reopened. We're about to have a new release (likely today). Once that release comes out, it will be worth another try and then we can dig deeper.
Ok, cheers.
New release is out, please try that one and let me know if you still see this bug.
Hi, I am currently trying to use the latest version 8.6.2 to query an assistant with the o3-mini model in a thread (RunResponse run = await thread.CreateRunAsync(assistant, StreamEventHandler)), and I also get the error "Unsupported parameter: 'temperature' is not supported with this model."
8.6.2? What is that the version of? We're on v0.18.0