OpenAI new models don't work
The new models give max_temperature unknown error and cannot be used.
temperature?
My bad, I meant tokens:
File "E:\MetaStocky\env\Lib\site-packages\openai\_base_client.py", line 1625, in _request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}
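For anyone hitting this from their own code: the o-series reasoning models reject the legacy `max_tokens` parameter and expect `max_completion_tokens` instead. A minimal sketch of a compatibility shim (the helper name and the model-prefix check are my own assumptions, not MetaGPT's actual fix):

```python
def adapt_token_param(model: str, params: dict) -> dict:
    """Rename max_tokens to max_completion_tokens for reasoning models.

    Assumes reasoning models can be identified by an "o1"/"o3" name
    prefix -- adjust the prefix list for the models you actually use.
    """
    params = dict(params)  # don't mutate the caller's dict
    if model.startswith(("o1", "o3")) and "max_tokens" in params:
        params["max_completion_tokens"] = params.pop("max_tokens")
    return params


# Hypothetical call site with the official SDK:
# client.chat.completions.create(
#     model="o3-mini",
#     messages=[{"role": "user", "content": "hi"}],
#     **adapt_token_param("o3-mini", {"max_tokens": 256}),
# )
```

Older chat models still accept `max_tokens`, so the shim leaves their parameters untouched.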
llm:
  api_type: "openai"  # or azure / ollama / groq etc.
  model: "o3-mini"  # or gpt-3.5-turbo
  base_url: "https://api.openai.com/v1"  # or forward url / other llm url
  api_key: "sk-proj-
https://github.com/geekan/MetaGPT/pull/1710
This issue was closed due to 45 days of inactivity. If you feel this issue is still relevant, please reopen the issue to continue the discussion.