[BUG] Can I change the default OpenAI model from gpt-3.5-turbo to gpt-4o-mini?
Discussed in https://github.com/orgs/infiniflow/discussions/2699
Originally posted by nstuPlyaskin on October 1, 2024: Hello there, thanks so much for making this powerful tool!
I have a problem: when I try to add an OpenAI integration via an API key from the website (Model Providers), I select OpenAI, enter my API key for gpt-4o-mini, and I see an error:
hint : 102
Fail to access model(gpt-3.5-turbo) using this api key.ERROR: Error code: 403 - {'error': {'message': ... does not have access to model gpt-3.5-turbo', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
How can I fix this so my key is used with gpt-4o-mini?
Hi, you can use the OpenAI-API-Compatible provider to set up the gpt-4o-mini model instead of the OpenAI provider; it allows you to specify the model name explicitly.
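To see why this works: an OpenAI-API-compatible provider sends a standard chat-completions request in which the model name is an explicit field of the JSON body, so nothing is pinned to gpt-3.5-turbo. Below is a minimal sketch of that request shape, assuming the standard OpenAI REST format; the base URL and key are placeholders, not values from this issue:

```python
import json

# Hypothetical placeholders -- use the base URL and API key you enter
# in the OpenAI-API-Compatible provider form.
BASE_URL = "https://api.openai.com/v1"
API_KEY = "YOUR_API_KEY"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a POST to {BASE_URL}/chat/completions.

    The model is named explicitly in the body, so a key that only has
    access to gpt-4o-mini never triggers a gpt-3.5-turbo lookup.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_chat_request("gpt-4o-mini", "Hello!")
print(json.dumps(body, indent=2))
```

The request would then be POSTed to `BASE_URL + "/chat/completions"` with an `Authorization: Bearer <API_KEY>` header; the point is simply that the model string you configure is the one sent, rather than a hard-coded default.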