Getting an unwanted error when using an OpenAI-compatible API
Hello,
I commented in issue #663. Thanks for the implementation; now I can use different models just by entering their names, without fetching the list.
But I'm facing an error while using the models. When I use the together.ai API, I get the following error:
type 'List<dynamic>' is not a subtype of type 'Map<String, dynamic>'
After that I tried OpenRouter and got a different error:
Invalid argument(s): A value must be provided. Supported values: list
In both cases I can chat with the models fine, but getting an error message every time I send a request is a little frustrating.
Thanks for making this awesome app ⭐
OpenRouter actually does support model lists (https://openrouter.ai/docs/api-reference/list-available-models)
However, I'm getting two popups (one on top of the other; I have to close both each time) saying 'Invalid argument(s): A value must be provided. Supported values: list'. They appear both in the settings (where I set the model manually, because the list fails to load with this error) and after each message sent in chat.
I can confirm the same behavior with together.ai. When I set the model manually, I can chat with the assistant, but I always receive error messages.
The same problem occurs with the Open WebUI API (self-hosted Docker instance). The app cannot retrieve the model list, but when the model is specified manually I can chat with it. During the response the error 'Invalid argument(s): A value must be provided. Supported values: list' pops up, but otherwise it works well.
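For anyone debugging this, a hedged guess at the cause: the errors look like a JSON-shape mismatch on the models endpoint. The standard OpenAI `/v1/models` response wraps the list in an object, `{"object": "list", "data": [...]}`, while some compatible servers return a bare JSON array; code expecting one shape and receiving the other would produce exactly a "'List' is not a subtype of 'Map'" type error. The sketch below (in Python for illustration; the function name `extract_models` and the sample payloads are my own, not from the app's code) shows a defensive parse that accepts both shapes:

```python
import json

def extract_models(payload):
    """Return model IDs from either common /v1/models response shape.

    Accepts both the OpenAI-style wrapper {"object": "list", "data": [...]}
    and a bare JSON array of model objects (shapes assumed for illustration).
    """
    data = json.loads(payload) if isinstance(payload, str) else payload
    if isinstance(data, dict):
        # Wrapped shape: the list lives under the "data" key.
        data = data.get("data", [])
    if not isinstance(data, list):
        raise ValueError(f"unexpected /v1/models shape: {type(data).__name__}")
    return [m.get("id") for m in data if isinstance(m, dict)]

# OpenAI/OpenRouter-style wrapped response
wrapped = '{"object": "list", "data": [{"id": "model-a"}, {"id": "model-b"}]}'
# Bare-array shape some compatible servers return
bare = '[{"id": "model-a"}, {"id": "model-b"}]'

assert extract_models(wrapped) == extract_models(bare) == ["model-a", "model-b"]
```

In Dart the equivalent fix would be to check the decoded JSON's runtime type (`is Map` vs `is List`) before casting, rather than casting straight to `Map<String, dynamic>`.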