
Getting unwanted error when using OpenAI-compatible API

esconer opened this issue 1 year ago

Hello,

I commented in issue #663. Thanks for the implementation; now I can use different models just by entering their names, without fetching the model list.

But I'm facing errors while using the models. When I use the Together AI API, I get the following error:

type 'List<dynamic>' is not a subtype of type 'Map<String, dynamic>'

After that I tried OpenRouter and got:

Invalid argument(s): A value must be provided. Supported values: list

In both cases I can chat with the models fine, but getting an error message every time I send a request is a little frustrating.
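For readers hitting the same thing: the Dart error "type 'List<dynamic>' is not a subtype of type 'Map<String, dynamic>'" typically means the client decoded a JSON array where it expected a JSON object (or vice versa). A plausible mismatch here is the models endpoint: the OpenAI-style response wraps the models in a {"data": [...]} envelope, while some compatible servers return a bare array. This is a minimal Python sketch of that shape mismatch, not the app's actual code; the payloads and the model_ids helper are hypothetical.

```python
import json

# OpenAI-style /v1/models response: an envelope object with a "data" list.
openai_style = json.loads('{"object": "list", "data": [{"id": "gpt-4o"}]}')

# Some OpenAI-compatible servers return a bare JSON array instead
# (hypothetical payload illustrating the shape that breaks the parser).
bare_list = json.loads('[{"id": "some-model"}]')

def model_ids(payload):
    # A parser that assumes the envelope shape only.
    return [m["id"] for m in payload["data"]]

print(model_ids(openai_style))  # ['gpt-4o']

try:
    model_ids(bare_list)  # a list has no "data" key -> type error
except TypeError as e:
    print("parse error:", e)
```

In Dart the analogous failure is a runtime cast of the decoded value to Map<String, dynamic> when jsonDecode actually produced a List<dynamic>, which matches the error text above.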

Thanks for making this awesome app ⭐

esconer avatar Mar 25 '25 17:03 esconer

OpenRouter actually does support model lists (https://openrouter.ai/docs/api-reference/list-available-models). However, I'm getting two popups (one on top of the other; I have to close both each time) of "Invalid argument(s): A value must be provided. Supported values: list", both in the settings (where I set the model manually because the list fails to load with this error) and after each message sent in chat.

0xCA avatar Apr 13 '25 04:04 0xCA

I can confirm the same behavior with together.ai. When I set the model manually, I can chat with the assistant, but I always receive error messages.

Testorakel avatar Jun 08 '25 12:06 Testorakel

The same problem occurs when using the Open WebUI API (self-hosted Docker instance). The app cannot retrieve the model list, but when the model is specified manually I can chat with it. During the response the error 'Invalid argument(s): A value must be provided. Supported values: list' pops up, but otherwise it works well.
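Since the same error shows up across Together AI, OpenRouter, and Open WebUI, a tolerant parser that accepts either response shape avoids the crash regardless of backend. This is a hedged sketch in Python (the app itself is Dart/Flutter); parse_models is a hypothetical helper, not Maid's implementation.

```python
import json

def parse_models(raw: str):
    """Accept either a bare JSON list of models or an
    OpenAI-style {"data": [...]} envelope, and return the model ids."""
    payload = json.loads(raw)
    if isinstance(payload, dict):
        # Unwrap the OpenAI-style envelope.
        payload = payload.get("data", [])
    if not isinstance(payload, list):
        raise ValueError("unexpected models response shape")
    return [m["id"] for m in payload]

print(parse_models('{"data": [{"id": "llama-3"}]}'))  # ['llama-3']
print(parse_models('[{"id": "llama-3"}]'))            # ['llama-3']
```

The same two-branch check translates directly to Dart: test whether jsonDecode returned a Map or a List before casting, instead of casting unconditionally.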

3evv avatar Jul 26 '25 20:07 3evv

This PR fixes this issue.


davidmigloz avatar Jul 28 '25 13:07 davidmigloz