Pieter Soudan
Yes, I have. I get errors like: "No object generated: the tool was not called." When I look at the logs in LM Studio, I see that responses are being generated...
I hadn't. I have now tried setting the `mode` explicitly to `json`, rather than leaving it to the provider to infer the mode. This results in a Bad Request (status code `400`)...
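For reference, roughly what my call looks like (a minimal sketch; the base URL and model id are placeholders for my local LM Studio setup):

```ts
import { generateObject } from 'ai';
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { z } from 'zod';

// Placeholder values: LM Studio's default local server and a locally loaded model.
const lmstudio = createOpenAICompatible({
  name: 'lmstudio',
  baseURL: 'http://localhost:1234/v1',
});

const { object } = await generateObject({
  model: lmstudio('llama-3.2-3b-instruct'), // placeholder model id
  mode: 'json', // explicit, instead of letting the provider infer the mode
  schema: z.object({ city: z.string(), population: z.number() }),
  prompt: 'Describe the largest city in Belgium.',
});
```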
After further investigation: the `response_format.type` being set by the OpenAI-compatible adapter is `json_object`, while LM Studio requires it to be `json_schema`?
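A rough sketch of the two request-body shapes, based on the OpenAI structured-output convention that LM Studio follows (field values are illustrative):

```ts
// What the adapter currently sends:
const sentByAdapter = {
  response_format: { type: 'json_object' },
};

// What LM Studio expects for schema-constrained output:
const expectedByLmStudio = {
  response_format: {
    type: 'json_schema',
    json_schema: {
      name: 'response',
      strict: true,
      schema: { /* JSON Schema derived from the zod schema */ },
    },
  },
};
```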
I can see that the chat model needs `supportsStructuredOutputs` set to `true` (https://github.com/vercel/ai/blob/e462d108de507e45f2fe08eb7ba5686166d44dbc/packages/openai-compatible/src/openai-compatible-chat-language-model.ts#L194). This config can be set when creating the `OpenAICompatibleChatLanguageModel` using its constructor...
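Something like the following, assuming the config shape at the linked commit (the exact field names may differ between versions, so treat this as a sketch rather than a definitive setup):

```ts
import { OpenAICompatibleChatLanguageModel } from '@ai-sdk/openai-compatible';

// Sketch only: config shape assumed from the linked commit; placeholder URL and model id.
const model = new OpenAICompatibleChatLanguageModel(
  'llama-3.2-3b-instruct', // placeholder model id
  {},                      // settings
  {
    provider: 'lmstudio.chat',
    url: ({ path }) => `http://localhost:1234/v1${path}`,
    headers: () => ({}),
    supportsStructuredOutputs: true, // switches response_format to json_schema
    defaultObjectGenerationMode: 'json',
  },
);
```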
Sorry for my late response. If I understand correctly, after creating the OpenAI-compatible provider, you can instantiate a specific model using its model identifier. Since every model has its own...
@lgrammel I can confirm that when setting `supportsStructuredOutputs` to `true`, it works as expected. The OpenAI-compatible adapter (with `mode` also set to `json`) then sends the schema to adhere...
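For completeness, a sketch of the working call end to end (same assumed constructor/config shape and placeholder values as above):

```ts
import { generateObject } from 'ai';
import { OpenAICompatibleChatLanguageModel } from '@ai-sdk/openai-compatible';
import { z } from 'zod';

// supportsStructuredOutputs: true is the key flag; other values are placeholders.
const model = new OpenAICompatibleChatLanguageModel(
  'llama-3.2-3b-instruct',
  {},
  {
    provider: 'lmstudio.chat',
    url: ({ path }) => `http://localhost:1234/v1${path}`,
    headers: () => ({}),
    supportsStructuredOutputs: true,
  },
);

const { object } = await generateObject({
  model,
  mode: 'json', // now sent as response_format: { type: 'json_schema', ... }
  schema: z.object({ answer: z.string() }),
  prompt: 'Reply with a short answer.',
});
console.log(object);
```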