litellm.BadRequestError: DeepseekException
An error occurred during question recommendation generation: litellm.BadRequestError: DeepseekException - Failed to deserialize the JSON body into the target type: response_format: response_format.type json_schema is unavailable now at line 1 column 16335
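For context, this error means DeepSeek's chat completions endpoint rejected the structured-output request litellm sent on WrenAI's behalf: `response_format` with type `json_schema` is not accepted, while plain JSON mode is. A minimal sketch of the distinction, assuming litellm's deepseek provider conventions (`deepseek/deepseek-chat` model id and a `DEEPSEEK_API_KEY` environment variable):

```python
import litellm

# DeepSeek's API currently rejects response_format {"type": "json_schema", ...},
# which is what produces the "json_schema is unavailable now" error above.
# Plain JSON mode ({"type": "json_object"}) is accepted instead.
response = litellm.completion(
    model="deepseek/deepseek-chat",  # assumes DEEPSEEK_API_KEY is set in the environment
    messages=[{"role": "user", "content": "Reply with a JSON object containing a 'status' key."}],
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)
```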
@wuhq7 please check this config example for deepseek
https://github.com/Canner/WrenAI/blob/main/wren-ai-service/docs/config_examples/config.deepseek.yaml
Initially I followed this config.deepseek.yaml example, and then I switched to the latest version of the yaml you posted.
The latest version of config.example.yaml is only meant to keep the pipe definitions up to date. For the llm and embedding model definitions, please follow the deepseek config.
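For reference, a rough sketch of the shape those llm/embedder blocks take; the field names and values below are illustrative assumptions in litellm style, so treat the linked config.deepseek.yaml as the authoritative source:

```yaml
# Sketch only -- field names and values are assumptions;
# copy the llm/embedder blocks from config.deepseek.yaml instead.
type: llm
provider: litellm_llm
models:
  - model: deepseek/deepseek-chat         # litellm provider/model id
    api_base: https://api.deepseek.com/v1
    kwargs:
      temperature: 0
      response_format:
        type: json_object                 # json_schema is rejected by DeepSeek
---
type: embedder
provider: litellm_embedder
models:
  - model: openai/text-embedding-3-large  # assumption: DeepSeek exposes no embedding
    api_base: https://api.openai.com/v1   # endpoint, so an OpenAI-compatible one is used
```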