[Enhancement] Change response_format type to string to match OpenAI Spec
Description
Per the OpenAI spec, the images API accepts `response_format` as a plain string:

> `response_format` (string or null, optional, defaults to `url`): The format in which the generated images are returned. Must be one of `url` or `b64_json`.
This PR changes the `response_format` type from a struct, which was just a wrapper over a string, to a plain string, matching the OpenAI spec.
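Roughly, the change looks like this (a minimal sketch; the struct and field names here are illustrative, not necessarily LocalAI's exact schema):

```go
package schema

// Before: response_format was a struct that only wrapped a string.
type ResponseFormat struct {
	Type string `json:"type"`
}

type ImageRequestBefore struct {
	Prompt         string          `json:"prompt"`
	ResponseFormat *ResponseFormat `json:"response_format,omitempty"`
}

// After: a plain string, matching the OpenAI images spec
// ("url" or "b64_json", defaulting to "url").
type ImageRequestAfter struct {
	Prompt         string `json:"prompt"`
	ResponseFormat string `json:"response_format,omitempty"`
}
```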
This PR fixes https://github.com/mudler/LocalAI/issues/2299
Notes for Reviewers
Clients that use the `response_format` field in API requests will need to be informed of this backward-incompatible contract change.
Signed commits
- [x] Yes, I signed my commits.
Thanks @prajwalnayak7 for looking at this - the change is almost OK, except that it breaks the chat completion endpoint: https://platform.openai.com/docs/api-reference/chat/create#chat-create-response_format
In this case we need to either split the schemas for image generation and chat, or handle it in the structure by having an `interface{}` and doing a type assertion later.
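To illustrate why the plain-string field breaks chat (a minimal, self-contained sketch; the `req` struct is hypothetical, not LocalAI code):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// With a plain-string field, only the images form decodes.
type req struct {
	ResponseFormat string `json:"response_format"`
}

func main() {
	// Images endpoint sends a bare string: decodes fine.
	var a req
	fmt.Println(json.Unmarshal([]byte(`{"response_format":"b64_json"}`), &a), a.ResponseFormat)

	// Chat endpoint sends an object: decoding into a string errors out.
	var b req
	fmt.Println(json.Unmarshal([]byte(`{"response_format":{"type":"json_object"}}`), &b))
}
```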
Weird that it's different. I've updated the changes with the `interface{}` approach; maintaining another structure just to accommodate this difference could be cumbersome.
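For reviewers, a minimal sketch of what the `interface{}` + type-assertion approach can look like (names are illustrative, not the exact code in this PR):

```go
package schema

// One field serves both endpoints by deferring the type decision
// until the handler inspects the decoded value.
type OpenAIRequest struct {
	ResponseFormat interface{} `json:"response_format,omitempty"`
}

// ResponseFormatType extracts the format regardless of which shape was sent.
func (r *OpenAIRequest) ResponseFormatType() string {
	switch v := r.ResponseFormat.(type) {
	case string:
		// Image generation: a bare string, "url" or "b64_json".
		return v
	case map[string]interface{}:
		// Chat completion: an object like {"type": "json_object"}.
		if t, ok := v["type"].(string); ok {
			return t
		}
	}
	return ""
}
```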