
Feature Request: Add a Parameter to Return Messages as Strings Instead of Objects

Open sheng-di opened this issue 1 year ago • 2 comments

Hello @anasfik,

Firstly, I want to express my appreciation for your work on the 'openai' project. It's been incredibly useful and well implemented.

I am writing to suggest a feature that could improve the project's compatibility with different GPT proxies. Currently, the 'messages' field in the request body is an array of message objects, and each message's 'content' is itself an array of objects with 'type' and 'text' fields, like so:

{
  "model": "gpt-4",
  "messages": [
    {
      "role": "system",
      "content": [
        { "type": "text", "text": "return any message you are given as JSON." }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Hello, I am a chatbot created by OpenAI. How are you today?"
        }
      ]
    }
  ]
}

However, some GPT proxies only accept strings, not objects. It would therefore be helpful to have a parameter that makes the 'messages' content contain only strings. For example, by setting OpenAI.onlyString = true;, the 'messages' content would be transformed into the following (a rough Dart sketch of such a transformation is included after the example):

{
  "model": "gpt-4",
  "messages": [
    {
      "role": "system",
      "content": [
        "return any message you are given as JSON."
      ]
    },
    {
      "role": "user",
      "content": [
        "Hello, I am a chatbot created by OpenAI. How are you today?"
      ]
    }
  ]
}
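
For illustration only, here is a minimal Dart sketch of that transformation; flattenMessageContent is a hypothetical helper, not part of the package's API. It rewrites each message's 'content' list of {type, text} objects into a list of plain strings, producing the shape shown above.

import 'dart:convert';

/// Hypothetical helper (not part of the package): flattens every message's
/// `content` from a list of `{type, text}` objects into a list of plain strings.
Map<String, dynamic> flattenMessageContent(Map<String, dynamic> body) {
  final messages = (body['messages'] as List).map((message) {
    final msg = Map<String, dynamic>.from(message as Map);
    msg['content'] = (msg['content'] as List)
        .map((item) => item is Map ? item['text'] : item)
        .toList();
    return msg;
  }).toList();
  return {...body, 'messages': messages};
}

void main() {
  final body = {
    'model': 'gpt-4',
    'messages': [
      {
        'role': 'system',
        'content': [
          {'type': 'text', 'text': 'return any message you are given as JSON.'}
        ]
      },
    ],
  };
  // Prints the same request body with string-only content.
  print(jsonEncode(flattenMessageContent(body)));
}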

Implementing this feature would increase the project's compatibility with various GPT proxies, making it more versatile and user-friendly. I believe this would be a valuable addition to the project.

Thank you for considering this suggestion. I look forward to your feedback.

sheng-di · Mar 16 '24 05:03

According to https://platform.openai.com/docs/api-reference/making-requests: [screenshot of the example request from the docs]

sheng-di · Mar 16 '24 05:03

The package is made to reflect the OpenAI APIs as they are; I take that to be my responsibility on this project.

Rather than having this built in, you can proxy the completion stream yourself: map() on a Dart Stream is what you need here to adapt the original response to GPT proxies.
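
As an illustration of that suggestion (not the package's API; the actual event type exposed by the library will differ, so the field access below is an assumption), a Stream transformation could look roughly like this:

// Rough sketch of the map() approach, assuming the completion stream has
// already been decoded into JSON maps; adjust the field access to match the
// real event type.
Stream<String> asPlainText(Stream<Map<String, dynamic>> completionEvents) {
  return completionEvents.map((event) {
    final content = event['choices'][0]['delta']['content'];
    if (content is List) {
      // Join any {"type": "text", "text": ...} parts into a single string.
      return content
          .map((part) => part is Map ? (part['text'] ?? '') : '$part')
          .join();
    }
    return content?.toString() ?? '';
  });
}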

anasfik · May 08 '24 18:05