fix: Update OpenAIApi.cs for chunks incomplete bug
When receiving messages as a stream of chunks, the last message sometimes arrived incomplete. The cause was that the loop checked `request.isDone` and stopped receiving messages before the OpenAI API sent the message carrying `finish_reason`. I changed the logic so that the loop only closes once a chunk arrives with `finish_reason` set to `"stop"`, which is the flag signaling that the API has finished sending chunks.
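To illustrate the idea, here is a minimal sketch of the `finish_reason` check described above. The type and property names (`ChatChunk`, `ChunkChoice`) are assumptions for the example, not the actual types in OpenAIApi.cs; they mirror the shape of OpenAI's streaming response.

```csharp
using System.Linq;
using Newtonsoft.Json;

// Hypothetical minimal shape of one streamed chat chunk; the real
// response classes in OpenAI-Unity may differ.
public class ChunkChoice
{
    [JsonProperty("finish_reason")]
    public string FinishReason { get; set; }
}

public class ChatChunk
{
    [JsonProperty("choices")]
    public ChunkChoice[] Choices { get; set; }
}

public static class StreamCheck
{
    // Returns true once a chunk reports finish_reason == "stop",
    // i.e. the API has sent its final piece of content.
    public static bool IsFinalChunk(string json)
    {
        var chunk = JsonConvert.DeserializeObject<ChatChunk>(json);
        return chunk?.Choices?.Any(c => c.FinishReason == "stop") == true;
    }
}
```

With a check like this, the receive loop can keep polling until `IsFinalChunk` returns true instead of trusting `request.isDone` alone.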
Solved issues:
- https://github.com/srcnalt/OpenAI-Unity/issues/59
- https://github.com/srcnalt/OpenAI-Unity/issues/50
- https://github.com/srcnalt/OpenAI-Unity/issues/81
Edit: In fact, you technically don't need `isDone` at all, because the loop will break once a value contains `[DONE]`; but better safe than sorry, I guess?
Original: I'd recommend this instead:
```csharp
request.SendWebRequest(); // fire the request; the loop below polls the download buffer

var isDone = false;
do
{
    List<T> dataList = new();
    // Split the buffered SSE payload into non-empty lines.
    var lines = request.downloadHandler.text
        .Split('\n')
        .Where(line => line != "")
        .ToArray();

    foreach (var line in lines)
    {
        var value = line.Replace("data:", "");
        Debug.Log(value);

        // The API sends "data: [DONE]" as the final event of the stream.
        if (value.Contains("[DONE]"))
        {
            isDone = true;
            onComplete?.Invoke();
            break;
        }

        var data = JsonConvert.DeserializeObject<T>(value);
        if (data?.Error != null)
        {
            var error = data.Error;
            Debug.LogError($"Error message: {error.Message}\nError type: {error.Type}\n");
        }
        else
        {
            dataList.Add(data);
        }
    }

    onResponse?.Invoke(dataList);
    await Task.Yield(); // yield a frame before re-reading the buffer
} while (!isDone);
```
@srcnalt It would be greatly appreciated if this pull request could be merged so that the stream feature actually works without local changes.
@srcnalt please merge this pull request.
Just tested these changes locally; it also works without changing line 126, leaving the `value.Contains("[DONE]")` check as is.