Function calls not parsing correctly
I'm noticing that function calls are intermittently not parsed into their own messages, and are instead concatenated onto the end of a regular assistant message body. Here's an example of such a message:
```
Once the dragon is inserted and properly scaled, I'll summarize the scene for you, and if there's anything you'd like to adjust, we can proceed from there.

Let's find that dragon!{"function_call": {"name": "search_asset_library", "arguments": "{\n "queries": ["dragon"]\n}"}}
```
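Until the root cause is fixed, a blob like the one above can at least be split back out on the client. Here's a minimal workaround sketch (the function name `splitTrailingFunctionCall` is my own, and it assumes the stray blob always starts with `{"function_call"` and runs to the end of the content string):

```typescript
// Hypothetical workaround: detect a function_call JSON blob that was
// concatenated onto the assistant text and split it back out.
function splitTrailingFunctionCall(content: string): {
  text: string;
  functionCall: { name: string; arguments: string } | null;
} {
  const marker = '{"function_call"';
  const idx = content.lastIndexOf(marker);
  if (idx === -1) return { text: content, functionCall: null };
  try {
    const parsed = JSON.parse(content.slice(idx));
    if (parsed && typeof parsed.function_call?.name === "string") {
      return { text: content.slice(0, idx), functionCall: parsed.function_call };
    }
  } catch {
    // The blob itself is malformed JSON; leave the content untouched.
  }
  return { text: content, functionCall: null };
}
```

Note this only recovers well-formed blobs; if the inner `arguments` escaping is also mangled (as it appears to be in the example above), the parse fails and the content is returned as-is.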
I'm running the same conversation against the raw OpenAI Node API (based on this example):
```ts
const stream = await openai.chat.completions.create({
  model: "gpt-4",
  messages,
  functions,
  stream: true,
});

let writeLine = lineRewriter();
let message = {} as ChatCompletionMessage;
for await (const chunk of stream) {
  // console.log("chunk", chunk);
  message = messageReducer(message, chunk);
  writeLine(message);
}
console.log();
messages.push(message);
```
And I'm unable to reproduce the behavior: the function_calls are always parsed correctly into standalone messages.
I'm working on a minimal example and will update the ticket.
Opened up a PR here: https://github.com/vercel/ai/pull/605
@woodbridge have you tried the (experimental) stream data API? it might resolve your issue