Lars Grammel

Results: 105 comments of Lars Grammel

Interesting - love the store! Not sure how well this suits your use case, and it is not fully integrated with the stream data protocol yet, but in the future...

Update on solutions: - if you use `useChat` or `useCompletion`, it will continue to work. If you see issues, it's most likely caused by a mismatch of the client and...

> Hi @lgrammel Would be great to have some updated documentation on this. I struggle with the AIStream - as it stopped working when moving to 3.0.21. I use useChat....

@martgra If I understand correctly, you are returning custom text chunks. In the new protocol, text chunks are lines that look like this `0:my-chunk\n`. You can use `formatStreamPart` to achieve...
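As a rough sketch of what a protocol-formatted text chunk looks like on the wire, assuming text parts take the shape `0:<JSON-encoded string>\n` (the helper name `formatTextPart` below is hypothetical; in the SDK itself, `formatStreamPart` fills this role):

```typescript
// Hypothetical helper mirroring what the stream data protocol expects for
// text chunks: the type code "0", a colon, the JSON-encoded payload, and a
// trailing newline. This is an illustrative assumption, not the SDK source.
function formatTextPart(text: string): string {
  return `0:${JSON.stringify(text)}\n`;
}
```

For example, `formatTextPart("my-chunk")` produces the single line `0:"my-chunk"` followed by a newline, which a client such as `useChat` can then decode as a text chunk.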

> IIUC it's not possible to use `readDataStream` because it's not exported, is this correct? Thanks - I'll work on getting it exported. Update: https://github.com/vercel/ai/pull/1334 (published in `v3.0.22`)
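For readers on a version where `readDataStream` is not yet exported, a minimal sketch of parsing the line-based protocol directly (this parser is an illustrative assumption about the wire shape, not the SDK's implementation, and only handles complete `TYPE:JSON` lines):

```typescript
// Illustrative parser for the stream data protocol's line format.
// Each complete line is `<type code>:<JSON value>`, e.g. `0:"hello"`
// for a text chunk. Assumed shape, not the SDK's own parser.
interface StreamPart {
  code: string;   // e.g. "0" for text chunks
  value: unknown; // JSON-decoded payload
}

function parseStreamLines(buffer: string): StreamPart[] {
  return buffer
    .split("\n")
    .filter((line) => line.length > 0)
    .map((line) => {
      const separator = line.indexOf(":");
      if (separator === -1) {
        throw new Error(`Malformed stream line: ${line}`);
      }
      return {
        code: line.slice(0, separator),
        value: JSON.parse(line.slice(separator + 1)),
      };
    });
}
```

A real consumer would additionally buffer partial lines between network reads, which the exported `readDataStream` handles for you.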

> @lgrammel may I ask if `useChat` sometime will use the SSE protocol directly? Seems like this text stream is less flexible than actually including more metadata in the stream...

@flexchar I understand that this is frustrating, and I agree it's unfortunate. It was not an intentional breakage; it happened during a split into multiple packages, which made...

OpenAI automatically stores the assistant messages (the OpenAI assistant API is stateful). You can retrieve them with their API: https://platform.openai.com/docs/api-reference/messages/getMessage Would that help with your use case?
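As a rough sketch, retrieving a stored assistant message is a GET against the endpoint from the API reference linked above (the thread and message identifiers below are placeholders, and the commented-out call is only an illustration of adding auth):

```typescript
// Builds the retrieve-message endpoint described in the linked API
// reference. threadId and messageId are placeholders; real values come
// from your own thread and its messages.
function buildMessageUrl(threadId: string, messageId: string): string {
  return `https://api.openai.com/v1/threads/${threadId}/messages/${messageId}`;
}

// A real call would add authentication, e.g.:
// await fetch(buildMessageUrl("thread_abc", "msg_123"), {
//   headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
// });
```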

Thanks! There were some merge conflicts with our docs changes, and I needed to make changes. Superseded by #1615