Export utils to parse AI stream responses on the client
Feature Description
We're using the AI SDK on the server side only, as our client-side code already includes the logic to handle message states and API calls. It would be helpful to have utilities extracted from useChat to parse the stream responses generated by the AI SDK.
This is great, since we want to document the protocol in the coming weeks. Can you share which exact functionality / functions you would need to access? Some of it is exposed but not documented.
Our use cases include:
- simple text chat
- tool calling: call a search API if needed, then answer based on the search results (a rough server-side sketch of this flow is below)
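For context, a minimal sketch of what the server side of that tool-calling flow could look like, assuming AI SDK v4-style `streamText` with a hypothetical `searchWeb` helper and placeholder model/route names; the data stream it returns is what the client would then need to parse:

```ts
import { openai } from '@ai-sdk/openai';
import { streamText, tool } from 'ai';
import { z } from 'zod';

// Hypothetical search helper; stands in for whatever search API the app calls.
async function searchWeb(query: string): Promise<string> {
  const res = await fetch(`https://example.com/search?q=${encodeURIComponent(query)}`);
  return res.text();
}

// Next.js route handler: streams text plus tool-call / tool-result parts
// using the AI SDK data stream protocol.
export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o'),
    messages,
    tools: {
      search: tool({
        description: 'Search the web when the question needs fresh information.',
        parameters: z.object({ query: z.string() }),
        execute: async ({ query }) => searchWeb(query),
      }),
    },
    maxSteps: 2, // let the model answer after the tool result comes back
  });

  // Emits the data stream protocol parts the client has to parse.
  return result.toDataStreamResponse();
}
```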
@wong2 we have started documenting the data stream protocol: https://sdk.vercel.ai/docs/ai-sdk-ui/stream-protocol#data-stream-protocol
The @ai-sdk/ui-utils package exposes utilities for formatting and parsing these stream parts (warning: the API is internal/unstable and might change).
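For illustration, a minimal round-trip sketch with those (internal, possibly changing) exports, assuming the v4 `'text'` part and its `0:` line code from the linked protocol docs:

```ts
import { formatDataStreamPart, parseDataStreamPart } from '@ai-sdk/ui-utils';

// Each protocol line is `<code>:<JSON>\n`; text deltas use the "text" part.
const line = formatDataStreamPart('text', 'Hello'); // e.g. '0:"Hello"\n'

// parseDataStreamPart turns a single protocol line back into a typed part.
const part = parseDataStreamPart(line.trim());
console.log(part); // { type: 'text', value: 'Hello' }
```

In a real client, the same parse step would run on every newline-delimited line coming off the response body, which is roughly what the CLI sketch further down does.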
Great!
I can only second @wong2. I love the server-side stuff because it abstracts away the differences between providers, but the UI part just feels wrong. I would much prefer simple building blocks that you can use individually, with useChat layered on top of them. Right now useChat is complicated because it's the only official way to handle the server responses, which makes it hard to use for anything non-standard. I don't think the solution is to extend useChat even further, but rather to expose the building blocks so people can handle their use cases on their own.
@lgrammel, thanks for mentioning ui-utils. I was able to make use of parseDataStreamPart, and I hope y'all consider giving first-class attention to exposing those client-side parsing utilities in v5. I am building a CLI to test my Next.js endpoint with bulk examples and see plenty of CLI / web split UX potential here, especially for agentic consumption of endpoints. Tag me if y'all have any interest in how I got the CLI working against my endpoint.
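For anyone curious, a rough sketch of that kind of CLI consumer, assuming a Node 18+ runtime, a placeholder endpoint URL and message shape, and the internal parseDataStreamPart export mentioned above:

```ts
import { parseDataStreamPart } from '@ai-sdk/ui-utils';

// Minimal CLI consumer for a Next.js chat route (URL and body shape are placeholders).
async function run(prompt: string) {
  const res = await fetch('http://localhost:3000/api/chat', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ messages: [{ role: 'user', content: prompt }] }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // The data stream protocol is newline-delimited: `<code>:<JSON>\n`.
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? ''; // keep any partial line for the next chunk

    for (const line of lines) {
      if (!line) continue;
      const part = parseDataStreamPart(line);
      if (part.type === 'text') process.stdout.write(part.value);
      else console.log(`\n[${part.type}]`, JSON.stringify(part.value));
    }
  }
  process.stdout.write('\n');
}

run(process.argv.slice(2).join(' '));
```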