deep-chat
How to handle blob responses from a streaming API
I've been working on a project where I need to handle a blob response from a streaming API, and I followed your previous suggestion to use the deep-chat-dev (9.0.254) package. Here's a snippet of how I've integrated it:
<deep-chat
  #elementRef
  [introMessage]="introMessage"
  [responseInterceptor]="responseInterceptor"
  [textInput]="textInput"
  [messageStyles]="messageStyles"
  [avatars]="avatars"
  [connect]="chatRequest"
  [submitButtonStyles]="submitButtonStyles"
  [errorMessages]="errorMessages"
  style="border-radius: 10px; width: 100%; height: 70vh; max-height: 100%; padding-top: 10px; font-family: 'Plus Jakarta Sans'; font-size: 0.9rem; background: #faf8f8; box-shadow: inset 0 0 10px rgba(0, 0, 0, 0.1);">
this.chatRequest = {
  url: this.appService.basePath + 'api/providers/cloudlyte/v1/chat/completions',
  method: 'POST',
  stream: true
};
this.responseInterceptor = (response: any) => {
  // Optional chaining on the array index as well, so a malformed chunk doesn't throw
  return { text: response?.choices?.[0]?.delta?.content };
};
The issue I'm facing is that deep-chat waits until the entire streaming response is received before rendering the message, rather than rendering it progressively like ChatGPT does.
I noticed that using stream: { simulation: 30 } with regular (non-streamed) responses simulates real-time output effectively. However, combining the two as stream: { readable: true, simulation: 30 } for the blob response throws an error.
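For context, here is a minimal sketch of how I would expect the individual chunks to be parsed, assuming the API emits OpenAI-style SSE lines (`data: {...}` with a `choices[0].delta.content` field, matching my interceptor above); the function name and the exact payload shape are my assumptions, not deep-chat API:

```typescript
// Hypothetical helper: extract the delta text strings from one SSE-style chunk.
// Assumes OpenAI-style "data: {...}" lines terminated by "data: [DONE]".
function extractDeltas(chunk: string): string[] {
  const deltas: string[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice("data:".length).trim();
    if (payload === "[DONE]") continue;
    try {
      const parsed = JSON.parse(payload);
      const content = parsed?.choices?.[0]?.delta?.content;
      if (typeof content === "string") deltas.push(content);
    } catch {
      // A JSON object split across chunk boundaries would need buffering;
      // that case is skipped in this sketch.
    }
  }
  return deltas;
}
```

This is only how I would parse the chunks by hand; my actual question is how deep-chat itself should be configured to do this progressively.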
The chunk response looks like this:
Could you please advise on the correct way to handle a real-time streaming blob response and render it progressively in deep-chat? I’d like the messages to appear incrementally as the data arrives, instead of waiting for the full response.
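In case it helps frame an answer: one approach I considered is deep-chat's custom `handler` in the connect config, reading the response body as a ReadableStream and forwarding each decoded chunk. The sketch below is untested and the signal names (`onOpen`/`onResponse`/`onClose`) are my assumption from the deep-chat docs, so please correct it if the API differs:

```typescript
// Hypothetical connect config using a custom handler (Angular component context).
// Signal names are assumptions; verify against the deep-chat docs for your version.
this.chatRequest = {
  stream: true,
  handler: async (body: any, signals: any) => {
    try {
      const response = await fetch(
        this.appService.basePath + 'api/providers/cloudlyte/v1/chat/completions',
        {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify(body),
        }
      );
      signals.onOpen();
      const reader = response.body!.getReader();
      const decoder = new TextDecoder();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        // Forward each decoded chunk to deep-chat as it arrives.
        signals.onResponse({ text: decoder.decode(value, { stream: true }) });
      }
      signals.onClose();
    } catch {
      signals.onResponse({ error: 'Stream failed' });
    }
  },
};
```

Is something like this the recommended route, or is there a supported `stream` option for blob responses that I'm missing?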