generative-ai-js

Add support for callbacks on streamed response

Open asyncmacro opened this issue 11 months ago • 3 comments

Description of the feature request:

Add callback support to streamed responses so developers can easily hook into streamed chat generation, for example to save the prompt to a database or make any other server-side call.

What problem are you trying to solve with this feature?

This feature would improve the developer experience (DX). Currently, the user has to manage the response stream manually. For example, I had to implement this function to get a streamed response with callbacks:

export const getStreamedAIResponse = async ({ prompt, onData, onEnd }) => {
  // model is a GenerativeModel instance created elsewhere.
  const response = await model.generateContentStream([prompt]);

  let textBuf = '';

  // response.stream is an async iterable of response chunks.
  for await (const chunk of response.stream) {
    const currentText = chunk.text();
    onData(currentText);
    textBuf += currentText;
  }

  await onEnd(textBuf);
};

This function lets the caller pass onData and onEnd callbacks. The server then calls onData on every chunk received from the LLM API, and onEnd once generation of the response finishes. This provides the user with a chat-like streamed response without waiting for the entire response, and it lets the developer save the response at the end of generation without relying on any client-side callback.
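To make the pattern above concrete, here is a minimal, self-contained sketch of the same callback loop. The SDK stream is replaced by a mock async generator (`mockStream` and `streamWithCallbacks` are hypothetical names, not part of the SDK); each mock chunk mimics the SDK chunk shape of an object with a `text()` method:

```javascript
// Mock async iterable standing in for model.generateContentStream(...).stream.
// Each yielded chunk mimics the SDK shape: an object with a text() method.
async function* mockStream() {
  for (const piece of ['Hello', ', ', 'world']) {
    yield { text: () => piece };
  }
}

// Same callback pattern as above, generalized to any async iterable of chunks:
// onData fires per chunk, onEnd fires once with the accumulated text.
async function streamWithCallbacks(stream, { onData, onEnd }) {
  let textBuf = '';
  for await (const chunk of stream) {
    const currentText = chunk.text();
    onData(currentText);
    textBuf += currentText;
  }
  await onEnd(textBuf);
  return textBuf;
}
```

Swapping `mockStream()` for a real `response.stream` from the SDK would give the behavior described above without changing the loop.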

It would be a good idea to expose a function like this to the developer:

const response = generateStreamedResponse([prompt, { fileData: { fileUri, mimeType } }], {
  onData: (chunk) => {},
  onEnd: (response) => {},
  onError: (error) => {}
});

The returned value would be a web-standard Response object, which could be returned directly from route handlers in frameworks like Next.js and SvelteKit.
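As a rough illustration of how such a helper might work (this is a hypothetical sketch, not the SDK's API), the callbacks can be wired into a web-standard ReadableStream that backs the returned Response. The `generateStreamedResponse` name and signature here are assumptions from the proposal above, and the input is any async iterable of chunks with a `text()` method:

```javascript
// Hypothetical sketch of the proposed generateStreamedResponse helper.
// Wraps an async iterable of SDK-style chunks into a web-standard Response
// whose body streams the text, invoking the callbacks along the way.
// Requires Node 18+ (global Response, ReadableStream, TextEncoder).
function generateStreamedResponse(stream, { onData, onEnd, onError } = {}) {
  let textBuf = '';
  const encoder = new TextEncoder();
  const body = new ReadableStream({
    async start(controller) {
      try {
        for await (const chunk of stream) {
          const text = chunk.text();
          textBuf += text;
          if (onData) onData(text);            // per-chunk callback
          controller.enqueue(encoder.encode(text));
        }
        if (onEnd) await onEnd(textBuf);       // full text, e.g. save to DB
        controller.close();
      } catch (err) {
        if (onError) onError(err);
        controller.error(err);
      }
    },
  });
  return new Response(body, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```

A route handler could then simply `return generateStreamedResponse(response.stream, { onEnd: saveToDatabase })`, with the framework streaming the body to the client while the server-side callbacks run.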

Any other information you'd like to share?

This could also be beneficial in Express.js and Hono. I am ready to implement it and file a PR.

asyncmacro avatar Jan 05 '25 00:01 asyncmacro

Hey, I would love to work on this issue. My approach would be to create a separate utility wrapper rather than modifying the core library files directly. If core library files need to be changed, or a hybrid of a utility wrapper and core modifications is needed, do let me know. Thanks

demoncoder-crypto avatar Mar 08 '25 17:03 demoncoder-crypto

This is now being addressed in PR #401.

cestercian avatar Mar 08 '25 22:03 cestercian

Can you assign me?

ChrissHenilston777 avatar Oct 29 '25 21:10 ChrissHenilston777