Feature/streaming callbacks
Feature Request: Add Streaming Callbacks Utility
Currently, developers have to manage the response stream manually, with custom code like:
const response = await model.generateContentStream([prompt]);
let textBuf = '';
while (true) {
  // pull chunks off the async iterator by hand
  const { done, value } = await response.stream.next();
  if (done) {
    await onEnd(textBuf);
    return;
  }
  if (value) {
    const currentText = value.text();
    onData(currentText);
    textBuf += currentText;
  }
}
The proposed utility would let developers:
- Process each chunk of text as it arrives (via the `onData` callback)
- Perform actions when streaming completes (via the `onEnd` callback)
Solution
The implementation includes two main utility functions:
- `getStreamedResponse()` - regular content generation
- `getStreamedChatResponse()` - chat-specific streaming
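A minimal sketch of what `getStreamedResponse()` could look like, assuming it wraps the SDK's existing `generateContentStream()` API and uses the parameter names from the examples below (returning the accumulated text is an additional assumption):

```js
// Sketch of the proposed helper, not an existing SDK export.
async function getStreamedResponse({ prompt, model, onData, onEnd }) {
  // generateContentStream() accepts a string or an array of parts
  const result = await model.generateContentStream([prompt]);

  let fullText = "";
  // result.stream is an async iterable of response chunks
  for await (const chunk of result.stream) {
    const chunkText = chunk.text();
    fullText += chunkText;
    onData(chunkText); // hand each chunk to the caller as it arrives
  }

  await onEnd(fullText); // called once with the accumulated text
  return fullText; // assumption: returning the full text is convenient for callers
}
```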
Example Usage
await getStreamedResponse({
  prompt: "Your prompt here",
  model: model,
  onData: (chunkText) => {
    console.log(chunkText);
  },
  onEnd: async (fullText) => {
    await saveToDatabase(fullText);
  },
});
await getStreamedChatResponse({
  message: [{ text: "Your message here" }],
  chatSession: chat,
  onData: (chunkText) => {
    // handle each chunk as it arrives
  },
  onEnd: async (fullText) => {
    // act on the complete response
  },
});
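The chat variant could follow the same pattern, assuming it delegates to the SDK's `ChatSession.sendMessageStream()` so the existing chat session keeps managing conversation history (again, the return value is an assumption):

```js
// Sketch of the proposed chat helper, not an existing SDK export.
async function getStreamedChatResponse({ message, chatSession, onData, onEnd }) {
  // sendMessageStream() accepts a string or an array of parts
  const result = await chatSession.sendMessageStream(message);

  let fullText = "";
  for await (const chunk of result.stream) {
    const chunkText = chunk.text();
    fullText += chunkText;
    onData(chunkText);
  }

  await onEnd(fullText);
  return fullText; // assumption, mirroring getStreamedResponse()
}
```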