langchainjs
How to handle the stream when using an agent?
When using an agent, handleLLMNewToken receives JSON. How can I receive the real text output from the stream?
Hi @mengjian-github, can you tell me more about your use case? What do you want to take from the original output of the OpenAI stream?
@nfcampos I want to create a chatbot and would like to leverage the abilities of tools, but I have found that I cannot use the streaming API. Waiting for the full response takes too long; rendering the stream in real time, the way chat tools do, would provide a better user experience.
I want to get the real text after "Final Answer: ", but there is currently no streaming callback that exposes it.
cc @agola11
Thanks for bringing this up @mengjian-github -- this is something we are actively looking to improve. In the meantime, it isn't ideal, but you might be able to get away with saving the previously seen tokens as state and only starting to stream to the client once that accumulated string ends with "Final Answer: ".
Will keep this issue open as we investigate a better solution
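That buffering idea can be sketched without any LangChain APIs as a small filter function. Everything here is illustrative (the helper name and callback shape are assumptions, not part of the library); it just shows how to hold tokens back until the marker has been seen, including the case where the marker is split across tokens:

```javascript
// Sketch of the suggested workaround: buffer incoming tokens and only start
// forwarding them once the buffered text contains the marker. The function
// name and callback shape are illustrative, not a LangChain API.
function makeFinalAnswerFilter(onToken, marker = "Final Answer:") {
  let buffer = "";
  let streaming = false;
  return (token) => {
    if (streaming) {
      onToken(token); // past the marker: forward tokens straight to the client
      return;
    }
    buffer += token;
    if (buffer.includes(marker)) {
      streaming = true;
      // emit whatever already followed the marker inside the buffered text,
      // since the marker may have arrived split across several tokens
      const rest = buffer.slice(buffer.indexOf(marker) + marker.length);
      if (rest) onToken(rest);
    }
  };
}
```

You would then call the returned function from handleLLMNewToken for each token.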
I also faced this issue and opened #1048 some time ago, so I had to disable streaming for my Telegram bot.
Anyone managed to stream the final output of an Agent ?
I was able to:

let tempTokens = "";
...
{
  handleLLMNewToken(token) {
    if (tempTokens.includes("Final Answer:")) {
      console.log(token); // write to stream
    } else {
      tempTokens += token;
    }
  }
}
Then when the agent ends, set tempTokens back to an empty string, or the logic will break for future calls:

executor.call().then(() => {
  tempTokens = "";
});
Anyone managed to stream the final output of an Agent ?

I was able to:

const tempTokens = "";
...
{
  handleLLMNewToken(token) {
    if (tempTokens.includes("Final Answer:")) {
      console.log(token); // write to stream
    } else {
      tempTokens += token;
    }
  }
}

Then when the agent ends, set tempTokens back to an empty string, or the logic will break for future calls:

executor.call().then(() => { tempTokens = ""; })
I thought you can't mutate a const value, so how did it work for you? Also, where did you define tempTokens?
My bad, that should have been a let. It's a Next.js API route in the app directory; I placed it above the handler function.
Alright! I'll give it a try and see how it fares.
Has there been any progress? 🥹 @agola11
Any updates?
Also waiting for updates here; it would be nice to know if something can be done.
Hi, @mengjian-github
I'm helping the langchainjs team manage their backlog and am marking this issue as stale. From what I understand, you are seeking guidance on receiving the actual output from a stream, which is currently being received as JSON in the handleLLMNewToken function. There have been suggestions and workarounds from other users, as well as inquiries about progress and updates, indicating ongoing interest in resolving the issue.
Could you please confirm if this issue is still relevant to the latest version of the langchainjs repository? If it is, please let the langchainjs team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!