
How to handle streaming when using an agent?

Open mengjian-github opened this issue 1 year ago • 5 comments

When using an agent, handleLLMNewToken receives JSON. How can I receive the real output from the stream?

mengjian-github avatar Apr 10 '23 05:04 mengjian-github

Hi @mengjian-github, can you tell me more about your use case? What do you want to take from the original output of the OpenAI stream?

nfcampos avatar Apr 10 '23 10:04 nfcampos

@nfcampos I want to create a chatbot and would like to leverage the abilities of tools, but I have found that I cannot use the streaming API. Waiting for the full response takes too long, and streaming the output the way real-time chat tools do would provide a much better user experience.

mengjian-github avatar Apr 10 '23 11:04 mengjian-github

I want to get the real text after "Final Answer: ", but there is no streaming callback that handles this right now.
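
For context, with the chat-style ReAct agents the model streams its entire response as text, so the token callback sees something like the following before any final answer appears (illustrative, not verbatim output):

Thought: The user is greeting me, so no tool is needed.
Action:
{
  "action": "Final Answer",
  "action_input": "Hello! How can I help you today?"
}

This is the "JSON" the callback receives; the useful text is buried inside action_input (or after "Final Answer: " for the plain-text ReAct agents).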

mengjian-github avatar Apr 10 '23 11:04 mengjian-github

cc @agola11

nfcampos avatar Apr 10 '23 11:04 nfcampos

Thanks for bringing this up @mengjian-github -- this is something we are actively looking to improve. In the meantime, this isn't ideal, but you might be able to get away with saving the previously seen tokens as state and only starting to stream to the client once that accumulated string ends with "Final Answer: ".
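
Concretely, that buffering trick might look something like this (a minimal sketch, not a langchain API; the marker string, the onToken name, and the emit callback are all assumptions):

let buffer = "";
let streaming = false;
const MARKER = "Final Answer: ";

function onToken(token: string, emit: (t: string) => void) {
  if (streaming) {
    emit(token); // past the marker: forward tokens to the client
    return;
  }
  // Accumulating the whole string means the marker is still found
  // when it arrives split across several tokens.
  buffer += token;
  if (buffer.endsWith(MARKER)) {
    streaming = true;
  }
}

One caveat: if a single token contains the end of the marker plus the first characters of the answer, endsWith misses it; checking buffer.includes(MARKER) and slicing off everything after the marker is more robust.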

Will keep this issue open as we investigate a better solution

agola11 avatar Apr 10 '23 17:04 agola11

I also faced this issue and opened #1048 some time ago, so I had to disable streaming for my Telegram bot.

waptik avatar May 10 '23 19:05 waptik

Anyone managed to stream the final output of an Agent?

DanielhCarranza avatar Jul 07 '23 06:07 DanielhCarranza

Anyone managed to stream the final output of an Agent?

I was able to:

let tempTokens = "";

...

{
  handleLLMNewToken(token) {
    // Once the buffer contains the marker, every later token
    // is part of the final answer.
    if (tempTokens.includes("Final Answer:")) {
      console.log(token); // write to stream
    } else {
      // Still inside the agent's reasoning: keep buffering.
      tempTokens += token;
    }
  }
}

Then, when the agent run ends, set tempTokens back to an empty string, or the logic will break on future calls.

executor.call().then(() => {
  tempTokens = ""; // reset the buffer for the next run
});
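
For completeness, here is roughly how that handler can be wired into an agent (a sketch only; the callbacks API has shifted between langchain versions, and tools and the input string are placeholders you'd define yourself):

import { ChatOpenAI } from "langchain/chat_models/openai";
import { initializeAgentExecutorWithOptions } from "langchain/agents";

let tempTokens = "";

const model = new ChatOpenAI({
  streaming: true,
  callbacks: [
    {
      handleLLMNewToken(token: string) {
        if (tempTokens.includes("Final Answer:")) {
          console.log(token); // write to your stream here instead
        } else {
          tempTokens += token;
        }
      },
    },
  ],
});

// assumes an async context and a `tools` array defined elsewhere
const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "zero-shot-react-description",
});

await executor.call({ input: "..." });
tempTokens = ""; // reset the buffer for the next run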

Tolu-Mals avatar Jul 20 '23 18:07 Tolu-Mals

Anyone managed to stream the final output of an Agent?

I was able to:

const tempTokens = "";

...

{
  handleLLMNewToken(token) {
    if (tempTokens.includes("Final Answer:")) {
      console.log(token); // write to stream
    } else {
      tempTokens += token;
    }
  }
}

Then, when the agent run ends, set tempTokens back to an empty string, or the logic will break on future calls.

executor.call().then(() => {
  tempTokens = "";
});

I thought you can't mutate a const value, so how did it work for you? Also, where did you define tempTokens?

waptik avatar Jul 20 '23 21:07 waptik


I thought you can't mutate a const value, so how did it work for you? Also, where did you define tempTokens?

My bad, that should have been a let. It's a Next.js API route in the app directory; I placed it above the handler function.
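
For anyone wondering about that placement, it looks roughly like this (a sketch of a Next.js app-directory route handler; the route path and handler body are illustrative):

// app/api/chat/route.ts

// Module scope: the buffer survives across requests
// handled by this server instance.
let tempTokens = "";

export async function POST(req: Request) {
  // ...build the executor with the handleLLMNewToken handler above...
  // await executor.call({ input });
  tempTokens = ""; // reset after each run
  return new Response("done");
}

Worth noting: module-scope mutable state is shared between concurrent requests, so two chats streaming at once would trample each other's buffer -- fine for a demo, but per-request state is safer in production.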

Tolu-Mals avatar Jul 20 '23 21:07 Tolu-Mals

Alright! I'll give it a try and see how it fares.

waptik avatar Jul 21 '23 06:07 waptik

Has there been any progress? 🥹 @agola11

Penggeor avatar Aug 14 '23 09:08 Penggeor

Any updates?

Robert-ZLF avatar Sep 18 '23 08:09 Robert-ZLF

Also waiting for updates here, would be nice to know if something can be done.

MichelFR avatar Sep 23 '23 13:09 MichelFR

Hi, @mengjian-github

I'm helping the langchainjs team manage their backlog and am marking this issue as stale. From what I understand, you are seeking guidance on receiving the actual output from a stream, which is currently being received as JSON in the handleLLMNewToken function. There have been suggestions and workarounds from other users, as well as inquiries about progress and updates, indicating ongoing interest in resolving the issue.

Could you please confirm if this issue is still relevant to the latest version of the langchainjs repository? If it is, please let the langchainjs team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!

dosubot[bot] avatar Dec 23 '23 16:12 dosubot[bot]