
fix: langchain stream not closing

Open aranlucas opened this issue 1 year ago • 9 comments

fixes: https://github.com/vercel-labs/ai/issues/97

The problem

  1. Streaming handlers should be used in Request callbacks - https://js.langchain.com/docs/production/callbacks/#when-do-you-want-to-use-each-of-these
  2. You can have multiple chains, so you need to know when all of them have finished before closing the stream. Currently, the stream is closed after the first chain ends.

Solution

  1. Update the docs to reflect request callbacks.
  2. Add internal state that keeps track of all runs in flight. Once no runs are left, the stream is safe to close (see the sketch below).

aranlucas avatar Jun 22 '23 17:06 aranlucas
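
For illustration, here is a minimal TypeScript sketch of the run-tracking idea from point 2 of the solution above. It is not the exact implementation in this PR: the helper name and writer wiring are made up, while the handle* method names follow the langchainjs callback handlers.

  const createRunTrackedHandlers = (
    writer: WritableStreamDefaultWriter<Uint8Array>
  ) => {
    const encoder = new TextEncoder()
    // Number of LLM/chain/tool runs currently in flight
    let runs = 0

    const handleStart = () => {
      runs += 1
    }

    const handleEnd = async () => {
      runs -= 1
      // Only close the stream once nothing is running any more
      if (runs === 0) {
        await writer.ready
        await writer.close()
      }
    }

    return {
      handleLLMNewToken: async (token: string) => {
        await writer.ready
        await writer.write(encoder.encode(token))
      },
      handleLLMStart: handleStart,
      handleLLMEnd: handleEnd,
      handleLLMError: handleEnd,
      handleChainStart: handleStart,
      handleChainEnd: handleEnd,
      handleChainError: handleEnd,
      handleToolStart: handleStart,
      handleToolEnd: handleEnd,
      handleToolError: handleEnd
    }
  }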

🦋 Changeset detected

Latest commit: 0a26a5a9a23eb74559a2fb4ed2149a862ef3f330

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package
Name: ai
Type: Patch

changeset-bot[bot] avatar Jun 22 '23 17:06 changeset-bot[bot]

Actually, I just remembered why @jaredpalmer made the change. Here's the initial issue: https://github.com/vercel-labs/ai/issues/63

It seems like we may want to let the consumer choose which callback to end on, since it's model-dependent.

MaxLeiter avatar Jun 22 '23 18:06 MaxLeiter

Yeah I figured that was the reasoning, thanks for confirming.

This change will probably break the SequentialChain use case.

It seems like your suggestion might be the best one. Or langchainjs could expose something like handleEnd that doesn't depend on a model.

aranlucas avatar Jun 22 '23 19:06 aranlucas

Last one here before I create an issue: https://js.langchain.com/docs/production/callbacks/#multiple-handlers

Before

  import { ChatOpenAI } from 'langchain/chat_models/openai'
  import { CallbackManager } from 'langchain/callbacks'
  import { HumanChatMessage, AIChatMessage } from 'langchain/schema'
  import { LangChainStream, StreamingTextResponse, Message } from 'ai'

  export async function POST(req: Request) {
    const { messages } = await req.json()

    const { stream, handlers } = LangChainStream()

    // Constructor callbacks: the handlers are attached to the model itself
    const llm = new ChatOpenAI({
      streaming: true,
      callbackManager: CallbackManager.fromHandlers(handlers)
    })

    llm
      .call(
        (messages as Message[]).map(m =>
          m.role === 'user'
            ? new HumanChatMessage(m.content)
            : new AIChatMessage(m.content)
        )
      )
      .catch(console.error)

    return new StreamingTextResponse(stream)
  }

After

  import { ChatOpenAI } from 'langchain/chat_models/openai'
  import { HumanChatMessage, AIChatMessage } from 'langchain/schema'
  import { LangChainStream, StreamingTextResponse, Message } from 'ai'

  export async function POST(req: Request) {
    const { messages } = await req.json()

    // Different handlers for LLMs, agents, chains
    const { stream, llmHandlers, chainHandlers } = LangChainStream()

    // Do not pass handlers here (no constructor callbacks)
    const llm = new ChatOpenAI({
      streaming: true
    })

    // Pass handlers here instead, as request callbacks on the call itself
    llm
      .call(
        (messages as Message[]).map(m =>
          m.role === 'user'
            ? new HumanChatMessage(m.content)
            : new AIChatMessage(m.content)
        ),
        [llmHandlers]
      )
      .catch(console.error)

    return new StreamingTextResponse(stream)
  }

aranlucas avatar Jun 22 '23 19:06 aranlucas
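
To make the proposal concrete, here is a hypothetical fragment that would live inside the same route handler as the snippet above: a chain would receive chainHandlers as its request callbacks, separate from the LLM call. The LLMChain wiring, prompt, and input are made up for illustration.

  import { LLMChain } from 'langchain/chains'
  import { PromptTemplate } from 'langchain/prompts'

  const chain = new LLMChain({
    llm, // the streaming ChatOpenAI instance from the snippet above
    prompt: PromptTemplate.fromTemplate('Summarize the following text: {input}')
  })

  // Request callbacks for the chain, kept separate from the LLM handlers
  chain
    .call({ input: 'some text to summarize' }, [chainHandlers])
    .catch(console.error)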

Created https://github.com/vercel-labs/ai/issues/205 for further discussions

This PR will probably break the linked use case. It can be closed, or left open until we find something better.

aranlucas avatar Jun 22 '23 19:06 aranlucas

Big fan of const { stream, llmHandlers, chainHandlers } = LangChainStream(). Let's do that.

jaredpalmer avatar Jun 22 '23 23:06 jaredpalmer

@jaredpalmer after doing some research, that actually won't work: https://github.com/vercel-labs/ai/issues/205

As an exaggerated example, a chain can spawn a chain that spawns a chain, so on the first chainEnd the stream would close. You need to be able to count how many runs are in flight; otherwise only the first chain to finish will write to the stream, locking out the other chains' results.

I'm leaning towards https://github.com/vercel-labs/ai/issues/205#issuecomment-1603411125 as the solution now. It should cover all use cases and keep the API the same. (The docs still need to be updated to use "Request callbacks" rather than "Constructor callbacks".)

aranlucas avatar Jun 23 '23 00:06 aranlucas
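
As a hypothetical trace of that exaggerated nested-chain case, using the run counter sketched earlier, the stream only closes once the count falls back to zero:

  // handleChainStart (outer chain)   -> runs = 1
  // handleChainStart (nested chain)  -> runs = 2
  // handleLLMStart                   -> runs = 3
  // handleLLMEnd                     -> runs = 2  (stream stays open)
  // handleChainEnd (nested chain)    -> runs = 1  (stream stays open)
  // handleChainEnd (outer chain)     -> runs = 0  (safe to close the stream)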

Updated with my latest findings 👀

It may be oversimplified compared to what they have done with tracing, but I think it covers all the use cases.

aranlucas avatar Jun 23 '23 00:06 aranlucas

Take https://github.com/vercel-labs/ai/issues/205#issuecomment-1603437504 as an example. Let's say we wanted to stream both responses.

In the first chain:

Tragedy at Sunset on the Beach is a story of love, loss, and redemption. It follows the story of two young lovers, Jack and Jill, who meet on a beach at sunset. They quickly fall in love and plan to spend the rest of their lives together.

However, tragedy strikes when Jack is killed in a car accident. Jill is left devastated and unable to cope with the loss of her beloved. She spirals into a deep depression and begins to lose hope.

Just when all seems lost, Jill discovers that Jack had left her a letter before his death. In the letter, he tells her that he will always love her and that she should never give up hope. With this newfound strength, Jill is able to find the courage to move on with her life and find happiness again.

Tragedy at Sunset on the Beach is a story of love, loss, and redemption. It is a story of hope and courage in the face of tragedy. It is a story of finding strength in the darkest of times and of never giving up.

handleChainEnd gets called and the stream is closed, so the second chain can't write the second response:

Tragedy at Sunset on the Beach is a powerful and moving story of love, loss, and redemption. The play follows the story of two young lovers, Jack and Jill, whose plans for a future together are tragically cut short when Jack is killed in a car accident.

The play is beautifully written and the performances are outstanding. The actors bring a depth of emotion to their characters that is both heartbreaking and inspiring. The story is full of unexpected twists and turns, and the audience is taken on an emotional rollercoaster as Jill struggles to come to terms with her loss.

The play is ultimately a story of hope and resilience, and it is sure to leave audiences with a newfound appreciation for life and love. Tragedy at Sunset on the Beach is a must-see for anyone looking for an emotionally charged and thought-provoking experience.

There are probably many ways to fix this, but the approach I suggested in this PR seems to work for the use cases I've tried.

aranlucas avatar Jun 23 '23 02:06 aranlucas
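
For reference, here is a hypothetical sketch of streaming both chain responses with the fix from this PR: the API stays { stream, handlers }, the handlers are passed as request callbacks on the outer chain, and the stream only closes once every run has ended. The prompts, chain wiring, and import paths are assumptions for illustration, not code from the PR or the linked issue.

  import { ChatOpenAI } from 'langchain/chat_models/openai'
  import { PromptTemplate } from 'langchain/prompts'
  import { LLMChain, SimpleSequentialChain } from 'langchain/chains'
  import { LangChainStream, StreamingTextResponse } from 'ai'

  export async function POST() {
    const { stream, handlers } = LangChainStream()

    const llm = new ChatOpenAI({ streaming: true })

    // First chain writes the synopsis, second chain reviews it
    const synopsisChain = new LLMChain({
      llm,
      prompt: PromptTemplate.fromTemplate(
        'Write a synopsis for a play titled: {title}'
      )
    })
    const reviewChain = new LLMChain({
      llm,
      prompt: PromptTemplate.fromTemplate(
        'Write a review of the following synopsis: {synopsis}'
      )
    })
    const overallChain = new SimpleSequentialChain({
      chains: [synopsisChain, reviewChain]
    })

    // Request callbacks: the same handlers see every chain and LLM run,
    // so the stream is only closed once all of them have ended
    overallChain
      .call({ input: 'Tragedy at Sunset on the Beach' }, [handlers])
      .catch(console.error)

    return new StreamingTextResponse(stream)
  }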

Are we good to merge, @nfcampos @MaxLeiter?

jaredpalmer avatar Jun 29 '23 14:06 jaredpalmer

looks good to me

nfcampos avatar Jun 29 '23 14:06 nfcampos

Thank you 🙏

aranlucas avatar Jun 29 '23 16:06 aranlucas