
Can the stream interface exposed by langserve provide an interface for gracefully aborting requests?

Open oldwinter opened this issue 1 year ago • 1 comment

As the title says: our client calls the langserve server-side code that I wrote. They need to frequently interrupt an in-flight stream request, and after an interruption they want to be able to:

  1. Accurately calculate the number of OpenAI tokens consumed.
  2. Correctly preserve the text that was already generated before the interruption.

For question 1, I guess I may need to use on_chain_error or on_llm_error, but no matter how I set things up I cannot trigger either of these callbacks. For question 2, the chain I am currently using is RunnableWithMessageHistory. I would like it to preserve the text generated before the interruption, but I don't know how to do that. Is there another way to preserve the conversation history?
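For context, here is a minimal client-side sketch (not a langserve API) of what I am trying to do: consume the chain's /stream endpoint over SSE, abort partway through, keep the partial text, and estimate the completion tokens with tiktoken. The endpoint path /my-chain/stream, the input schema, the abort condition, and the chunk parsing are all assumptions that depend on the actual chain.

```python
# Illustrative client-side sketch only; the endpoint path, input schema, and
# abort condition below are assumptions, not part of langserve itself.
import json

import httpx
import tiktoken

MAX_CHARS = 200  # stand-in for a real user-driven abort signal
enc = tiktoken.encoding_for_model("gpt-3.5-turbo")
partial_text = ""

with httpx.stream(
    "POST",
    "http://localhost:8000/my-chain/stream",
    json={"input": {"question": "Tell me a long story"}},
    timeout=None,
) as response:
    for line in response.iter_lines():
        # langserve streams server-sent events; the payload lives on "data:" lines.
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if not payload:
            continue
        chunk = json.loads(payload)
        # How to extract text depends on the chain's output type; handle the
        # two common shapes (plain string, or a serialized message chunk).
        if isinstance(chunk, str):
            partial_text += chunk
        elif isinstance(chunk, dict) and isinstance(chunk.get("content"), str):
            partial_text += chunk["content"]
        if len(partial_text) > MAX_CHARS:
            break  # leaving the context manager drops the connection (the "abort")

# The partial text is preserved locally, and the completion tokens consumed so
# far can be approximated from it (this ignores prompt tokens and anything the
# server generated after the abort).
print(partial_text)
print("approx. completion tokens:", len(enc.encode(partial_text)))
```

This keeps the aborted text on the client, but it does not write anything back into the server-side message history, which is what I am stuck on.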

oldwinter avatar Jan 15 '24 10:01 oldwinter

Hi @oldwinter, that's not possible at the moment, since the endpoint only supports one-way streaming communication (from server to client). We might add this functionality at some point, but for now you'll probably need to write something custom to support that.
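For illustration, a hedged sketch of what "something custom" could look like, assuming a FastAPI app served alongside langserve's add_routes: add your own streaming endpoint whose generator persists whatever was produced once it finishes or is closed, whether the run completed or the client aborted. The /custom-stream path, the save_partial helper, the request schema, and the ChatOpenAI stand-in for the RunnableWithMessageHistory chain are all assumptions, not langserve APIs.

```python
# Hedged sketch only: none of this is langserve API.
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
from langchain_openai import ChatOpenAI

app = FastAPI()
chain = ChatOpenAI()  # stand-in for the real RunnableWithMessageHistory chain


def save_partial(session_id: str, text: str) -> None:
    # Hypothetical helper: write the (possibly partial) answer into your
    # message history store and/or record its token usage.
    print(f"[{session_id}] saving {len(text)} chars of output")


@app.post("/custom-stream")
async def custom_stream(request: Request) -> StreamingResponse:
    body = await request.json()
    session_id = body.get("session_id", "default")

    async def event_stream():
        partial = ""
        try:
            async for chunk in chain.astream(body["input"]):
                text = getattr(chunk, "content", str(chunk))
                partial += text
                yield text
        finally:
            # Runs on normal completion and when the client aborts mid-stream
            # (the generator is cancelled/closed); either way, the text
            # produced so far is preserved.
            save_partial(session_id, partial)

    return StreamingResponse(event_stream(), media_type="text/plain")
```

The client can still accumulate the partial text on its own side as well; the server-side hook mainly keeps the stored conversation history consistent with what was actually generated.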

eyurtsev avatar Jan 16 '24 16:01 eyurtsev