Lucas
The first minor change to fix this is definitely updating the docs/demo to use request callback handlers on the `call`, as described in https://js.langchain.com/docs/production/callbacks/#when-do-you-want-to-use-each-of-these:

```
When do you want to...
```
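Roughly, a request-scoped version of the demo route could look like this (a sketch: the route shape, message construction and use of `CallbackManager.fromHandlers` are placeholders on my part, not the current demo code):

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { CallbackManager } from "langchain/callbacks";
import { HumanChatMessage } from "langchain/schema";
import { LangChainStream, StreamingTextResponse } from "ai";

export async function POST(req: Request) {
  const { prompt } = await req.json();
  const { stream, handlers } = LangChainStream();

  // Streaming stays on the constructor; the handlers are scoped to this
  // request by passing them to `call` instead of the constructor.
  const llm = new ChatOpenAI({ streaming: true });

  llm
    .call(
      [new HumanChatMessage(prompt)],
      undefined,
      CallbackManager.fromHandlers(handlers)
    )
    .catch(console.error);

  return new StreamingTextResponse(stream);
}
```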
https://github.com/hwchase17/langchainjs/blob/main/langchain/src/callbacks/handlers/tracer.ts looks useful: if `runMap` is empty, close the writer.
Definitely not ready for primetime, but a good demonstration:

```typescript
import { BaseTracer } from "langchain/callbacks";
import { type Run } from "langchain/dist/callbacks";

export class StreamingCallback extends BaseTracer {
  name...
```
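Fleshed out, the idea looks roughly like this. Rather than literally checking `runMap`, this sketch closes the writer in `persistRun`, which (going by the tracer source) is only invoked for the top-level run, i.e. once nothing else is in flight; treat that and the hook signatures as assumptions on my part:

```typescript
import { BaseTracer } from "langchain/callbacks";
import { type Run } from "langchain/dist/callbacks";

// Sketch: forward streamed tokens to a writer and close it only when the
// top-level run finishes, instead of on the first handleLLMEnd.
export class StreamingCallback extends BaseTracer {
  name = "streaming_callback";

  constructor(private writer: WritableStreamDefaultWriter<string>) {
    super();
  }

  // BaseTracer only calls persistRun for the root run (no parent), so this
  // is effectively the "runMap is empty" moment: close the writer here.
  protected async persistRun(_run: Run): Promise<void> {
    await this.writer.ready;
    await this.writer.close();
  }

  // Forward every streamed token into the response stream.
  async handleLLMNewToken(token: string): Promise<void> {
    await this.writer.ready;
    await this.writer.write(token);
  }
}
```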
Well, working backwards from that example, I think I found the minimum needed to get handlers working for all cases. I created this class to replicate the library functionality:

```javascript
...
```
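The gist, sketched here as a plain handlers factory rather than the exact class (the `createRunTrackingHandlers` name is mine): count every LLM/chain/tool run that starts, stream tokens as they arrive, and only close the writer once the last run has ended.

```typescript
// Track in-flight runs across LLM, chain and tool callbacks; close the writer
// only when the last one finishes (e.g. after the final chain in a
// SequentialChain, not after the first LLM call).
export function createRunTrackingHandlers(
  writer: WritableStreamDefaultWriter<string>
) {
  const runs = new Set<string>();

  const start = (runId: string) => {
    runs.add(runId);
  };

  const end = async (runId: string) => {
    runs.delete(runId);
    if (runs.size === 0) {
      await writer.ready;
      await writer.close();
    }
  };

  return {
    handleLLMNewToken: async (token: string) => {
      await writer.ready;
      await writer.write(token);
    },
    handleLLMStart: async (_llm: unknown, _prompts: string[], runId: string) =>
      start(runId),
    handleLLMEnd: async (_output: unknown, runId: string) => end(runId),
    handleLLMError: async (e: Error, _runId: string) => {
      await writer.ready;
      await writer.abort(e);
    },
    handleChainStart: async (_chain: unknown, _inputs: unknown, runId: string) =>
      start(runId),
    handleChainEnd: async (_outputs: unknown, runId: string) => end(runId),
    handleToolStart: async (_tool: unknown, _input: string, runId: string) =>
      start(runId),
    handleToolEnd: async (_output: string, runId: string) => end(runId),
  };
}
```

Chain and tool error handlers could decrement the same way; `handleLLMError` aborts the writer so the response fails instead of hanging.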
Tested with a sequential chain (example: https://github.com/vercel-labs/ai/issues/63, https://js.langchain.com/docs/modules/chains/sequential_chain):

```javascript
const { stream, handlers } = LangChainStream();

// This is an LLMChain to write a synopsis given a title of a play.
...
```
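The chain setup follows the linked docs example; reconstructed roughly (prompts abbreviated, and the `call(values, callbacks)` signature assumed), it looks like this:

```typescript
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain, SequentialChain } from "langchain/chains";
import { CallbackManager } from "langchain/callbacks";
import { LangChainStream } from "ai";

const { stream, handlers } = LangChainStream();

const llm = new OpenAI({ streaming: true, temperature: 0 });

// Chain 1: write a synopsis from a play title.
const synopsisChain = new LLMChain({
  llm,
  prompt: PromptTemplate.fromTemplate(
    "Write a synopsis for a play titled {title}."
  ),
  outputKey: "synopsis",
});

// Chain 2: review the synopsis produced by chain 1.
const reviewChain = new LLMChain({
  llm,
  prompt: PromptTemplate.fromTemplate(
    "Write a review of the following synopsis:\n{synopsis}"
  ),
  outputKey: "review",
});

const overallChain = new SequentialChain({
  chains: [synopsisChain, reviewChain],
  inputVariables: ["title"],
  outputVariables: ["synopsis", "review"],
});

// Request-scoped callbacks: tokens from both chains end up in the same stream,
// and the stream should only close once the overall chain has finished.
overallChain
  .call(
    { title: "Tragedy at sunset on the beach" },
    CallbackManager.fromHandlers(handlers)
  )
  .catch(console.error);

// `stream` is then returned from the route handler via StreamingTextResponse.
```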
It was a fun ride understanding the internals of LangChain :)
Good to close :)
Yeah, I figured that was the reasoning, thanks for confirming. This change will probably break the `SequentialChain` use case. It seems like your suggestion might be the best one. Or langchainjs...
Last one here before I create an issue: https://js.langchain.com/docs/production/callbacks/#multiple-handlers

Before:

```javascript
const { stream, handlers } = LangChainStream()

const llm = new ChatOpenAI({
  streaming: true,
  callbackManager: CallbackManager.fromHandlers(handlers)
})

llm
  .call(...
```
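For comparison, the multiple-handlers pattern from that page would look something like this (a sketch; `ConsoleCallbackHandler` is just a stand-in for a second handler):

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { CallbackManager, ConsoleCallbackHandler } from "langchain/callbacks";
import { HumanChatMessage } from "langchain/schema";
import { LangChainStream } from "ai";

const { stream, handlers } = LangChainStream();

// Several handlers on one manager: the streaming handlers plus console tracing.
const callbackManager = CallbackManager.fromHandlers(handlers);
callbackManager.addHandler(new ConsoleCallbackHandler());

// Nothing request-specific is attached to the model instance itself.
const llm = new ChatOpenAI({ streaming: true });

llm
  .call([new HumanChatMessage("Tell me a joke")], undefined, callbackManager)
  .catch(console.error);

// `stream` is returned with StreamingTextResponse(stream) as before.
```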
Created https://github.com/vercel-labs/ai/issues/205 for further discussion. This PR will probably break the linked use case, so it can be closed or left open until we find something.