Lucas
@jaredpalmer after doing some research, that actually won't work: https://github.com/vercel-labs/ai/issues/205. As an exaggerated example, a chain can spawn a chain that spawns a chain, meaning that on the first chainEnd,...
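To make the failure mode concrete, a minimal sketch (the handler names follow LangChain's JS callback interface; everything else is illustrative) of what goes wrong when the writer is closed on the first handleChainEnd:

```ts
// With nested chains the callbacks arrive in this order:
//   handleChainStart (outer)
//     handleChainStart (inner)
//     handleChainEnd   (inner)  <- close() fires here, too early
//   handleChainEnd   (outer)
// so tokens the outer chain still emits after the inner chain ends
// can no longer be written.
const stream = new TransformStream<string, string>()
const writer = stream.writable.getWriter()

const naiveHandlers = {
  handleLLMNewToken: async (token: string) => {
    await writer.ready
    await writer.write(token)
  },
  handleChainEnd: async () => {
    await writer.ready
    await writer.close() // wrong: this runs when the *first* chain ends, not the last
  }
}
```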
Updated with my latest findings 👀 It may be oversimplified compared to what they have done with tracing, but I think it covers all the use cases.
Take https://github.com/vercel-labs/ai/issues/205#issuecomment-1603437504 as an example. Let's say we wanted to stream both of the responses. In the first chain:

```
Tragedy at Sunset on the Beach is a story of love, loss,...
```
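For concreteness, here is roughly how the two chains from that example could be wired to a single stream in LangChain JS (the import paths, constructor options and prompts are my assumptions; the handlers come from LangChainStream):

```ts
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { LLMChain, SimpleSequentialChain } from 'langchain/chains'
import { PromptTemplate } from 'langchain/prompts'
import { LangChainStream } from 'ai'

const { stream, handlers } = LangChainStream()

// One streaming model shared by both chains, so tokens from the synopsis
// *and* the review are written to the same stream as they are generated.
const llm = new ChatOpenAI({ streaming: true, callbacks: [handlers] })

const synopsisChain = new LLMChain({
  llm,
  prompt: PromptTemplate.fromTemplate(
    'Write a synopsis for a play titled: {title}'
  )
})

const reviewChain = new LLMChain({
  llm,
  prompt: PromptTemplate.fromTemplate(
    'Write a review of the following synopsis:\n{synopsis}'
  )
})

const overall = new SimpleSequentialChain({ chains: [synopsisChain, reviewChain] })

// Not awaited: `stream` can be returned to the client while the chains run.
overall.run('Tragedy at Sunset on the Beach').catch(console.error)
```

With this setup the first chain's output (the synopsis quoted above) and the second chain's output both arrive in the same stream, which is exactly where the "when do we close the writer?" question comes from.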
Thank you 🙏
There are a couple of things that probably need to be in an FAQ or something. Here's the code that worked for me:

```javascript
import { LangChainStream } from "@/lib/LangChainStream";
import ...
```
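The rest of that snippet is cut off; a minimal sketch of the kind of route handler it was, assuming a Next.js app route, the ai package's StreamingTextResponse, and LangChain JS message classes (the message mapping is mine):

```ts
import { StreamingTextResponse, type Message } from 'ai'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { HumanChatMessage, AIChatMessage } from 'langchain/schema'
import { LangChainStream } from '@/lib/LangChainStream'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const { stream, handlers } = LangChainStream()

  const llm = new ChatOpenAI({ streaming: true })

  // Not awaited: the response starts streaming immediately while LangChain
  // keeps pushing tokens through the handlers.
  llm
    .call(
      (messages as Message[]).map(m =>
        m.role === 'user'
          ? new HumanChatMessage(m.content)
          : new AIChatMessage(m.content)
      ),
      {},
      [handlers]
    )
    .catch(console.error)

  return new StreamingTextResponse(stream)
}
```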
@/lib/LangChainStream:

```ts
import { type AIStreamCallbacks, createCallbacksTransformer } from 'ai'

export function LangChainStream(callbacks?: AIStreamCallbacks) {
  const stream = new TransformStream()
  const writer = stream.writable.getWriter()

  const runs = new Set()
  ...
```
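The truncated part is the interesting bit. One way to finish it, using the runs Set so the writer is only closed once every tracked run (LLM or chain) has ended; the exact set of handlers and their abbreviated signatures are assumptions:

```ts
import { type AIStreamCallbacks, createCallbacksTransformer } from 'ai'

export function LangChainStream(callbacks?: AIStreamCallbacks) {
  const stream = new TransformStream()
  const writer = stream.writable.getWriter()

  // Every run that starts is added here; the writer is only closed once the
  // set is empty again, so nested chains can't close the stream prematurely.
  const runs = new Set<string>()

  const handleStart = (runId: string) => {
    runs.add(runId)
  }

  const handleEnd = async (runId: string) => {
    runs.delete(runId)
    if (runs.size === 0) {
      await writer.ready
      await writer.close()
    }
  }

  return {
    stream: stream.readable.pipeThrough(createCallbacksTransformer(callbacks)),
    handlers: {
      handleLLMNewToken: async (token: string) => {
        await writer.ready
        await writer.write(token)
      },
      handleLLMStart: async (_llm: unknown, _prompts: string[], runId: string) => {
        handleStart(runId)
      },
      handleLLMEnd: async (_output: unknown, runId: string) => {
        await handleEnd(runId)
      },
      handleChainStart: async (_chain: unknown, _inputs: unknown, runId: string) => {
        handleStart(runId)
      },
      handleChainEnd: async (_outputs: unknown, runId: string) => {
        await handleEnd(runId)
      },
      handleLLMError: async (e: Error, runId: string) => {
        runs.delete(runId)
        await writer.ready
        await writer.abort(e)
      }
    }
  }
}
```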
It's all in the handlers. From a brief look at https://api.python.langchain.com/en/stable/_modules/langchain/callbacks/streaming_stdout_final_only.html, you'd modify the handleNewToken:

```python
def on_llm_new_token(self, token: str, **kwargs: Any) -> None:
    """Run on new LLM token. Only...
```
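Ported to the JS side of this thread, the same idea is roughly a stateful handleLLMNewToken that buffers recent tokens and stays silent until the answer prefix has gone past (the prefix tokens, handler name, and write callback are assumptions, mirroring the linked Python handler):

```ts
// Sketch of a "stream only the final answer" handler, mirroring LangChain's
// FinalStreamingStdOutCallbackHandler: keep a sliding window of the most
// recent tokens and only start writing once the answer prefix has been seen.
const ANSWER_PREFIX_TOKENS = ['Final', 'Answer', ':'] // assumed default prefix

export function createFinalOnlyHandlers(write: (token: string) => Promise<void>) {
  let lastTokens: string[] = []
  let answerReached = false

  return {
    handleLLMNewToken: async (token: string) => {
      if (answerReached) {
        // Past the prefix: forward every token to the stream.
        await write(token)
        return
      }
      // Slide the window of stripped tokens and compare it to the prefix.
      lastTokens = [...lastTokens, token.trim()].slice(-ANSWER_PREFIX_TOKENS.length)
      answerReached =
        lastTokens.length === ANSWER_PREFIX_TOKENS.length &&
        ANSWER_PREFIX_TOKENS.every((t, i) => lastTokens[i] === t)
    }
  }
}
```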
It's a problem with the SDK at the moment: https://github.com/vercel-labs/ai/issues/205. The handlers that are being returned don't necessarily call writer.close().
First step toward https://github.com/react-hook-form/documentation/issues/915#issuecomment-1597798175
@bluebill1049 Need help looking at logs. I think it's related to https://github.com/contentlayerdev/contentlayer/issues/506