Results: 35 comments of Lucas

This is a Next.js issue on Windows, not related to this project, that is fixed in the latest canary. It seems they had problems parsing the "use client" directive.

Updating to the latest release, https://github.com/vercel/next.js/releases/tag/v13.3.1, should fix this.

This also causes `isLoading` from `useChat()` to never go back to `false` when using `LangChainStream`. However, e-roy's fix seems to work.

I think the issue is caused by https://github.com/vercel-labs/ai/commit/be90740a03cae91609cae075c232c7460239f64c. When you do what @nfcampos suggested, overwriting the returned handlers:

```javascript
const { stream, handlers } = LangChainStream();
const model = new...
```
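To illustrate why passing the returned handlers through matters, here is a minimal self-contained sketch of the `LangChainStream` pattern. This is not the SDK's actual implementation; the names mirror its API, but the body is simplified for illustration. The key point: the returned `handleLLMEnd` closes the stream, and that close is what lets `useChat()`'s `isLoading` flip back to `false`. If you replace the handlers with your own object and never close, the stream stays open forever.

```javascript
// Hypothetical, simplified sketch of LangChainStream (not the real SDK code).
function LangChainStream() {
  let controller;
  const stream = new ReadableStream({
    start(c) { controller = c; },
  });
  const handlers = {
    // Called for each streamed token: forward it into the stream.
    handleLLMNewToken: async (token) => controller.enqueue(token),
    // Called when generation finishes: closing the stream is what
    // signals the consumer (e.g. useChat's isLoading) that we're done.
    handleLLMEnd: async () => controller.close(),
  };
  return { stream, handlers };
}

// Usage: pass `handlers` through to the model unchanged rather than
// overwriting them. Here we drive them by hand to show the flow.
async function demo() {
  const { stream, handlers } = LangChainStream();
  await handlers.handleLLMNewToken('Hello');
  await handlers.handleLLMEnd();
  const reader = stream.getReader();
  let out = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += value;
  }
  return out;
}
```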

Yeah took me a while to get to that conclusion in https://github.com/vercel-labs/ai/issues/205#issuecomment-1603269455.

> I wanted to accumulate the stream until the entire object is in buffer, make it human readable, and not output JSON

Do you just want to "await" the call...
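If the goal really is to buffer the whole stream before doing anything with it, a small helper does the job. A minimal sketch, assuming a web `ReadableStream` of text chunks (the function name is illustrative, not part of any SDK):

```javascript
// Drain a ReadableStream of strings into a single accumulated string.
// Only useful when you deliberately don't want to stream to the client.
async function readAll(stream) {
  const reader = stream.getReader();
  let out = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    out += value;
  }
  return out;
}
```

Note that buffering like this gives up the latency benefit of streaming; it only makes sense when the full object is needed before it can be made human readable.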

Is there a reason the input is an array? In theory, could it be a single element? Or is there some dependency on the previous object? It seems like you...

That would reduce the problem to "I have an input, I need to transform it, and stream the response." The `useChat` hook might not be the best in your...
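That reduced problem can be sketched without any chat machinery at all. A minimal illustration, assuming a web `ReadableStream` and a stand-in transform (the uppercasing is purely hypothetical):

```javascript
// "I have an input, I need to transform it, and stream the response":
// take one input, apply a transform, and emit the result in chunks.
function streamTransformed(input) {
  const words = input.toUpperCase().split(' '); // hypothetical transform
  return new ReadableStream({
    start(controller) {
      for (const w of words) controller.enqueue(w + ' ');
      controller.close(); // close so the consumer knows we're done
    },
  });
}
```

A server route could return this directly as the response body, with no chat-message array involved.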

This should be fixed by https://github.com/vercel-labs/ai/pull/201

This is maybe a little more complicated than I expected. They do seem to have a generic callback handler that works across Agent/LLM/Chain/whatever: https://github.com/hwchase17/langchainjs/tree/main/langchain/src/callbacks/handlers. However, they need to keep...