ai
LangChain stream not updating the 'isLoading' boolean
Hi!
I'm building a small PoC with your amazing SDK, and I'm having an issue with the 'isLoading' flag from the useChat hook. The stream appears to end, but the isLoading boolean stays stuck on true even after the API stops generating a response.
I can chat properly, though.
This is the code for the route:
import { CallbackManager } from 'langchain/callbacks'
import { StreamingTextResponse, LangChainStream, Message } from 'ai'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIChatMessage, HumanChatMessage } from 'langchain/schema'
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate
} from 'langchain/prompts'
import { LLMChain } from 'langchain/chains'
import { HUMAN_PROMPT, SYSTEM_PROMPT } from '@/app/data/constants'

export const runtime = 'edge'

export async function POST(req: Request) {
  const json = await req.json()
  const { messages } = json

  const { stream, handlers } = LangChainStream()

  const chatPrompt = ChatPromptTemplate.fromPromptMessages([
    SystemMessagePromptTemplate.fromTemplate(SYSTEM_PROMPT),
    HumanMessagePromptTemplate.fromTemplate(HUMAN_PROMPT)
  ])

  const model = new ChatOpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
    streaming: true,
    modelName: 'gpt-3.5-turbo',
    temperature: 0.1,
    callbackManager: CallbackManager.fromHandlers(handlers)
  })

  const chain = new LLMChain({ llm: model, prompt: chatPrompt })

  chain
    .call({
      prompt_variable: (messages as Message[]).map(m =>
        m.role === 'user'
          ? new HumanChatMessage(m.content)
          : new AIChatMessage(m.content)
      )
    })
    .catch(console.error)

  return new StreamingTextResponse(stream)
}
Thank you, and congratulations, this is an amazing project!
Noticing the same problem.
Same here.
Same, would you please fix it? Thanks.
Are there any possible workarounds in the meantime?
Tried onFinish, not working:
onFinish: (message) => {
  setIsResponding(false)
  console.log('Chat stream ended with message:', message)
},
This is due to #97 ("Stream never closes with LangChainStream using postman") and there are some workarounds there.
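The mechanics behind the stuck flag can be sketched without any SDK code: the route hands the client a ReadableStream, and the client side (here, useChat's isLoading) only flips once the reader observes done: true, which requires the writer side to actually close. A minimal stand-alone sketch with plain web streams (consume and main are illustrative names, not SDK APIs):

```typescript
// A reader only observes `done: true` after the writer side closes.
// This mirrors why isLoading never flips back: if the underlying
// stream is never closed, the client never sees `done`.
async function consume(stream: ReadableStream<string>): Promise<string[]> {
  const reader = stream.getReader()
  const chunks: string[] = []
  while (true) {
    const result = await reader.read()
    if (result.done) break // without writer.close(), this is never reached
    chunks.push(result.value)
  }
  return chunks
}

async function main(): Promise<void> {
  const { readable, writable } = new TransformStream<string, string>()
  const writer = writable.getWriter()
  // Start the consumer first, then produce chunks.
  const consumed = consume(readable)
  writer.write('Hello, ')
  writer.write('world')
  writer.close() // comment this out and `consumed` never settles
  console.log((await consumed).join('')) // Hello, world
}

main()
```

Closing the writer is exactly what the workarounds in #97 force to happen when the LLM call finishes.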
The solution highlighted in issue #97 worked for me:
import { StreamingTextResponse, LangChainStream, Message } from 'ai'
import { CallbackManager } from 'langchain/callbacks'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIChatMessage, HumanChatMessage } from 'langchain/schema'

export const runtime = 'edge'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const { stream, handlers } = LangChainStream({
    onStart: async () => console.log('start'),
    // onToken: async token => console.log(token),
    onCompletion: async () => console.log('end')
  })

  const llm = new ChatOpenAI({
    streaming: true,
    callbackManager: CallbackManager.fromHandlers(handlers)
  })

  llm
    .call(
      (messages as Message[]).map(m =>
        m.role === 'user'
          ? new HumanChatMessage(m.content)
          : new AIChatMessage(m.content)
      )
    )
    .catch(console.error)
    .finally(() => {
      // Call handleChainEnd when the chat or stream ends so the stream closes
      handlers.handleChainEnd()
    })

  return new StreamingTextResponse(stream)
}
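The load-bearing piece of this workaround is the .finally() callback: it runs whether llm.call resolves or rejects, so the stream gets closed on both the success and the error path. A self-contained sketch of that pattern (fakeLlmCall and closeStream are stand-ins I made up, not SDK APIs):

```typescript
// .finally() runs after both fulfillment and rejection, so cleanup
// (here, standing in for handlers.handleChainEnd()) always happens.
const events: string[] = []

function fakeLlmCall(shouldFail: boolean): Promise<string> {
  return shouldFail
    ? Promise.reject(new Error('boom'))
    : Promise.resolve('ok')
}

function closeStream(): void {
  events.push('closed')
}

async function run(shouldFail: boolean): Promise<void> {
  await fakeLlmCall(shouldFail)
    .then(result => events.push(`result:${result}`))
    .catch(() => events.push('caught'))
    .finally(closeStream)
}

async function main(): Promise<void> {
  await run(false) // success path
  await run(true)  // failure path
  console.log(events.join(',')) // result:ok,closed,caught,closed
}

main()
```

If the close call lived only in .catch(), a successful completion would leave the stream open and isLoading stuck again.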
The above solution worked for me! Yo, thanks a lot @raw-brt!
I am getting this error with vercel build:
Type error: Expected 2 arguments, but got 0.
33 | .catch(console.error)
34 | .finally(() => {
> 35 | handlers.handleChainEnd();
| ^
36 | });
37 |
38 | return new StreamingTextResponse(stream);
npm ERR! Lifecycle script `build` failed with error:
This should be fixed by https://github.com/vercel-labs/ai/pull/201
Thanks @aranlucas!