LangChain stream not updating the 'isLoading' boolean

Open raw-brt opened this issue 1 year ago • 8 comments

Hi!

I'm building a small PoC with your amazing SDK, and I'm having a little issue with the 'isLoading' flag from the useChat hook. The stream apparently ends, but the isLoading boolean is stuck on true even when the API stops generating a response.

I can chat properly, though.
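For context, the client side would be a standard useChat setup along these lines (a minimal sketch; the component name, route path, and UI details are illustrative, not from the original report):

```typescript
'use client'

import { useChat } from 'ai/react'

export default function Chat() {
  // isLoading should flip back to false once the response stream closes;
  // in this issue it stays stuck at true because the stream never ends.
  const { messages, input, handleInputChange, handleSubmit, isLoading } =
    useChat({ api: '/api/chat' })

  return (
    <form onSubmit={handleSubmit}>
      {messages.map(m => (
        <p key={m.id}>
          {m.role}: {m.content}
        </p>
      ))}
      <input value={input} onChange={handleInputChange} />
      <button type="submit" disabled={isLoading}>
        Send
      </button>
    </form>
  )
}
```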

This is the code for the route:

import { CallbackManager } from 'langchain/callbacks'
import { StreamingTextResponse, LangChainStream, Message } from 'ai'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIChatMessage, HumanChatMessage } from 'langchain/schema'
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate
} from 'langchain/prompts'
import { LLMChain } from 'langchain/chains'
import { HUMAN_PROMPT, SYSTEM_PROMPT } from '@/app/data/constants'

export const runtime = 'edge'

export async function POST(req: Request) {
  const json = await req.json()
  const { messages } = json

  const { stream, handlers } = LangChainStream()

  const chatPrompt = ChatPromptTemplate.fromPromptMessages([
    SystemMessagePromptTemplate.fromTemplate(SYSTEM_PROMPT),
    HumanMessagePromptTemplate.fromTemplate(HUMAN_PROMPT)
  ])

  const model = new ChatOpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
    streaming: true,
    modelName: 'gpt-3.5-turbo',
    temperature: 0.1,
    callbackManager: CallbackManager.fromHandlers(handlers)
  })

  const chain = new LLMChain({ llm: model, prompt: chatPrompt })

  chain
    .call({
      // Pass the user messages' content into the prompt template
      prompt_variable: (messages as Message[])
        .filter(m => m.role === 'user')
        .map(m => m.content)
        .join('\n')
    })
    .catch(console.error)

  return new StreamingTextResponse(stream)
}

Thank you, and congratulations, this is an amazing project!

raw-brt avatar Jun 21 '23 14:06 raw-brt

Noticing the same problem.

alexyoung23j avatar Jun 21 '23 18:06 alexyoung23j

Same here.

theotarr avatar Jun 21 '23 20:06 theotarr

Same here, could you please fix this? Thanks.

fjun99 avatar Jun 22 '23 03:06 fjun99

Are there any possible workarounds in the meantime?

terrytjw avatar Jun 22 '23 04:06 terrytjw

> Are there any possible workarounds in the meantime?

I tried onFinish, but it didn't work:

onFinish: (message) => {
  setIsResponding(false)
  console.log('Chat stream ended with message:', message)
},

fjun99 avatar Jun 22 '23 05:06 fjun99

This is due to #97 ("Stream never closes with LangChainStream using postman") and there are some workarounds there.
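The underlying mechanism: LangChainStream closes its ReadableStream only when the end-of-chain callback fires; if that callback never runs, the HTTP response never finishes, so useChat's isLoading stays true. A self-contained sketch of that contract (simplified; this is an illustration of the pattern, not the SDK's actual source):

```typescript
// Simplified model of LangChainStream: tokens are enqueued onto a
// ReadableStream, and the stream is closed only when handleChainEnd
// fires. If that callback never runs, readers wait forever.
function makeStream() {
  let controller!: ReadableStreamDefaultController<string>
  const stream = new ReadableStream<string>({
    start(c) {
      controller = c
    }
  })
  return {
    stream,
    handlers: {
      handleLLMNewToken: async (token: string) => controller.enqueue(token),
      handleChainEnd: async () => controller.close()
    }
  }
}

async function demo(): Promise<string> {
  const { stream, handlers } = makeStream()
  await handlers.handleLLMNewToken('hello')
  // Without this call, the read loop below would never see done: true --
  // the hang reported in this issue.
  await handlers.handleChainEnd()

  const reader = stream.getReader()
  let text = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    text += value
  }
  return text
}
```

Commenting out the `handleChainEnd()` call makes `demo()` hang indefinitely, which mirrors how the stuck stream keeps isLoading true on the client.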

gadicc avatar Jun 22 '23 11:06 gadicc

This solution, highlighted in issue #97, worked for me:

import { StreamingTextResponse, LangChainStream, Message } from 'ai'
import { CallbackManager } from 'langchain/callbacks'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIChatMessage, HumanChatMessage } from 'langchain/schema'

export const runtime = 'edge'

export async function POST(req: Request) {
  const { messages } = await req.json()

  const { stream, handlers } = LangChainStream({
    onStart: async () => console.log('start'),
    // onToken: async token => console.log(token),
    onCompletion: async () => console.log('end')
  })

  const llm = new ChatOpenAI({
    streaming: true,
    callbackManager: CallbackManager.fromHandlers(handlers)
  })

  llm
    .call(
      (messages as Message[]).map(m =>
        m.role == 'user'
          ? new HumanChatMessage(m.content)
          : new AIChatMessage(m.content)
      )
    )
    .catch(console.error)
    .finally(() => {
      // Manually signal the end of the chain so the stream closes
      handlers.handleChainEnd()
    })

  return new StreamingTextResponse(stream)
}

raw-brt avatar Jun 22 '23 12:06 raw-brt

The above solution worked for me! Yo, thanks a lot @raw-brt!

terrytjw avatar Jun 23 '23 05:06 terrytjw

I'm getting this error with `vercel build`:

Type error: Expected 2 arguments, but got 0.
  33 |     .catch(console.error)
  34 |     .finally(() => {
> 35 |       handlers.handleChainEnd();
     |                ^
  36 |     });
  37 | 
  38 |   return new StreamingTextResponse(stream);
npm ERR! Lifecycle script `build` failed with error: 

maxlibin avatar Jun 30 '23 13:06 maxlibin

This should be fixed by https://github.com/vercel-labs/ai/pull/201

aranlucas avatar Jul 03 '23 23:07 aranlucas

Thanks @aranlucas!

MaxLeiter avatar Jul 03 '23 23:07 MaxLeiter