
Stream ZeroShotAgent response using callbackManager

rmonvfer opened this issue on Apr 27, 2023 · 0 comments

Hello everyone!

I'm building a chat interface with NextJS to interact with agents, and I haven't been able to get streaming to work.

This is the core logic:

// Imports assume langchain ~0.0.x (spring 2023); paths may differ in newer versions.
import { RequestsGetTool, RequestsPostTool, AIPluginTool } from "langchain/tools";
import { ZeroShotAgent, AgentExecutor } from "langchain/agents";
import { LLMChain } from "langchain/chains";
import { ChatOpenAI } from "langchain/chat_models";
import { CallbackManager } from "langchain/callbacks";
import {
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
} from "langchain/prompts";

const tools = [
    new RequestsGetTool(),
    new RequestsPostTool(),
    await AIPluginTool.fromPluginUrl(
        "URL [REDACTED]"
    ),
];

export default function makeChain(stream: TransformStream) {

    const prompt = ZeroShotAgent.createPrompt(tools, {
        prefix: `PROMPT [REDACTED]:`,
        suffix: `PROMPT [REDACTED]`,
    });

    const chatPrompt = ChatPromptTemplate.fromPromptMessages([
        new SystemMessagePromptTemplate(prompt),
        HumanMessagePromptTemplate.fromTemplate(`
        {input}
        
        [REDACTED]
        
        {agent_scratchpad}`),
    ]);

    // Every token the LLM emits is written to this stream via the callbacks below.
    const writer = stream.writable.getWriter();

    const llm = new ChatOpenAI({
        streaming: true,
        callbackManager: CallbackManager.fromHandlers({
            handleLLMNewToken: async (token) => {
                console.log(token);
                await writer.ready;
                await writer.write(new TextEncoder().encode(token));
            },
            handleLLMEnd: async () => {
                await writer.ready;
                await writer.close();
            },
            handleLLMError: async (e) => {
                await writer.ready;
                await writer.abort(e);
            },
        }),
    });

    console.log(chatPrompt);

    const llmChain = new LLMChain({
        prompt: chatPrompt,
        llm: llm,
    });

    const agent = new ZeroShotAgent({
        llmChain,
        allowedTools: tools.map((tool) => tool.name),
    });

    return AgentExecutor.fromAgentAndTools({ agent, tools });
}

This function is then called in a NextJS api route using something like:

// Returning a streaming Response only works on the Edge runtime,
// so the request is a standard Request rather than a NextApiRequest.
export const config = { runtime: "edge" };

export default async function handler(req: Request): Promise<Response> {
  try {
    const { model, messages, key, prompt_ } = await req.json();

    console.log(messages);
    
    // messages is just an array of { role: "user", content: "content here!" }
    const input = messages[messages.length - 1].content; // Last message
    const stream = new TransformStream();

    const chain = makeChain(stream);

    chain.call({ input }).catch((e) => console.error(e));

    return new Response(stream.readable, {
      headers: { "Content-Type": "text/event-stream" },
    });

  } catch (error: any) {
    console.error(error);
    if (error instanceof Error) {
      return new Response('Error', { status: 500, statusText: error.message });
    } else {
      return new Response('Error', { status: 500, statusText: "Unknown Error!" });
    }
  }
}
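For completeness, on the client I consume the stream with something like this (simplified; "/api/chat" stands in for the actual route path):

const res = await fetch("/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ messages }),
});

// Decode the streamed tokens as they arrive.
const reader = res.body!.getReader();
const decoder = new TextDecoder();
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  console.log(decoder.decode(value));
}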

I've been looking through the documentation for a while, but everything I found covers streaming LLM responses directly, with no mention of agents.
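For reference, the documented direct-LLM pattern looks roughly like this (a minimal sketch, assuming the same imports as above plus HumanChatMessage from "langchain/schema"):

// Direct LLM streaming, with no agent involved.
const streamingLlm = new ChatOpenAI({
    streaming: true,
    callbackManager: CallbackManager.fromHandlers({
        // Called once per generated token.
        handleLLMNewToken: async (token) => {
            process.stdout.write(token);
        },
    }),
});

// call() takes a list of chat messages and resolves once the response is complete.
await streamingLlm.call([new HumanChatMessage("Tell me a joke.")]);

This works as expected, but I can't find the equivalent for an AgentExecutor.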

I have no problem reading the source (and I have), but maybe I'm missing something.

Does LangChain support streaming agent responses? Is there a recommended way of doing it?

Thank you!

rmonvfer avatar Apr 27 '23 09:04 rmonvfer