
If the template contains a newline in the streaming example, it does not work

mariobm opened this issue · 0 comments

If we use a bigger template that contains newlines, the handleLLMNewToken callback does not get called. Code:

/* eslint-disable no-undef */
import { CallbackManager } from 'langchain/callbacks'
import { LLMChain } from 'langchain/chains'
import { PromptTemplate } from 'langchain/prompts'
import path from 'node:path'
import { fileURLToPath } from 'node:url'
import { AlpacaCppChat } from 'langchain-alpaca'

const __dirname = path.dirname(fileURLToPath(import.meta.url))

const template = 'Below is an instruction that describes a task.\n Write a response that appropriately completes the request: {prot}';
const prompt = new PromptTemplate({
  template: template,
  inputVariables: ['prot'],
})

const alpaca = new AlpacaCppChat({
  modelParameters: { model: path.join(__dirname, './models/ggml-alpaca-7b-q4.bin') },
  // stream output to console to view it on realtime
  streaming: true,
  callbackManager: CallbackManager.fromHandlers({
    handleLLMNewToken: (token) => {
      console.log("NEW TOKEN");
      process.stdout.write(token);
      return token;
    },
  }),
})

const chain = new LLMChain({ llm: alpaca, prompt: prompt })
const response = await chain.call({ prot: '2 + 2 = ' })
console.log(`response`, response, JSON.stringify(response))
alpaca.closeSession()

Debug console:

 langchain-alpaca:session "Below is an instruction that describes a task.\\ Write a response that appropriately completes the request: 2 + 2 = " +0ms
  langchain-alpaca:state onData {"doneInit":true,">":false,"prompt":false,"queue[0]":{"prompt":"","doneInput":true,"doneEcho":false,"outputStarted":false}} +0ms
  langchain-alpaca:data "Below is an instruction that describes a task.\\ Write a response that appropriately completes the request: 2 + 2 =  \r\n" +0ms
  langchain-alpaca:state onData {"doneInit":true,">":false,"prompt":false,"queue[0]":{"prompt":"","doneInput":true,"doneEcho":false,"outputStarted":false}} +9s
  langchain-alpaca:data "\u001b[0m4" +9s
  langchain-alpaca:state onData {"doneInit":true,">":true,"prompt":false,"queue[0]":{"prompt":"","doneInput":true,"doneEcho":false,"outputStarted":false}} +802ms
  langchain-alpaca:data "\u001b[0m\r\n> \u001b[1m\u001b[32m" +802ms

mariobm · Mar 28 '23, 13:03