
`handleLLMStart` prompts param is difficult to use

mdatsev opened this issue on Apr 22, 2023 · 0 comments

`handleLLMStart`'s `prompts` parameter is currently a stringified representation of the actual messages being sent to the LLM, which makes it difficult to use.

Currently the only way to handle the messages separately is to parse this string, which may be impossible in some cases (for example, if one of the messages contains `Human: `).

Where the problem is in the code: `src/chat_models/base.ts:53` - the messages are passed through `getBufferString`.
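
For context, `getBufferString` roughly flattens the message list into a single string by prefixing each message with its role and joining with newlines; the result is what `handleLLMStart` receives as `prompts`. A simplified sketch (not the actual library code):

// Simplified sketch of the flattening behavior, for illustration only.
type ChatMessageSketch = { role: "System" | "Human" | "AI"; content: string };

function bufferString(messages: ChatMessageSketch[]): string {
  // Each message becomes "<Role>: <content>"; messages are joined with "\n".
  return messages.map((m) => `${m.role}: ${m.content}`).join("\n");
}

// bufferString([
//   { role: "System", content: "You are a helpful assistant." },
//   { role: "Human", content: "What is 2+2?" },
// ]);
// => "System: You are a helpful assistant.\nHuman: What is 2+2?"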

Here is some code to reproduce the issue:

import { ChatOpenAI } from "langchain/chat_models/openai";
import { CallbackManager } from "langchain/callbacks";
import { HumanChatMessage, SystemChatMessage } from "langchain/schema";

const chat = new ChatOpenAI({
  callbackManager: CallbackManager.fromHandlers({
    handleLLMStart: (llm, prompts) => {
      console.log(prompts); // [ 'System: You are a helpful assistant.\nHuman: What is 2+2?' ]
    },
  }),
});
await chat.call([
  new SystemChatMessage("You are a helpful assistant."),
  new HumanChatMessage("What is 2+2?"),
]);
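
To illustrate why the flattened string cannot reliably be parsed back into messages, here is a hypothetical example where a human message itself contains a `Human: ` line, so a naive parser recovers the wrong message boundaries:

// Hypothetical example: only 2 messages were sent (a system message and a
// human message asking to summarize a transcript), but once flattened, the
// "Human: " / "AI: " lines inside the transcript are indistinguishable from
// real message boundaries.
const flattened =
  "System: You are a helpful assistant.\n" +
  "Human: Summarize this transcript:\nHuman: hi\nAI: hello";

// Naive parser: treat every line starting with a role prefix as a new message.
const naiveMessages = flattened
  .split("\n")
  .filter((line) => /^(System|Human|AI): /.test(line));

console.log(naiveMessages.length); // 4 "messages", although only 2 were sent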

mdatsev · Apr 22 '23 17:04