langchainjs
Using MotörheadMemory results in an empty message (`TypeError: m._getType is not a function`)
Resulting error:

```
node_modules/langchain/dist/memory/base.js:17
        if (m._getType() === "human") {
              ^
TypeError: m._getType is not a function
```
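For context on why this error can appear at all: message objects in the chain are expected to carry a `_getType()` method, but anything that round-trips messages through JSON (as a memory backend may do) strips class methods. This is a minimal standalone sketch of the mechanism, not actual langchain source:

```javascript
// Sketch: a message class with _getType(), like langchain's message types.
class HumanMessage {
  constructor(text) {
    this.text = text;
  }
  _getType() {
    return "human";
  }
}

const fromClass = new HumanMessage("Hi!");
// JSON serialization keeps data fields but loses prototype methods.
const fromJson = JSON.parse(JSON.stringify(fromClass));

console.log(typeof fromClass._getType); // "function"
console.log(typeof fromJson._getType); // "undefined" -> calling it throws
```

So if a memory implementation hands back plain deserialized objects instead of message instances, the `m._getType()` call in `memory/base.js` fails exactly as shown above.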
When used with Motörhead, the code is as follows:
```ts
// Import paths are for the classic `langchain` package; they may differ by version.
import { ChatOpenAI } from "langchain/chat_models/openai";
import { MotorheadMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
import {
  ChatPromptTemplate,
  SystemMessagePromptTemplate,
  HumanMessagePromptTemplate,
  MessagesPlaceholder,
} from "langchain/prompts";

const model = new ChatOpenAI({ openAIApiKey: process.env.OPENAI_API_KEY });

const memory = new MotorheadMemory({
  sessionId: "user-id",
  motorheadURL: "http://localhost:8080",
});
await memory.init(); // loads previous state from Motörhead 🤘

const context = memory.context
  ? `\nHere's previous context: ${memory.context}`
  : "";

const chatPrompt = ChatPromptTemplate.fromPromptMessages([
  SystemMessagePromptTemplate.fromTemplate(
    `The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.${context}`
  ),
  new MessagesPlaceholder("history"),
  HumanMessagePromptTemplate.fromTemplate("{input}"),
]);

const chain = new ConversationChain({
  memory,
  prompt: chatPrompt,
  llm: model,
});

const res1 = await chain.call({ input: "Hi! I'm Jim." });
console.log({ res1 });
```
Getting the same `TypeError: m._getType is not a function` error when trying the ConversationSummaryMemory module too.

Also getting it with BufferMemory, if anyone has a fix.
Update: I was able to fix this by not using the ChatOpenAI import and using `new OpenAI({ modelName: "gpt-3.5-turbo" })` instead.
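If you'd rather keep ChatOpenAI, another direction is to rehydrate the stored messages into instances that expose `_getType()` before the chain sees them. This is only a sketch: `HumanMessage`, `AIMessage`, the `rehydrate` helper, and the `"Human"`/`"AI"` role strings here are illustrative stand-ins, not the real langchain or Motörhead identifiers.

```javascript
// Stand-in message classes (illustrative, not langchain's actual classes).
class HumanMessage {
  constructor(content) {
    this.content = content;
  }
  _getType() {
    return "human";
  }
}
class AIMessage {
  constructor(content) {
    this.content = content;
  }
  _getType() {
    return "ai";
  }
}

// Hypothetical helper: map stored role strings back onto message instances.
function rehydrate(raw) {
  return raw.role === "Human"
    ? new HumanMessage(raw.content)
    : new AIMessage(raw.content);
}

const restored = [
  { role: "Human", content: "Hi! I'm Jim." },
  { role: "AI", content: "Hello Jim!" },
].map(rehydrate);

console.log(restored.map((m) => m._getType())); // [ 'human', 'ai' ]
```

The point is just that once each stored record is wrapped in a class instance again, the `m._getType()` check in `memory/base.js` has a method to call.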
Hey here's a working example:
https://github.com/getmetal/motorhead/tree/main/examples/chat-js
Hi, @pftom! I'm here to help the LangChain team manage their backlog and I wanted to let you know that we are marking this issue as stale.
Based on my understanding, the issue you reported was about encountering a `TypeError: m._getType is not a function` error when using Motörhead with the provided code. Other users, such as juanpujol and Jahb, have also experienced this error. However, Jahb was able to resolve it by not using the ChatOpenAI import and instead using `new OpenAI({ modelName: "gpt-3.5-turbo" })`. Czechh even provided a working example in the comments.
Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.
Thank you for your contribution and we appreciate your understanding!