LlamaIndexTS
OpenAIAgent chat returns empty sourceNodes and metadata
Hello LlamaIndexTS, I am currently using OpenAIAgent with a QueryEngineTool built from a custom VectorStoreIndex. The agent responds with correct information from the query engine, but sourceNodes and metadata are empty. I'm not sure whether this is a bug or whether I'm misusing the tool.
Thanks in advance,
import {
  OpenAI,
  OpenAIAgent,
  QueryEngineTool,
  VectorStoreIndex,
  storageContextFromDefaults,
} from "llamaindex";

// Load the index persisted in ./menuDB
const menuContext = await storageContextFromDefaults({
  persistDir: "./menuDB",
});
const menuIndex = await VectorStoreIndex.fromDocuments([], {
  storageContext: menuContext,
});

// Build a retriever and a query engine on top of the index
const menuRetriever = menuIndex.asRetriever();
const menuRetrieverQueryEngine = menuIndex.asQueryEngine({
  retriever: menuRetriever,
});
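// Sanity check (a sketch, assuming the object-style query({ query }) signature):
// querying the engine directly should come back with sourceNodes populated,
// which would narrow the problem down to the agent layer
const direct = await menuRetrieverQueryEngine.query({ query: "do you have burgers?" });
console.log(direct.sourceNodes?.length, direct.sourceNodes?.[0]?.node.metadata);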
////////////////////////////////
const openaiLLM = new OpenAI({ model: "gpt-4o", temperature: 0 });
const agent = new OpenAIAgent({
  systemPrompt: systemMessage,
  verbose: true,
  llm: openaiLLM,
  tools: [
    // ..... more tools here
    new QueryEngineTool({
      queryEngine: menuRetrieverQueryEngine,
      metadata: {
        name: "menu_tool",
        description: `This tool can answer questions about items on the menu`,
      },
    }),
  ],
});
const response = await agent.chat({ message: "do you have burgers?", verbose: true });
// response EngineResponse { sourceNodes: undefined, metadata: {}, message: { content: 'We have two burgers:\n' + ......
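As a possible workaround sketch (assuming retrieve() accepts a { query } object and returns NodeWithScore entries exposing node.metadata, which I have not verified against every version), the nodes and their metadata could be pulled straight from the retriever alongside the agent call:

// Fetch the matching nodes directly from the retriever to inspect their metadata
const retrieved = await menuRetriever.retrieve({ query: "do you have burgers?" });
for (const { node, score } of retrieved) {
  console.log(score, node.metadata);
}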