
How to check whether the answer was found in the vector store

Open mshafnas opened this issue 2 years ago • 5 comments

I have created a Node.js application with OpenAI and LangChain. If the answer to my question is not found in the vector store, I want to return "I don't know" instead of the model's dynamic response. Below is my code:

const { PDFLoader } = require("langchain/document_loaders/fs/pdf");
const { RecursiveCharacterTextSplitter } = require("langchain/text_splitter");
const { OpenAIEmbeddings } = require("langchain/embeddings/openai");
const { HNSWLib } = require("langchain/vectorstores/hnswlib");
const { RetrievalQAChain } = require("langchain/chains");
const { OpenAI } = require("langchain/llms/openai");
const express = require("express");
const cors = require('cors');
const bodyParser = require('body-parser');



let chain;
const app = express();
app.use(cors());
app.use(bodyParser.json());
const loader = new PDFLoader("documents/Additional_QA_A102.pdf");

const setup = async () => {
  const docs = await loader.load();

  // splitter function
  const splitter = new RecursiveCharacterTextSplitter({
    chunkSize: 1000,
    chunkOverlap: 20,
  });

  // create chunks from the PDF
  const splittedDocs = await splitter.splitDocuments(docs);

  // never hard-code API keys; read them from the environment
  const embeddings = new OpenAIEmbeddings({
    openAIApiKey: process.env.OPENAI_API_KEY,
  });

  const vectorStore = await HNSWLib.fromDocuments(splittedDocs, embeddings);

  const vectorStoreRetriever = vectorStore.asRetriever();
  const model = new OpenAI({
    openAIApiKey: process.env.OPENAI_API_KEY,
    modelName: "gpt-3.5-turbo",
  });

  chain = RetrievalQAChain.fromLLM(model, vectorStoreRetriever);
};
app.post('/ai-query', async (req, res) => {
  const { question } = req.body;
  console.log(question);
  const answer = await chain.call({
    query: question,
  });
  // const response = `Question is ${question}\n Answer: ${JSON.stringify(answer)}`;
  res.send(answer);
});

// build the chain before serving requests (the port is illustrative)
setup().then(() => app.listen(3000));
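An alternative to post-hoc checks is to tell the model itself to refuse when the retrieved context does not contain the answer. Below is a minimal sketch of such a prompt builder in plain JavaScript, with no LangChain dependency; the function name and the exact wording of the instruction are illustrative, not part of any API:

```javascript
// Build a QA prompt that instructs the model to refuse when the
// context does not contain the answer. Hypothetical helper for illustration.
function buildQaPrompt(contextChunks, question) {
  const context = contextChunks.join("\n---\n");
  return [
    "Answer the question using ONLY the context below.",
    "If the answer is not in the context, reply exactly: I don't know",
    "",
    "Context:",
    context,
    "",
    `Question: ${question}`,
  ].join("\n");
}

const prompt = buildQaPrompt(
  ["LangChain is a framework for LLM apps."],
  "What is LangChain?"
);
console.log(prompt.includes("I don't know")); // true: the refusal instruction is embedded
```

A prompt like this could be passed to the chain via a custom prompt template, so the model returns the fixed refusal string instead of improvising an answer.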

mshafnas avatar Dec 01 '23 12:12 mshafnas

🤖

Hello @mshafnas,

I'm Dosu, an assistant bot here to help you with your queries, bug troubleshooting, and contributions while we wait for a human maintainer. Feel free to ask anything regarding your issue and I'll do my best to assist.

Based on the code you've shared, it seems like you're using the RetrievalQAChain class to handle the question-answering process. This class doesn't inherently provide a mechanism to return a default response like "I don't know" when no relevant documents are found in the vector store.

However, you can modify your code to achieve this. You'll need to check if the relevantDocs array is empty after the retriever attempts to find relevant documents. If it is, you can return "I don't know" directly. If it's not, you can proceed with the chain.call function as usual.

Here's how you can modify your /ai-query endpoint to implement this:

app.post('/ai-query', async (req, res) => {
  const { question } = req.body;
  console.log(question);
  // vectorStoreRetriever must be in scope here (e.g. declared at module level)
  const relevantDocs = await vectorStoreRetriever.getRelevantDocuments(question);

  if (relevantDocs.length === 0) {
    res.send({ answer: "I don't know" });
  } else {
    const answer = await chain.call({
      query: question,
    });
    res.send(answer);
  }
});

In this code, relevantDocs.length === 0 checks if there are no relevant documents found by the retriever. If there are none, it sends "I don't know". If there are relevant documents, it proceeds with the chain.call to find the answer.

Please note that this is a simplified example and you might need to adjust it according to your specific use case and error handling strategy.
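One caveat worth noting: a retriever built with `vectorStore.asRetriever()` typically returns the top-k nearest chunks no matter how poor the match is, so `relevantDocs.length === 0` may never trigger in practice. Filtering by similarity score is often more reliable. The sketch below shows only the filtering logic in plain JavaScript; the `[document, distance]` pair shape (as returned by `similaritySearchWithScore` in langchainjs) and the cutoff value are assumptions to adapt:

```javascript
// Decide whether any retrieved chunk is close enough to answer from.
// HNSWLib scores are distances: smaller means more similar.
// The 0.3 cutoff is an assumed value; tune it for your embeddings.
function closeEnough(scoredResults, maxDistance = 0.3) {
  return scoredResults.some(([doc, score]) => score <= maxDistance);
}

// Example scored results as [document, distance] pairs.
const good = [[{ pageContent: "relevant text" }, 0.12]];
const bad = [[{ pageContent: "unrelated text" }, 0.85]];

console.log(closeEnough(good)); // true  -> proceed with chain.call
console.log(closeEnough(bad));  // false -> send "I don't know"
```

In the endpoint, you would call `vectorStore.similaritySearchWithScore(question, k)` and send the "I don't know" response when `closeEnough` returns false.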

Let me know if you need further clarification or help.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] avatar Dec 01 '23 12:12 dosubot[bot]

It is not working.

mshafnas avatar Dec 07 '23 03:12 mshafnas

Hi @mshafnas, not sure if this is your case: if the expected answer doesn't exist in the document, it will return the response "I don't know the answer to that question."

Try the code below:

const chain = ConversationalRetrievalQAChain.fromLLM(
  model,
  vectorStoreRetriever,
);

const data = await chromaDB.similaritySearch(prompt, 1);
const context: any = [];
data.forEach((item) => {
  context.push(`Context:\n${item.pageContent}`);
});

const { text } = await chain.call({
  question: prompt,
  chat_history: history,
});
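The forEach/push pattern above just turns the retrieved documents into prompt-ready context strings. A self-contained illustration in plain JavaScript, using mock documents in place of a real `similaritySearch` result:

```javascript
// Mimic the shape of similaritySearch results: documents with pageContent.
const data = [
  { pageContent: "First retrieved chunk." },
  { pageContent: "Second retrieved chunk." },
];

const context = [];
data.forEach((item) => {
  context.push(`Context:\n${item.pageContent}`);
});

console.log(context[0]); // "Context:\nFirst retrieved chunk."
```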

dev-vasanth avatar Feb 08 '24 07:02 dev-vasanth

Hi, @mshafnas,

I'm helping the langchainjs team manage their backlog and am marking this issue as stale. The issue was about handling cases where the answer for a question is not found in the vector store in a Node.js application using OpenAI and Langchain libraries. The author wanted to return a specific response, such as "I don't know," instead of the model's dynamic response. The suggested solution was not working, but another user provided a code snippet to handle the case where the expected answer doesn't exist in the document.

Could you please confirm if this issue is still relevant to the latest version of the langchainjs repository? If it is, please let the langchainjs team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!

dosubot[bot] avatar May 09 '24 16:05 dosubot[bot]

The other user's code didn't work because he is using ChromaDB, but in my case I am using a vector store.


mshafnas avatar May 09 '24 17:05 mshafnas