
Enable LLM to choose the filtering criteria in VectorDBQA

Open fpingham opened this issue 3 years ago • 0 comments

What to filter on when using a vector store is highly dependent on the query: different queries require different filters. As an example, if we have an index of texts and we ask our agent 'Why do people seek wealth according to Adam Smith?', the retrieved documents should be different than if we asked 'Why do people seek wealth according to Adam Smith in the Theory of Moral Sentiments?'.

It would be ideal if the agent could handle these cases separately, using a different filter for each. This would probably entail adding a step where the LLM decides the filters, using a prompt that teaches it how to filter correctly (probably incorporating the documentation for the vector store that's being used).
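A minimal sketch of what that step could look like. Everything here is hypothetical (the in-memory `DOCS` store, the `fake_llm` stand-in, and the single allowed filter key `source`); a real implementation would send the prompt to an actual LLM and pass the resulting filter to the vector store's retrieval call.

```python
import json

# Hypothetical in-memory "vector store": documents with metadata.
DOCS = [
    {"text": "People seek wealth to gain the sympathy of others.",
     "metadata": {"source": "Theory of Moral Sentiments"}},
    {"text": "Self-interest drives people to accumulate wealth.",
     "metadata": {"source": "Wealth of Nations"}},
]

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call. A real implementation would send
    `prompt` to a model instructed to emit a JSON metadata filter."""
    if "Theory of Moral Sentiments" in prompt:
        return json.dumps({"source": "Theory of Moral Sentiments"})
    return json.dumps({})  # empty filter when the query names no source

def llm_chosen_filter(query: str) -> dict:
    # The prompt would also describe the vector store's filter syntax,
    # per the suggestion above about incorporating its documentation.
    prompt = (
        "Given the user query, return a JSON object of metadata filters "
        "for the vector store (allowed key: 'source'). Query: " + query
    )
    return json.loads(fake_llm(prompt))

def retrieve(query: str) -> list[dict]:
    flt = llm_chosen_filter(query)
    # Apply the LLM-chosen filter before (or alongside) similarity search.
    return [d for d in DOCS
            if all(d["metadata"].get(k) == v for k, v in flt.items())]
```

With this, the two example queries behave differently: the generic Adam Smith question retrieves from both sources, while the query that names the Theory of Moral Sentiments is restricted to that book.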

As for an example notebook showcasing this functionality, I'd suggest something along the lines of the example above.

fpingham avatar Jan 24 '23 18:01 fpingham