Using Multiple StorageContexts with VectorStoreIndex for Different Users?
I'm looking into using multiple storage contexts for initializing VectorStoreIndex. My dataset includes 20 documents, each relevant to particular users.
For instance, I'd like to set up a storage context with Documents A, B, and D for one user query, and Documents A, B, and F for another.
Is this feasible? Should I generate all document vectors first and then apply metadata filters?
Additionally, how would I go about filtering these in ChatEngine?
@leoz2007 I think you can add metadata to each document that relates it to a user, and then query using that metadata. In the simplest case, that means adding the userId to the document's metadata.
Here's an example of filtering by metadata: https://github.com/run-llama/LlamaIndexTS/blob/main/examples/chromadb/preFilters.ts
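For illustration, here is a minimal sketch of that approach, modeled loosely on the linked Chroma example: all documents are embedded once into a single index, each document carries a userId in its metadata, and the query is pre-filtered per user. The collection name, user IDs, and document texts are made up, and the exact filter and query signatures may differ between LlamaIndexTS versions (the filter below uses the ExactMatchFilter shape discussed later in this thread):

```ts
import {
  ChromaVectorStore,
  Document,
  VectorStoreIndex,
  storageContextFromDefaults,
} from "llamaindex";

// Tag each document with the user it belongs to (illustrative IDs and texts).
const documents = [
  new Document({ text: "Document A ...", metadata: { userId: "user-1" } }),
  new Document({ text: "Document B ...", metadata: { userId: "user-1" } }),
  new Document({ text: "Document F ...", metadata: { userId: "user-2" } }),
];

// Store all vectors once in a vector store that supports metadata filtering.
const vectorStore = new ChromaVectorStore({ collectionName: "user-docs" });
const storageContext = await storageContextFromDefaults({ vectorStore });
const index = await VectorStoreIndex.fromDocuments(documents, { storageContext });

// At query time, restrict retrieval to the current user's documents.
const queryEngine = index.asQueryEngine({
  preFilters: {
    filters: [{ filterType: "ExactMatch", key: "userId", value: "user-1" }],
  },
});

const response = await queryEngine.query({
  query: "What do my documents say?",
});
console.log(response.toString());
```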
Hey @marcusschiesser! But in @leoz2007's scenario we would need multiple userIds in the metadata, because Documents A and B should be available to both users. Is it possible to filter documents so that the metadata field is a list of values and the filter condition is satisfied when that list contains a specific value?
Or maybe vice versa: each user has a list of accessible documentIds. Is it possible to set up filters so that a specific metadata field value must be IN a list passed to the filters?
Yes, I agree, working with doc IDs sounds like a better solution to me too. We're working on integrating something similar into create-llama; it should be available next week.
That's great! Just to clarify: you are planning to introduce new types of MetadataFilters other than ExactMatchFilter, right?
@dimatill Yes, they are now in release 0.5.5, see https://ts.llamaindex.ai/modules/query_engines/metadata_filtering. Does that fit your needs? If not, please reopen the ticket.
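For reference, a hypothetical sketch of the doc-ID approach with the newer filters: each node carries a docId in its metadata, and the query is pre-filtered to the IDs the current user may access. The operator-based filter syntax and the "in" operator are assumptions based on the linked docs, the underlying vector store has to support these filters, and the exact field names may differ in 0.5.5:

```ts
import { Document, VectorStoreIndex } from "llamaindex";

// Hypothetical documents, each tagged with its own docId.
const documents = [
  new Document({ text: "Document A ...", metadata: { docId: "doc-a" } }),
  new Document({ text: "Document B ...", metadata: { docId: "doc-b" } }),
  new Document({ text: "Document F ...", metadata: { docId: "doc-f" } }),
];

const index = await VectorStoreIndex.fromDocuments(documents);

// The application decides which docIds the current user may access.
const accessibleDocIds = ["doc-a", "doc-b"];

// Restrict retrieval to nodes whose docId is IN the user's list.
const queryEngine = index.asQueryEngine({
  preFilters: {
    filters: [{ key: "docId", value: accessibleDocIds, operator: "in" }],
  },
});

const response = await queryEngine.query({
  query: "Summarize the documents I have access to",
});
console.log(response.toString());
```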
@marcusschiesser yes, it theoretically fits, but I'm using pgvector, and it looks like it's still not implemented for pgvector 🥲
https://github.com/run-llama/LlamaIndexTS/blob/main/packages/llamaindex/src/storage/vectorStore/PGVectorStore.ts#L275
@dimatill Yes, that's true. Would you mind sending a PR adding that support? I guess it would be similar to Milvus, which we just added.
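For anyone picking this up, the core of such a PR would be translating the MetadataFilters passed in the query options into a SQL predicate on PGVectorStore's JSONB metadata column. A rough, hypothetical sketch of that translation (not the actual PGVectorStore code; the column name, filter shape, and helper are assumptions for illustration):

```ts
// Hypothetical helper: turn exact-match metadata filters into a parameterized
// SQL predicate against a JSONB `metadata` column.
interface ExactMatchFilter {
  key: string;
  value: string | number;
}

function buildMetadataWhereClause(
  filters: ExactMatchFilter[],
  startParamIndex = 1,
): { clause: string; params: string[] } {
  const conditions: string[] = [];
  const params: string[] = [];
  filters.forEach((filter, i) => {
    // metadata->>'key' extracts the value as text from the JSONB column.
    // (A real implementation should also sanitize or whitelist the key.)
    conditions.push(`metadata->>'${filter.key}' = $${startParamIndex + i}`);
    params.push(String(filter.value));
  });
  return {
    clause: conditions.length > 0 ? `WHERE ${conditions.join(" AND ")}` : "",
    params,
  };
}

// Example: restrict a similarity search to one user's documents.
const { clause, params } = buildMetadataWhereClause([
  { key: "userId", value: "user-1" },
]);
console.log(clause, params); // WHERE metadata->>'userId' = $1 [ 'user-1' ]
```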
yes, I'll take a look