Any progress on sparse and ColBERT embedding?
I got the same error. How do I resolve it?
```python
search_param_dense = {
    "data": dense_embeddings,
    "anns_field": "dense_vector",
    "param": {
        "metric_type": "COSINE",
        "params": {"nprobe": 10},
    },
    "limit": 100,  # TODO hybrid search bug https://github.com/milvus-io/milvus/issues/32288
}
search_param_sparse = {
    "data": sparse_embeddings,
    "anns_field": ...
```
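In case it helps, here is a minimal sketch of how I would expect these two parameter sets to be wired into pymilvus 2.4's hybrid search via `AnnSearchRequest` and `RRFRanker`. The connection URI, collection name, the sparse field name and its `IP` metric, and the `text` output field are my own assumptions, not taken from this thread:

```python
from pymilvus import connections, Collection, AnnSearchRequest, RRFRanker

# Assumes a collection with "dense_vector" and "sparse_vector" fields already
# created, indexed and loaded; dense_embeddings / sparse_embeddings are the
# query embeddings produced earlier in this thread.
connections.connect(uri="http://localhost:19530")          # assumed local server
collection = Collection("my_hybrid_collection")             # hypothetical name

req_dense = AnnSearchRequest(
    data=dense_embeddings,
    anns_field="dense_vector",
    param={"metric_type": "COSINE", "params": {"nprobe": 10}},
    limit=100,
)
req_sparse = AnnSearchRequest(
    data=sparse_embeddings,
    anns_field="sparse_vector",                              # assumed field name
    param={"metric_type": "IP", "params": {}},               # assumed sparse metric
    limit=100,
)

# Fuse the two candidate lists with reciprocal-rank fusion and keep the top 10.
results = collection.hybrid_search(
    reqs=[req_dense, req_sparse],
    rerank=RRFRanker(),
    limit=10,
    output_fields=["text"],                                  # assumed scalar field
)
```

As I understand it, each `AnnSearchRequest`'s `limit` caps the candidates returned by its own sub-search, while the outer `limit` on `hybrid_search` caps the fused result.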
Is there a reason?
"Has the issue mentioned in the error message been resolved after upgrading pymilvus?" NO.
I don't understand the difference between `limit: 10` and `limit: 1000`, because you will eventually calculate the similarity scores across all entities and select the top 10, or top...
Hi, I ran into a similar error. I use the BM25 embedding function and call `encode_queries`: `sparse_embeddings = self.bm25_ef.encode_queries([rewritten_query])`, but the sparse embedding comes back empty. Why is that? The bm25_ef is def...
I wonder how the Milvus built-in BM25EmbeddingFunction embeds an unseen word in the query. From my observation, it gives nothing (None). What is the best solution if the tokens...
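For what it's worth, a minimal sketch of the fit/encode flow I would expect with the built-in `BM25EmbeddingFunction` (the corpus, analyzer language and variable names are illustrative assumptions). Since the term statistics come from `fit()`, query tokens that never appeared in the fitted corpus contribute no weight, which matches the empty result described above:

```python
from pymilvus.model.sparse import BM25EmbeddingFunction   # needs pymilvus[model]
from pymilvus.model.sparse.bm25.tokenizers import build_default_analyzer

# Illustrative corpus; in practice this should be the same corpus you indexed.
corpus = [
    "Milvus supports dense, sparse and hybrid vector search.",
    "BM25 turns text into sparse vectors weighted by term statistics.",
]

analyzer = build_default_analyzer(language="en")
bm25_ef = BM25EmbeddingFunction(analyzer)
bm25_ef.fit(corpus)   # builds the vocabulary / IDF statistics used for encoding

doc_embeddings = bm25_ef.encode_documents(corpus)

# A query whose tokens occur in the fitted corpus gets non-zero weights.
in_vocab = bm25_ef.encode_queries(["sparse vector search"])
# A query made only of tokens never seen during fit() ends up with no
# non-zero entries, i.e. the "empty" sparse embedding reported in this thread.
out_of_vocab = bm25_ef.encode_queries(["zyxwvut"])

print(in_vocab.nnz, out_of_vocab.nnz)
```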
> > Hi, I ran into a similar error. I use the BM25 embedding function and call `encode_queries`: `sparse_embeddings = self.bm25_ef.encode_queries([rewritten_query])`, but the sparse embedding comes back empty. Why is that? The bm25_ef...