bge-m3 default representation
Hello :)
First, thank you for your amazing work!
When using bge-m3 within langchain, what is the default representation of the encoding? Dense, or a mix of different types (sparse, etc.)?
Thanks for your interest in our work!
In langchain, you can use HuggingFaceEmbeddings to load bge-m3, which uses the sentence-transformers library to generate dense embeddings.
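For example, something like this should work (a minimal sketch; depending on your langchain version the import may be `from langchain.embeddings import HuggingFaceEmbeddings` instead):

```python
# Minimal sketch: load bge-m3 as a dense embedder via langchain.
# Requires sentence-transformers to be installed, since it is used under the hood.
from langchain_community.embeddings import HuggingFaceEmbeddings

embedder = HuggingFaceEmbeddings(
    model_name="BAAI/bge-m3",
    encode_kwargs={"normalize_embeddings": True},  # normalized dense vectors for cosine similarity
)

query_vec = embedder.embed_query("What is BGE-M3?")
doc_vecs = embedder.embed_documents(["BGE-M3 is a multilingual embedding model."])
print(len(query_vec))  # dense embedding dimension
```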
How can I generate sparse vectors with HuggingFaceEmbeddings, please?
Please refer to https://github.com/FlagOpen/FlagEmbedding/issues/585.
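In short, HuggingFaceEmbeddings only exposes the dense output; for the sparse (lexical) weights you can call the FlagEmbedding package directly, roughly like this (a minimal sketch based on the FlagEmbedding README):

```python
# Minimal sketch: generate sparse (lexical) weights with the FlagEmbedding package,
# since HuggingFaceEmbeddings only returns dense vectors.
from FlagEmbedding import BGEM3FlagModel

model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

output = model.encode(
    ["What is BGE-M3?"],
    return_dense=True,
    return_sparse=True,        # token-level lexical weights
    return_colbert_vecs=False,
)

print(output["dense_vecs"].shape)    # dense embeddings
print(output["lexical_weights"][0])  # sparse representation: {token: weight}
```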