[Feature] Custom Text Embedding
Feature Request
Hi team,
I’d like to ask whether it’s possible to use BGE-M3 as the text embedding model. After testing it, I found that BGE-M3 provides better context retrieval performance compared to Nomic Embed Text v1.5 in my use case.
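For reference, this is a minimal sketch of the kind of side-by-side check I ran, done with sentence-transformers outside of GPT4All (only the dense embeddings of BGE-M3). The query and documents here are placeholders; the Hugging Face model IDs are the public ones:

```python
from sentence_transformers import SentenceTransformer, util

query = "How do I reset my account password?"
docs = [
    "To reset your password, open Settings and choose Security.",
    "Our office is closed on public holidays.",
]

# BGE-M3 dense embeddings, loaded directly from the Hugging Face hub
bge = SentenceTransformer("BAAI/bge-m3")
bge_scores = util.cos_sim(bge.encode(query), bge.encode(docs))

# Nomic Embed Text v1.5 expects task prefixes and trust_remote_code=True
nomic = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)
nomic_scores = util.cos_sim(
    nomic.encode(f"search_query: {query}"),
    nomic.encode([f"search_document: {d}" for d in docs]),
)

print("BGE-M3 cosine scores:", bge_scores)
print("Nomic v1.5 cosine scores:", nomic_scores)
```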
Are there any compatibility concerns or specific configuration steps required to integrate BGE-M3 as the embedding model in this project?
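On the Python side, something along these lines is what I had in mind. The GGUF filename and model directory are hypothetical placeholders, and whether the backend accepts a locally converted BGE-M3 GGUF at all is exactly my question:

```python
from gpt4all import Embed4All

# Hypothetical: point Embed4All at a local GGUF conversion of BGE-M3.
embedder = Embed4All(
    "bge-m3.f16.gguf",             # hypothetical local GGUF filename
    model_path="/path/to/models",  # placeholder directory holding the file
    allow_download=False,          # use only the local file, no hub lookup
)

# Quick sanity check that the model loads and returns a vector
vector = embedder.embed("sanity-check sentence")
print(len(vector))
```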
Thank you in advance for your help.
Best regards,