
Nomic Embed Text V2 with Mixture-of-Experts (MoE) architecture

Open manyoso opened this issue 7 months ago • 2 comments

  • Adds a MoE-based embedding model supporting multilingual embeddings.
  • Selects the architecture variant via hyperparameter detection (presence of MoE layers).
  • Removes unnecessary subclass initialization checks for clarity.
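The architecture-selection point above could be sketched as follows. This is a hedged illustration only, not the PR's actual code: the function name `select_architecture`, the `num_experts` hyperparameter key, and the variant names `"nomic-bert-moe"` / `"nomic-bert"` are assumptions chosen for the example, loosely modeled on how llama.cpp conversion scripts choose an architecture from a model's config.

```python
# Hypothetical sketch: pick an architecture variant from hyperparameters.
# The key "num_experts" and the variant names are assumptions for
# illustration, not the names used in the actual PR.

def select_architecture(hparams: dict) -> str:
    """Return the MoE variant if the config declares expert layers,
    otherwise fall back to the dense variant."""
    num_experts = hparams.get("num_experts", 0)
    if num_experts and num_experts > 1:
        return "nomic-bert-moe"
    return "nomic-bert"


# Usage: a config with experts selects the MoE variant; a dense
# config without the field falls back to the base architecture.
print(select_architecture({"num_experts": 8}))  # nomic-bert-moe
print(select_architecture({}))                  # nomic-bert
```

Keying the choice off a hyperparameter rather than a separate model type keeps a single conversion path for both dense and MoE checkpoints.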

https://www.nomic.ai/blog/posts/nomic-embed-text-v2


manyoso — Mar 19 '25 13:03