FlagEmbedding
Can I fine-tune e5-mistral-7b-instruct using this repo?
Dear authors,
In addition to the bge series, I would like to see how other embedding models perform on my own custom dataset.
I was wondering whether I can use the following fine-tuning script to fine-tune e5-mistral-7b-instruct:
https://github.com/FlagOpen/FlagEmbedding/tree/master/examples/finetune
This script is not suitable for fine-tuning e5-mistral-7b-instruct.
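For context, below is a minimal, hypothetical sketch of what fine-tuning a decoder-style embedding model such as e5-mistral-7b-instruct typically involves (LoRA adapters, last-token pooling, and an in-batch-negative contrastive loss). This is not the FlagEmbedding finetune script; the checkpoint name, LoRA settings, instruction format, and toy data below are assumptions for illustration only.

```python
# Hypothetical sketch: LoRA fine-tuning of a decoder-style embedding model
# with last-token pooling and an in-batch-negative InfoNCE loss.
# Not the FlagEmbedding finetune script; names and hyperparameters are assumptions.
import torch
import torch.nn.functional as F
from torch.optim import AdamW
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model

MODEL_NAME = "intfloat/e5-mistral-7b-instruct"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token  # Mistral has no pad token by default
tokenizer.padding_side = "right"           # so the last real token sits at mask.sum() - 1

model = AutoModel.from_pretrained(MODEL_NAME, torch_dtype=torch.bfloat16)
model = get_peft_model(
    model,
    LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
               target_modules=["q_proj", "v_proj"],
               task_type="FEATURE_EXTRACTION"),
)
model.train()

def encode(texts):
    """Embed texts using the hidden state of the last non-padding token."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=512, return_tensors="pt")
    hidden = model(**batch).last_hidden_state        # (B, T, H)
    last = batch["attention_mask"].sum(dim=1) - 1    # index of last real token
    emb = hidden[torch.arange(hidden.size(0)), last] # (B, H)
    return F.normalize(emb, dim=-1)

# Placeholder (query, positive passage) pairs -- replace with your own dataset.
pairs = [
    ("Instruct: Retrieve relevant passages.\nQuery: what is a llama?",
     "The llama is a domesticated South American camelid."),
    ("Instruct: Retrieve relevant passages.\nQuery: capital of France",
     "Paris is the capital and largest city of France."),
]

optimizer = AdamW(model.parameters(), lr=1e-4)
temperature = 0.02

for step in range(10):  # toy loop; real training would iterate over a DataLoader
    queries, passages = zip(*pairs)
    q_emb, p_emb = encode(list(queries)), encode(list(passages))
    scores = q_emb @ p_emb.T / temperature  # other pairs in the batch act as negatives
    labels = torch.arange(scores.size(0))
    loss = F.cross_entropy(scores, labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The key differences from the bge-style recipe are the decoder backbone, last-token pooling instead of CLS/mean pooling, and the instruction-prefixed queries, which is why the linked script is not a drop-in fit.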