FlagEmbedding
Continued pretraining using RetroMAE
Hi, I'm trying out RetroMAE pretraining of your model on my domain data. Do you make available the encoder MLM head and decoder you used during the pretraining stage, so that continued pretraining can be performed on bge-large-en-v1.5 (BAAI/bge-large-en-v1.5)? Pretraining with random weight initialisation makes the process vastly more difficult. I would really appreciate access to the model checkpoints from after pretraining.
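
For context, here is roughly what I'm seeing (a minimal sketch of my own, not the FlagEmbedding pretraining script): loading the released checkpoint into a plain Hugging Face masked-LM wrapper reports which parameters have to be newly (randomly) initialised, which is where the missing MLM head shows up.

```python
# Minimal sketch: check which MLM-head parameters are missing from the
# released bge-large-en-v1.5 checkpoint. This uses plain Hugging Face
# transformers, not the authors' RetroMAE pretraining code.
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "BAAI/bge-large-en-v1.5"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# transformers logs a warning listing any newly initialised parameters
# (e.g. the cls.predictions.* head weights), which then have to be
# trained from scratch during continued RetroMAE pretraining.
model = AutoModelForMaskedLM.from_pretrained(model_name)
```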