MoEBERT
"Need to turn the model to a MoE first" error
I just removed the "--do_train" and "--do_eval" lines in bert_base_mnli_example.sh and added a "--do_predict" line. But when I run it, the "Need to turn the model to a MoE first" error occurs. I wonder why this happens, thanks a lot.
And the model being loaded is one that has already been trained.
Also, when I use bert_base_mnli_example.sh, add a --preprocess_importance argument, and remove the --do_train argument to compute the importance scores, I get "FileNotFoundError: [Errno 2] No such file or directory: 'importance_files/importance_mnli.pkl'". Where can I get that file? Thanks a lot.
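For anyone trying to reproduce this, the two invocations described above look roughly like the sketch below. This is a minimal sketch, not the full bert_base_mnli_example.sh: the run_glue.py entry point, the checkpoint paths, and the --model_name_or_path / --task_name / --output_dir flags are assumptions based on the standard Transformers GLUE example; only --do_predict and --preprocess_importance come from this thread.

```bash
export TASK_NAME=mnli

# Importance preprocessing (sketch): drop --do_train and add
# --preprocess_importance, keeping evaluation so the scores can be computed.
# Paths below are placeholders.
python run_glue.py \
  --model_name_or_path /path/to/finetuned-dense-bert \
  --task_name $TASK_NAME \
  --do_eval \
  --preprocess_importance \
  --output_dir /path/to/output

# Prediction only (sketch): drop --do_train and --do_eval, add --do_predict.
# The checkpoint must already have been converted to a MoE, otherwise the
# "Need to turn the model to a MoE first" error is raised.
python run_glue.py \
  --model_name_or_path /path/to/moebert-checkpoint \
  --task_name $TASK_NAME \
  --do_predict \
  --output_dir /path/to/output
```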
Hi @shadymcy, have you solved these problems? I have encountered the same! Many thanks!
@CaffreyR I have not... sorry.
Commenting out the importance_processor in transformers/models/bert/modeling_bert_moe.py works as a workaround:
```python
# self.importance_processor = ImportanceProcessor(config, layer_idx, config.moebert_expert_num, 0)
# if not self.importance_processor.is_moe:
#     raise RuntimeError("Need to turn the model to a MoE first.")
```
@shadymcy @CaffreyR @Harry-zzh