MoEBERT

"Need to turn the model to a MoE first" error

Open Harry-zzh opened this issue 2 years ago • 5 comments

I just removed the "--do_train" and "--do_eval" lines in bert_base_mnli_example.sh and added a "--do_predict" line. But when I run it, a "Need to turn the model to a MoE first" error occurs. I wonder why this happens, thanks a lot.

Harry-zzh avatar May 10 '22 04:05 Harry-zzh

And the loaded model is an already-trained model.

Harry-zzh avatar May 10 '22 04:05 Harry-zzh

And when I use bert_base_mnli_example.sh, add a --preprocess_importance argument, and remove the --do_train argument to compute the importance scores, a "FileNotFoundError: [Errno 2] No such file or directory: 'importance_files/importance_mnli.pkl'" error occurs. Where can I get that file? Thanks a lot.

shadymcy avatar Jul 30 '22 13:07 shadymcy

Hi @shadymcy, have you solved this problem? I have encountered the same one! Many thanks!

CaffreyR avatar Aug 17 '22 14:08 CaffreyR

@CaffreyR I have not... sorry.

shadymcy avatar Aug 18 '22 06:08 shadymcy

Commenting out the importance_processor check in transformers/models/bert/modeling_bert_moe.py will work:

    #self.importance_processor = ImportanceProcessor(config, layer_idx, config.moebert_expert_num, 0)

    #if not self.importance_processor.is_moe:
    #    raise RuntimeError("Need to turn the model to a MoE first.")
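If you would rather not delete the check outright, a gentler alternative (a sketch of my own, not code from the MoEBERT repo; the helper name and default path are assumptions) is to test for the importance pickle up front and fail with a message that says how to generate it, which also explains the FileNotFoundError above:

```python
# Hypothetical helper, NOT part of MoEBERT: load the importance scores if the
# pickle exists, otherwise raise a FileNotFoundError that tells the user which
# preprocessing step produces the file.
import os
import pickle


def load_importance(path="importance_files/importance_mnli.pkl"):
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} not found; generate it first by running the example "
            "script with --preprocess_importance (and without --do_train)."
        )
    with open(path, "rb") as f:
        return pickle.load(f)
```

This keeps the MoE conversion path intact while making the missing-file case self-explanatory instead of crashing deep inside model construction.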

@shadymcy @CaffreyR @Harry-zzh

wintersurvival avatar Dec 08 '22 04:12 wintersurvival