are-16-heads-really-better-than-1

Is the code still able to run?

Open bing0037 opened this issue 3 years ago • 3 comments

Hi,

I am trying to reproduce your BERT results. I followed the prerequisites:

# Pytorch pretrained BERT
git clone https://github.com/pmichel31415/pytorch-pretrained-BERT
cd pytorch-pretrained-BERT
git checkout paul
cd ..
# Install pytorch-pretrained-BERT:
cd pytorch-pretrained-BERT
pip install .
cd ..
# Run the code:
bash experiments/BERT/heads_ablation.sh MNLI

But I got this error:

02:06:57-INFO: Weights of BertForSequenceClassification not initialized from pretrained model: ['classifier.weight', 'classifier.bias']
02:06:57-INFO: Weights from pretrained model not used in BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']
Traceback (most recent call last):
  File "pytorch-pretrained-BERT/examples/run_classifier.py", line 582, in <module>
    main()
  File "pytorch-pretrained-BERT/examples/run_classifier.py", line 275, in main
    model.bert.mask_heads(to_prune)
  File "/home/guest/anaconda3/envs/huggingface_env/lib/python3.6/site-packages/torch/nn/modules/module.py", line 594, in __getattr__
    type(self).__name__, name))
AttributeError: 'DataParallel' object has no attribute 'bert'


1(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error

Any ideas or suggestions?

bing0037 avatar Aug 28 '21 10:08 bing0037

Hi @bing0037, I haven't run this code in a while, but it used to work. My first guess would be an incompatibility with a newer version of pytorch. Can you try again in an environment with pytorch 1.0 or 1.1?

If that doesn't solve it then I'm not too sure... I wasn't using DataParallel in the code, so I'm not sure why it shows up in the error message. Let me know how changing the pytorch version goes.
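As for the trailing (standard_in) 2: syntax error lines: that message format is what bc prints when it is fed a malformed expression, presumably from the accuracy arithmetic the script pipes through bc, so my guess is they are just a downstream symptom of the Python crash. A rough illustration (the variable name here is hypothetical, not the one used in heads_ablation.sh):

# bc reports "(standard_in) N: syntax error" when its input is malformed,
# e.g. because an accuracy variable stayed empty after the run crashed.
ACC=""                      # empty because run_classifier.py exited with an error
echo "$ACC / 100" | bc -l   # -> (standard_in) 1: syntax error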

pmichel31415 avatar Aug 30 '21 09:08 pmichel31415

I ran into the same problem today and solved it by adding one line after the DataParallel wrapper in pytorch-pretrained-BERT/examples/run_classifier.py:

# around line 260
model = torch.nn.DataParallel(model)
+ model = model.module  # unwrap so that model.bert resolves again
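For context, here is a minimal sketch of why the lookup fails: nn.DataParallel stores the wrapped network under .module and does not forward arbitrary attributes, so model.bert stops resolving after the wrap. (TinyModel below is a made-up stand-in for BertForSequenceClassification.)

import torch.nn as nn

class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.bert = nn.Linear(4, 2)  # plays the role of the .bert submodule

model = TinyModel()
model = nn.DataParallel(model)
# model.bert              -> AttributeError: 'DataParallel' object has no attribute 'bert'
print(model.module.bert)  # the original submodule is still reachable via .module

Note that unwrapping with model = model.module drops the multi-GPU wrapper entirely; that was fine for my purposes, but if you need data parallelism you could instead change the failing call to model.module.bert.mask_heads(to_prune).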

Hope it helps.

Reference:

  1. https://blog.csdn.net/weixin_41990278/article/details/105127101
  2. https://zhuanlan.zhihu.com/p/92759707

caidongqi avatar Jun 09 '22 07:06 caidongqi

Hi @caidongqi, I tried changing the run_classifier.py file as you did, but ran into the same errors as @bing0037. I am also trying to reproduce the BERT results, using Python 3.8 and PyTorch 1.8.0.

1(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error
        0.00000(standard_in) 2: syntax error

Any ideas or solutions to this?

vrunm avatar Feb 04 '23 10:02 vrunm