bert-toxic-comments-multilabel

AttributeError: 'BertForMultiLabelSequenceClassification' object has no attribute 'module'

Open amrjlg opened this issue 3 years ago • 3 comments

When I run the notebook up to this cell:

model.module.unfreeze_bert_encoder()

I get this error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-48-e5502767395c> in <module>
----> 1 model.module.unfreeze_bert_encoder()

c:\users\jiang\.conda\envs\python3.6\lib\site-packages\torch\nn\modules\module.py in __getattr__(self, name)
    946                 return modules[name]
    947         raise AttributeError("'{}' object has no attribute '{}'".format(
--> 948             type(self).__name__, name))
    949 
    950     def __setattr__(self, name: str, value: Union[Tensor, 'Module']) -> None:

AttributeError: 'BertForMultiLabelSequenceClassification' object has no attribute 'module'

What did I miss?

amrjlg, Mar 29 '21 09:03
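For context, the .module attribute only exists when the model has been wrapped in torch.nn.DataParallel; a bare BertForMultiLabelSequenceClassification has no such attribute, which is exactly what the traceback reports. A minimal defensive sketch, assuming model is the instance built earlier in the notebook:

# Unwrap only if the model was wrapped by torch.nn.DataParallel,
# which exposes the original network as `.module`; otherwise use
# the model object directly.
inner = getattr(model, "module", model)
inner.unfreeze_bert_encoder()

If the model was never wrapped (for example in a CPU-only or single-GPU run), calling model.unfreeze_bert_encoder() directly should also work.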

These are my args:

args = {
    "train_size": -1,
    "val_size": -1,
    "full_data_dir": DATA_PATH,
    "data_dir": PATH,
    "task_name": "toxic_multilabel",
    "bert_model": BERT_PRETRAINED_PATH,
    "output_dir": CLAS_DATA_PATH/'output',
    "max_seq_length": 512,
    "do_train": True,
    "do_eval": True,
    "do_lower_case": True,
    "train_batch_size": 8,
    "eval_batch_size": 8,
    "learning_rate": 3e-5,
    "num_train_epochs": 4.0,
    "warmup_proportion": 0.1,
    "no_cuda": True,
    "local_rank": -1,
    "seed": 42,
    "gradient_accumulation_steps": 1,
    "optimize_on_cpu": True,
    "fp16": False,
    "loss_scale": 128
}

amrjlg, Mar 29 '21 09:03
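Since the args show "no_cuda": True, the model would stay on CPU and never be wrapped in DataParallel, so .module is never created. A rough sketch of the usual device setup in pytorch-pretrained-bert-era training scripts (an assumption, not necessarily this notebook's exact code):

import torch

# With no_cuda=True, device resolves to CPU and n_gpu is 0, so the
# DataParallel branch below is never taken and `.module` never exists.
device = torch.device("cuda" if torch.cuda.is_available() and not args["no_cuda"] else "cpu")
n_gpu = torch.cuda.device_count() if device.type == "cuda" else 0

model.to(device)
if n_gpu > 1:
    # Only this branch creates the `.module` attribute on `model`.
    model = torch.nn.DataParallel(model)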

Python version is 3.6.

amrjlg, Mar 29 '21 09:03

pytorch-pretrained-bert version is 0.6.2

amrjlg, Mar 29 '21 09:03