TurboTransformers
Error when loading roberta with transformers 3.4.0
Hi, I am using your library with Roberta for sequence classification. The problem arises when I use the lib with the new transformers (3.4.0):
from transformers import AutoModelForSequenceClassification
import turbo_transformers
model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
turbo_model = turbo_transformers.RobertaModel.from_torch(model.roberta)
This worked fine with the default transformers 3.0.2, but the problem happened when I upgraded transformers to 3.4.0 (older versions cannot load the pretrained model from the model hub properly). The error shows:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/miniconda3/lib/python3.7/site-packages/turbo_transformers/layers/modeling_roberta.py", line 148, in from_torch
pooler = BertPooler.from_torch(model.pooler)
File "/opt/miniconda3/lib/python3.7/site-packages/turbo_transformers/layers/modeling_bert.py", line 379, in from_torch
pooler_params = to_param_dict(pooler)
File "/opt/miniconda3/lib/python3.7/site-packages/turbo_transformers/layers/utils.py", line 58, in to_param_dict
return {k: v for k, v in torch_module.named_parameters()}
AttributeError: 'NoneType' object has no attribute 'named_parameters'
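For context on the failure mode: in newer transformers, a Roberta backbone can be built without a pooling layer, in which case `model.pooler` is `None`, and calling `named_parameters()` on `None` raises exactly this `AttributeError`. A minimal sketch of the failure and a guarded variant (the function name mirrors turbo's `to_param_dict`, but this is an illustration, not the library's actual fix):

```python
# Sketch of the failure mode: a model whose `pooler` attribute is None,
# as RobertaForSequenceClassification's backbone exposes in newer
# transformers. Names mirror turbo_transformers' utils for illustration.

def to_param_dict(torch_module):
    # Original behaviour: assumes torch_module is a real module.
    return {k: v for k, v in torch_module.named_parameters()}

def to_param_dict_safe(torch_module):
    # Guarded variant: tolerate a missing (None) submodule.
    if torch_module is None:
        return {}
    return {k: v for k, v in torch_module.named_parameters()}

class FakeModel:
    pooler = None  # what the backbone exposes when pooling is disabled

model = FakeModel()

try:
    to_param_dict(model.pooler)
except AttributeError as e:
    print("unguarded:", e)  # 'NoneType' object has no attribute 'named_parameters'

print("guarded:", to_param_dict_safe(model.pooler))  # guarded: {}
```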
Maybe AutoModel is a new interface.
Is RobertaForSequenceClassification ok for you?
Both RobertaForSequenceClassification and AutoModelForSequenceClassification fail. I notice that the pooler became optional in later versions (specifically via the add_pooling_layer parameter). I did fix the problem with some modifications (from my pull request) and it works for my case, but it somehow violates your CI testing.
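One workaround sketch until a proper fix lands (assuming a transformers version where RobertaModel takes add_pooling_layer, and not validated against turbo's CI): load the backbone directly with RobertaModel, whose pooler is enabled by default, instead of taking it from the classification model, whose backbone drops the pooler.

```python
# Workaround sketch, not the library's official path: load the Roberta
# backbone directly so its `pooler` attribute is a real module, not None.
# Assumes transformers >= 3.1, where add_pooling_layer exists.
from transformers import RobertaModel
import turbo_transformers

# RobertaModel enables the pooling layer by default, unlike the backbone
# inside RobertaForSequenceClassification.
backbone = RobertaModel.from_pretrained("roberta-base")
turbo_model = turbo_transformers.RobertaModel.from_torch(backbone)
```

This avoids the None pooler at the cost of downloading the backbone weights a second time; whether the turbo model then matches the classification head is something you would need to verify for your use case.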
Turbo is not ready for transformers 3.4.0. You can maintain a local turbo version for yourself.
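Alternatively, if the pretrained models you need still load there, pinning transformers to the last version this thread reports working avoids the incompatibility entirely:

```shell
# Pin transformers to the version turbo_transformers is known to work with.
pip install "transformers==3.0.2"
```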