
AttributeError: 'Bertram' object has no attribute 'transformer'


I'm getting this error when training the subword section. For some reason, the @property decorator and getattr() don't seem to be working together correctly. Is this a problem on your end?

rajicon commented on Dec 03 '21

I've looked deeper into the problem, and it seems to be that getattr() raises an AttributeError, which in turn makes the @property lookup fail. Here is the traceback when getattr() is called:

File "/home/rpatel17/gensim_experiments/bertram-master/bertram.py", line 196, in setup if not isinstance(self.transformer().embeddings.word_embeddings, OverwriteableEmbedding) and form_and_context: File "/home/rpatel17/gensim_experiments/bertram-master/bertram.py", line 189, in transformer return getattr(self, self.bertram_config.transformer_cls) File "/cm/shared/apps/pytorch/1.4.0-p36/lib/python3.6/site-packages/torch/nn/modules/module.py", line 576, in getattr type(self).name, name)) AttributeError: 'Bertram' object has no attribute 'bert'

Any advice? I'm using bert-base-uncased as my model name.

rajicon commented on Dec 03 '21

So I think this is solved by changing line 193 in bertram.py to:

    if form_and_context and not isinstance(self.transformer.embeddings.word_embeddings, OverwriteableEmbedding):

By swapping the two conditions, self.transformer is never evaluated when form_and_context is False, which is the case in the subword training scenario.
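As a quick, self-contained illustration of the short-circuiting this relies on (made-up class, not the real code):

    class Demo:
        @property
        def transformer(self):
            # stands in for the property that fails when no BERT module is registered
            raise AttributeError("would fail if evaluated")

    d = Demo()
    form_and_context = False

    # With form_and_context first, the and-expression short-circuits, so the
    # right-hand side (and therefore the failing property) is never evaluated.
    if form_and_context and d.transformer is not None:
        pass
    print("reached without touching d.transformer")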

Is this correct?

EDIT: Line 193, not 197

rajicon commented on Dec 07 '21

Hi @rajicon, sorry for the late reply! I recall that I added the concept of OverwriteableEmbeddings at the very end of this research project and didn't retrain a form-only model after that, so I must have missed this. Your solution looks absolutely correct to me :)

timoschick commented on Dec 07 '21

@timoschick Thanks! Also, another quick question: running the form-only section seems to take quite long (for comparison, I believe context training was roughly 3.5 hours per epoch, while subwords is 6.5). I haven't had a chance to look into this more deeply, but did you see similar behavior when you ran it? Or is there a possible explanation for why form-only takes that long?

rajicon commented on Dec 07 '21