pointer_summarizer

'Encoder' object has no attribute 'tx_proj'

Open · jivatneet opened this issue Jan 24 '21 · 3 comments

I was able to run the LSTM-based pointer-generator successfully. While running the transformer_encoder branch with use_lstm=False, I encounter this error:

File "training_ptr_gen/train.py", line 400, in <module>
    train_processor.trainIters(config.max_iterations, args.model_file_path)
  File "training_ptr_gen/train.py", line 341, in trainIters
    loss = self.train_one_batch(batch)
  File "training_ptr_gen/train.py", line 273, in train_one_batch
    encoder_outputs, encoder_feature, encoder_hidden = self.model.encoder(enc_batch, enc_lens, enc_padding_mask)
  File "/srv/home/kaur/pointer_summarizer/lstmpg/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/srv/home/kaur/pointer_summarizer/training_ptr_gen/model.py", line 108, in forward
    word_embed_proj = self.tx_proj(embedded)
  File "/srv/home/kaur/pointer_summarizer/lstmpg/lib/python3.6/site-packages/torch/nn/modules/module.py", line 779, in __getattr__
    type(self).__name__, name))
torch.nn.modules.module.ModuleAttributeError: 'Encoder' object has no attribute 'tx_proj'

Any help regarding this would be appreciated, thanks.

jivatneet · Jan 24 '21 20:01

You can set use_lstm=True to handle this issue.

v-chuqin · Jan 26 '21 13:01

Thanks for the reply @v-chuqin, but I wanted to use the transformer-based encoder and hence set use_lstm=False. With use_lstm=True, it is working fine.

jivatneet · Jan 26 '21 13:01

How did you solve this? I want to use the transformer encoder and I get the same error.

Tinarights · Dec 27 '21 05:12
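
For anyone hitting this on the transformer_encoder branch: the traceback shows that Encoder.forward calls self.tx_proj(embedded), but no module named tx_proj is ever registered on the Encoder, so PyTorch's __getattr__ raises ModuleAttributeError. Below is a minimal sketch of the kind of projection layer the forward pass appears to expect. This is not the repository's actual model.py: the class layout, dimension names, and sizes are assumptions for illustration only, and the real branch also returns encoder_feature and encoder_hidden.

import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Sketch only: shows that tx_proj must be created in __init__ before
    # forward() can reference it. Sizes below are illustrative assumptions.
    def __init__(self, vocab_size=50000, emb_dim=128, d_model=256,
                 n_heads=4, n_layers=2):
        super(Encoder, self).__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # The missing attribute: project word embeddings up to the
        # transformer's model dimension.
        self.tx_proj = nn.Linear(emb_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads)
        self.transformer = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, enc_batch, enc_lens=None, enc_padding_mask=None):
        embedded = self.embedding(enc_batch)          # (batch, seq, emb_dim)
        word_embed_proj = self.tx_proj(embedded)      # (batch, seq, d_model)
        # nn.TransformerEncoder expects (seq, batch, d_model) by default;
        # padding-mask handling is omitted in this sketch.
        encoder_outputs = self.transformer(
            word_embed_proj.transpose(0, 1)).transpose(0, 1)
        return encoder_outputs

# Quick smoke test of the sketch:
# enc = Encoder()
# out = enc(torch.randint(0, 50000, (8, 40)))   # -> shape (8, 40, 256)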