
Shape error in generate.py

Open michaeldu1 opened this issue 4 years ago • 4 comments

Hi, thanks for your great work!! I was trying to reproduce your results and was able to train the model successfully. However, when trying to run generate.py, I had a shape error:

```
Enter the source sentence: the book is good
Enter the target style: pos or neg: neg
sentence shape is torch.Size([2, 1, 256])
Traceback (most recent call last):
  File "generate.py", line 42, in <module>
    target_tokenids = model.transfer_style(token_ids, target_style_id)
  File "/data/home/surviv/linguistic-style-transfer-pytorch/linguistic_style_transfer_pytorch/model.py", line 511, in transfer_style
    final_hidden_state)
  File "/data/home/surviv/linguistic-style-transfer-pytorch/linguistic_style_transfer_pytorch/model.py", line 227, in get_content_emb
    mu = self.content_mu(sentence_emb)
  File "/data/anaconda/envs/py35/lib/python3.5/site-packages/torch/nn/modules/module.py", line 547, in __call__
    result = self.forward(*input, **kwargs)
  File "/data/anaconda/envs/py35/lib/python3.5/site-packages/torch/nn/modules/linear.py", line 87, in forward
    return F.linear(input, self.weight, self.bias)
  File "/data/anaconda/envs/py35/lib/python3.5/site-packages/torch/nn/functional.py", line 1371, in linear
    output = input.matmul(weight.t())
RuntimeError: size mismatch, m1: [2 x 256], m2: [512 x 128] at /opt/conda/conda-bld/pytorch_1565272269120/work/aten/src/TH/generic/THTensorMath.cpp:752
```

The error seems to originate at line 227 of model.py. Do you know why this shape mismatch happens? Thanks so much for your work and help, I look forward to your response!!

michaeldu1 avatar Mar 12 '20 02:03 michaeldu1

@michaeldu1 I'm really glad you liked the work.

The issue seems to be in the shape of final_hidden_state, which is the output of self.encoder(embedded_seq) at line 504 of model.py. I think the error can be resolved by applying the same logic used at line 95 of forward() to obtain sentence_emb. Feel free to fix it and make a PR 😃.
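For reference, here is a minimal sketch of that idea. It assumes the encoder is a single-layer bidirectional GRU (which is what the torch.Size([2, 1, 256]) shape suggests) and that content_mu is a Linear(512, 128) layer, matching the size-mismatch message; the variable names mirror transfer_style but are illustrative rather than the exact repo code:

```python
import torch

# final_hidden_state from a 1-layer bidirectional GRU has shape
# (num_directions=2, batch=1, hidden_size=256).
final_hidden_state = torch.randn(2, 1, 256)

# Concatenate the forward and backward direction states along the feature
# dimension, the same way forward() builds sentence_emb, giving (1, 512).
sentence_emb = torch.cat(
    (final_hidden_state[0], final_hidden_state[1]), dim=1)

# Now the shape matches what content_mu, a Linear(512, 128), expects.
content_mu = torch.nn.Linear(512, 128)
mu = content_mu(sentence_emb)
print(mu.shape)  # torch.Size([1, 128])
```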

I'm really sorry for the late reply 😅.

h3lio5 avatar Mar 14 '20 14:03 h3lio5

I'm sorry for the late reply. Actually, I hadn't used this function because I hadn't used the saved embeddings to generate sentences. Today I looked through the source code carefully and found another bug: when the last batch has batch_size < mconfig.batch_size, it causes a shape error at line 439 of model.py:

```python
sos_token_tensor = torch.LongTensor(
    [gconfig.predefined_word_index['<sos>']]).cuda().unsqueeze(0).repeat(mconfig.batch_size, 1)
input_sentences = torch.cat(
    (sos_token_tensor, input_sentences), dim=1)
```

Here input_sentences.shape[0] is not equal to sos_token_tensor.shape[0], because the last batch is smaller than mconfig.batch_size. @h3lio5 @michaeldu1
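A minimal sketch of one possible fix (gconfig and input_sentences come from the surrounding repo code; the key change is to size the <sos> column by the actual batch rather than by mconfig.batch_size):

```python
# Use the real size of the current (possibly smaller, final) batch.
actual_batch_size = input_sentences.size(0)

# Build the <sos> column on the same device as the input batch.
sos_token_tensor = torch.LongTensor(
    [gconfig.predefined_word_index['<sos>']]).to(input_sentences.device)
sos_token_tensor = sos_token_tensor.unsqueeze(0).repeat(actual_batch_size, 1)

# Now both tensors agree in dimension 0 and the concatenation succeeds.
input_sentences = torch.cat((sos_token_tensor, input_sentences), dim=1)
```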

Doragd avatar Mar 15 '20 11:03 Doragd

@Doragd yes, I am experiencing the same error, and it appears to be exactly because of what you are describing. On the last iteration of the epoch the batch size does not match mconfig.batch_size, and the following error was obtained:

```
    sequences, seq_lens.squeeze(1), labels, bow_rep, iteration+1, epoch == mconfig.epochs-1)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/root/linguistic-style-transfer-pytorch/linguistic_style_transfer_pytorch/model.py", line 152, in forward
    sequences, generative_emb)
  File "/root/linguistic-style-transfer-pytorch/linguistic_style_transfer_pytorch/model.py", line 428, in generate_sentences
    (sos_token_tensor, input_sentences), dim=1)
RuntimeError: Sizes of tensors must match except in dimension 0. Got 49 and 128 (The offending index is 0)
Epoch:   0%|
```

If someone has any luck fixing this, please let me know! Thanks.
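In the meantime, one possible workaround, assuming the batches come from a standard torch.utils.data.DataLoader (I haven't verified how train.py actually builds its loader, so dataset and mconfig below are placeholders), is to drop the final incomplete batch:

```python
from torch.utils.data import DataLoader

# drop_last=True discards the final partial batch, so every batch the model
# sees has exactly mconfig.batch_size rows and the <sos> concatenation
# in generate_sentences() never hits the size mismatch.
train_loader = DataLoader(dataset, batch_size=mconfig.batch_size,
                          shuffle=True, drop_last=True)
```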

sharan21 avatar Apr 02 '21 09:04 sharan21


Sorry, I've given up on this project and have forgotten the details.

Doragd avatar Apr 07 '21 07:04 Doragd