
Problems while running test.py

Open jieun1128 opened this issue 4 years ago • 8 comments

I tried to test the model, but I ran into the error below. Does anybody know how to solve it? I used the environment provided here.

python test.py --features_path data/coco_detections.hdf5 --annotation_folder annotations

///////

Meshed-Memory Transformer Evaluation
Evaluation:   0%|          | 0/500 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "test.py", line 77, in <module>
    scores = predict_captions(model, dict_dataloader_test, text_field)
  File "test.py", line 26, in predict_captions
    out, _ = model.beam_search(images, 20, text_field.vocab.stoi['<eos>'], 5, out_size=1)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/captioning_model.py", line 70, in beam_search
    return bs.apply(visual, out_size, return_probs, **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/beam_search/beam_search.py", line 71, in apply
    visual, outputs = self.iter(t, visual, outputs, return_probs, **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/beam_search/beam_search.py", line 104, in iter
    word_logprob = self.model.step(t, self.selected_words, visual, None, mode='feedback', **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/transformer/transformer.py", line 42, in step
    self.enc_output, self.mask_enc = self.encoder(visual)
  File "/home/aimaster/anaconda3/envs/m2release/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/transformer/encoders.py", line 62, in forward
    return super(MemoryAugmentedEncoder, self).forward(out, attention_weights=attention_weights)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/transformer/encoders.py", line 44, in forward
    out = l(out, out, out, attention_mask, attention_weights)
  File "/home/aimaster/anaconda3/envs/m2release/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/transformer/encoders.py", line 19, in forward
    att = self.mhatt(queries, keys, values, attention_mask, attention_weights)
  File "/home/aimaster/anaconda3/envs/m2release/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/transformer/attention.py", line 180, in forward
    out = self.attention(queries, keys, values, attention_mask, attention_weights)
  File "/home/aimaster/anaconda3/envs/m2release/lib/python3.6/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/aimaster/lab_storage/hi/meshed-memory-transformer/models/transformer/attention.py", line 130, in forward
    att = torch.matmul(q, k) / np.sqrt(self.d_k)  # (b_s, h, nq, nk)
RuntimeError: cublas runtime error : the GPU program failed to execute at /pytorch/aten/src/THC/THCBlas.cu:450

jieun1128 avatar Apr 25 '21 15:04 jieun1128

Hi, I wonder how long training takes. Is it fast?

Jennifer-6 avatar Sep 15 '21 11:09 Jennifer-6

When running test.py, in which folder should the pretrained model "mesh_memory_transformer.pth" be placed?

Jennifer-6 avatar Sep 16 '21 11:09 Jennifer-6

The reason may be that the torch version doesn't match the CUDA version.

Vincent-Ww avatar Nov 17 '21 14:11 Vincent-Ww
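A quick way to check for such a mismatch (a hedged sketch; only standard PyTorch attributes are used, no repo-specific code):

```python
# Hedged environment check: a cublas runtime error often means the installed
# torch wheel was built against a CUDA version that doesn't match the local
# driver/toolkit. These attributes are standard in PyTorch.
import torch

print(torch.__version__)          # installed torch version, e.g. the one in m2release
print(torch.version.cuda)         # CUDA version this torch build expects
print(torch.cuda.is_available())  # whether torch can see the GPU at all
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```

If `torch.version.cuda` disagrees with the CUDA version reported by `nvidia-smi`, reinstalling torch for the matching CUDA version is the usual fix.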

> When running test.py, in which folder should the pretrained model "mesh_memory_transformer.pth" be placed?

It should be in the repo root, at the same level as test.py.

Vincent-Ww avatar Nov 17 '21 14:11 Vincent-Ww
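A small sanity check for this (hedged sketch; the filename is taken from the question above, and the script assumes it is run from the repo root like test.py):

```python
# Hedged sketch: test.py refers to the checkpoint by a bare filename, so the
# file must sit in the directory the script is run from, i.e. the repo root.
import os

ckpt = 'mesh_memory_transformer.pth'  # filename as written in the question above
if os.path.isfile(ckpt):
    print('found', os.path.abspath(ckpt))
else:
    print('missing:', ckpt, '- place it next to test.py and run from the repo root')
```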

Hi. When I run test.py to evaluate, the model generates only '<unk>' tokens and nothing else. Has anyone run into the same problem?

Timon0327 avatar Dec 24 '21 07:12 Timon0327

> Hi. When I run test.py to evaluate, the model generates only '<unk>' tokens and nothing else. Has anyone run into the same problem?

No, I can generate captions fine. It sounds like you failed to load the vocab file into the M2 model.

Vincent-Ww avatar Dec 27 '21 05:12 Vincent-Ww

Thanks for your reply. But I loaded the 'vocab.pkl' file from the original repo, which maps index 0 to '<unk>'. When I run test.py, the model first generates many 0 values, and those are then decoded as the '<unk>' token.

Timon0327 avatar Dec 27 '21 06:12 Timon0327
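To illustrate the decoding step described above (a minimal sketch with a made-up vocabulary; in a torchtext-style vocab, index 0 is conventionally '<unk>'), a model that only ever predicts id 0 necessarily produces an all-'<unk>' caption:

```python
# Minimal sketch with a hypothetical vocabulary: if the model only predicts
# token id 0, and id 0 maps to '<unk>', the decoded caption is all '<unk>'.
itos = ['<unk>', '<pad>', '<bos>', '<eos>', 'a', 'man', 'riding', 'horse']
model_output_ids = [0, 0, 0, 0]  # the degenerate output described above

caption = ' '.join(itos[i] for i in model_output_ids)
print(caption)  # -> <unk> <unk> <unk> <unk>
```

So an all-'<unk>' caption points at the model emitting zeros (e.g. weights or vocab not loaded correctly), not at the decoding itself.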

> Thanks for your reply. But I loaded the 'vocab.pkl' file from the original repo, which maps index 0 to '<unk>'. When I run test.py, the model first generates many 0 values, and those are then decoded as the '<unk>' token.

I met the same problem during testing. Besides, I had already loaded the .pth checkpoint, the COCO detections, and the .pkl vocab file. I am still confused by this weird behavior.

JackBaron-s avatar Jun 09 '22 07:06 JackBaron-s