transformer
Reporting a bug during test time
@hyunwoongko thanks for your nice implementation. By the way, I want to point out an issue. If you notice, while testing, you are using the following code:
```python
def test_model(num_examples):
    iterator = test_iter
    # model.load_state_dict(torch.load("./saved/model-saved.pt"))
    with torch.no_grad():
        batch_bleu = []
        for i, batch in enumerate(iterator):
            src = batch.src
            trg = batch.trg
            output = model(src, trg[:, :-1])
            ...
```
I think this part has a significant problem that cannot be ignored. At test time we do not have any target: passing `trg[:, :-1]` to the model feeds the ground-truth target into the decoder (teacher forcing), which inflates the reported BLEU score. Instead, the target sentence should be generated token by token from the decoder's own previous outputs, so you need a decoding loop for that. You can take a look here.
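A minimal sketch of such a decoding loop (greedy decoding) could look like the following. Note this is an illustration, not code from the repo: the function name, the `sos_idx`/`eos_idx`/`max_len` parameters, and the assumption that the model returns logits of shape `(batch, seq_len, vocab)` are all my own.

```python
import torch

def greedy_decode(model, src, max_len, sos_idx, eos_idx):
    """Generate the target autoregressively instead of feeding ground truth."""
    # Start every sequence with the <sos> token.
    trg = torch.full((src.size(0), 1), sos_idx, dtype=torch.long)
    with torch.no_grad():
        for _ in range(max_len - 1):
            # Model sees the source and everything generated so far.
            output = model(src, trg)  # assumed shape: (batch, cur_len, vocab)
            # Pick the most likely next token from the last position.
            next_token = output[:, -1, :].argmax(dim=-1, keepdim=True)
            trg = torch.cat([trg, next_token], dim=1)
            # Stop early once every sequence in the batch emitted <eos>.
            if (next_token == eos_idx).all():
                break
    return trg
```

The generated `trg` (rather than the ground-truth target) would then be detokenized and compared against the reference when computing BLEU. Beam search would be a stronger alternative to the greedy `argmax` step, but the key point is the same: the decoder input must come from the model's own previous predictions.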