
Reporting a bug during test time

Open mediadepp opened this issue 2 years ago • 0 comments

@hyunwoongko thanks for your nice implementation. I'd like to point out an issue, though. While testing, you are using the following code:

def test_model(num_examples):
    iterator = test_iter
    # model.load_state_dict(torch.load("./saved/model-saved.pt"))

    with torch.no_grad():
        batch_bleu = []
        for i, batch in enumerate(iterator):
            src = batch.src
            trg = batch.trg
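            # note: trg[:, :-1] below is the ground-truth target, i.e. teacher forcing at test time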
            output = model(src, trg[:, :-1])
            ...

I think this part has a significant problem that cannot be ignored. At test time we do not have access to the target sentence: it has to be generated word by word from the decoder's own output, so you need a decoding loop for that. You can take a look here.
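
For what it's worth, here is a minimal sketch of how the test loop could decode autoregressively instead. This is only an illustration, not the repo's API: the helper name greedy_decode and the arguments trg_sos_idx, trg_eos_idx, and max_len are my assumptions.

import torch

def greedy_decode(model, src, trg_sos_idx, trg_eos_idx, max_len=50):
    # Hypothetical helper (names are assumptions): build the target sequence
    # token by token from the decoder's own predictions instead of feeding
    # the ground-truth trg to the decoder.
    model.eval()
    with torch.no_grad():
        batch_size = src.size(0)
        # every sequence starts with the <sos> token
        trg = torch.full((batch_size, 1), trg_sos_idx,
                         dtype=torch.long, device=src.device)
        for _ in range(max_len - 1):
            output = model(src, trg)                   # (batch, cur_len, vocab_size)
            next_token = output[:, -1, :].argmax(-1, keepdim=True)
            trg = torch.cat([trg, next_token], dim=1)  # append the predicted token
            if (next_token == trg_eos_idx).all():      # stop once every sequence hit <eos>
                break
    return trg

With something like this, the test loop would call greedy_decode(model, src, ...) instead of model(src, trg[:, :-1]) and compute BLEU against batch.trg.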

mediadepp · Jul 11 '22 12:07