treelstm.pytorch
Checkpoint saving may not be appropriate.
In your code:
if best < test_pearson:
    best = test_pearson
    checkpoint = {
        'model': trainer.model.state_dict(),
        'optim': trainer.optimizer,
        'pearson': test_pearson, 'mse': test_mse,
        'args': args, 'epoch': epoch
    }
    logger.debug('==> New optimum found, checkpointing everything now...')
    torch.save(checkpoint, '%s.pt' % os.path.join(args.save, args.expname))
Here test_pearson is used where dev_pearson should be: the test score should not be used to choose the best model. Selecting the checkpoint with the highest dev_pearson score instead, I got a test result of Pearson: 0.8616, MSE: 0.2626.
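A minimal sketch of the change I have in mind, reusing the names from the snippet above and assuming dev_pearson and dev_mse are computed each epoch in the same way as test_pearson and test_mse (those names, and best_epoch, do not appear in the quoted code and are only illustrative):

# Sketch: select the checkpoint on the dev split, not the test split.
# dev_pearson / dev_mse / best_epoch are assumed names; the test metrics
# are stored only for later reporting and never drive model selection.
if best < dev_pearson:
    best = dev_pearson
    best_epoch = epoch
    checkpoint = {
        'model': trainer.model.state_dict(),
        'optim': trainer.optimizer,
        'dev_pearson': dev_pearson, 'dev_mse': dev_mse,
        'test_pearson': test_pearson, 'test_mse': test_mse,
        'args': args, 'epoch': epoch
    }
    logger.debug('==> New optimum on dev found, checkpointing everything now...')
    torch.save(checkpoint, '%s.pt' % os.path.join(args.save, args.expname))

With this, the reported test numbers come from the single checkpoint that did best on dev, rather than from whichever epoch happened to peak on the test set.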