torch-rnn
Add epoch and validation loss to checkpoint
Update the checkpoint filename to include the epoch value and the validation loss, so we can choose to sample from a checkpoint with a lower (or not!) validation loss. Borrowed from Karpathy's char-rnn implementation.
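As a rough sketch of the idea (in Python, with a hypothetical helper and filename pattern; the actual naming scheme would follow char-rnn's conventions), embedding both values in the filename makes checkpoints easy to compare at a glance:

```python
def checkpoint_filename(base, epoch, val_loss):
    # Hypothetical example: encode the (possibly fractional) epoch and
    # the validation loss directly in the checkpoint's filename, so you
    # can pick which checkpoint to sample from without loading each one.
    return "%s_epoch%.2f_%.4f.t7" % (base, epoch, val_loss)

print(checkpoint_filename("cv/checkpoint", 12.5, 1.2345))
# cv/checkpoint_epoch12.50_1.2345.t7
```

The epoch can be fractional because checkpoints may be written mid-epoch, at a fixed iteration interval.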
Looks good. I think versioning checkpoints on more parameters is a sensible idea. Quick question: why do we need to store both the epoch value and the iteration number? Can epochs not be derived from the iteration count?