torch-rnn
Sampling: incremental printing of generated text
While using `sample.lua` to generate text to inspect how the quality changes between checkpoints, it would be nice if each character could be printed as it's generated (like in char-rnn), so you don't have to wait for the full sample to be created before you see any output. This wait is particularly painful if you need to generate large amounts of text to see how it's going; for my metadata training, I need 10-20k characters to see it sample from several different authors, and it takes something like an hour to do that much.
`sample.lua` calls `LanguageModel.lua`, and inside `sample`, I think this could be done by adding a print call using the `next_char` variable?
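For illustration, here's a rough sketch of what that might look like inside the sampling loop in `LanguageModel.lua`. The exact loop structure shown here is an assumption, not the actual torch-rnn code; it just assumes `next_char` holds a 1x1 tensor of token ids each timestep and that `self.idx_to_token` maps ids back to characters:

```lua
-- Hypothetical sketch of the sampling loop in LanguageModel:sample()
-- (loop shape and surrounding code assumed, not copied from torch-rnn).
for t = first_t, T do
  -- ... existing code that samples next_char, a 1x1 tensor of token ids ...
  sampled[{{}, {t, t}}]:copy(next_char)

  -- Proposed addition: emit each character as soon as it is sampled.
  -- io.write avoids the trailing newline that print() would add, and
  -- io.flush pushes the character to the terminal immediately instead
  -- of leaving it in stdout's buffer until the sample finishes.
  io.write(self.idx_to_token[next_char[1][1]])
  io.flush()

  scores = self:forward(next_char)
end
```

If something like this were added (perhaps behind a `-verbose` flag), the caller in `sample.lua` would probably want to skip printing the returned string, since the text would already have been written incrementally.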