GRUV
How to continue training without losing the song structure | hidden_dimension_size handling
I've now tested your code with a lot of different song mixes, played around with the `nn_params` values, and have a few questions about training in general:
```python
#For best results, this should be >= freq_space_dims, but most consumer GPUs can't handle large sizes
nn_params['hidden_dimension_size'] = 1024
```
What do you mean by `freq_space_dims` – where is this value set?
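From reading train.py, my assumption is that `freq_space_dims` is simply the size of the last axis of the training tensor, i.e. the number of frequency bins per time step produced by the FFT preprocessing – roughly like this (the file name is a placeholder for my own converted dataset):

```python
import numpy as np

# Assumption: X_train has shape (num_examples, num_timesteps, frequency_bins),
# so freq_space_dims would be the size of the last axis.
X_train = np.load('YourMusicLibraryNP_x.npy')  # placeholder path for my converted dataset
freq_space_dims = X_train.shape[2]
print('freq_space_dims =', freq_space_dims)
```

Is that roughly correct, or is it set somewhere else?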
How big should `hidden_dimension_size` ideally be – even if it demands more GPU memory than what's available? I have a GTX 780 with 6GB VRAM and try to use as much of that memory as possible.
When I increase `hidden_dimension_size`, I quickly noticed that the required memory also increases drastically. I could go up to a batch size of 100-160 with a value of 1024, but with 2048 or 2560 I had to reduce the batch size to 20-60. The maximum value I could work with was 3072.
So is it better to train with a large batch size, or with smaller batches but a higher `hidden_dimension_size`? What is the best balance between the two?
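To get a feeling for why the memory requirement grows so quickly, I did a rough back-of-the-envelope estimate of the recurrent layer's parameter count (my own sketch, assuming a single standard LSTM layer; the actual GRUV architecture may differ, and activations, gradients and optimizer state multiply the real footprint several times over):

```python
def lstm_param_count(input_dim, hidden_dim):
    # A standard LSTM layer has 4 gates, each with an input weight matrix,
    # a recurrent weight matrix and a bias vector.
    return 4 * (input_dim * hidden_dim + hidden_dim * hidden_dim + hidden_dim)

freq_space_dims = 8192  # placeholder; use the value from your own data
for hidden in (1024, 2048, 2560, 3072):
    params = lstm_param_count(freq_space_dims, hidden)
    print('hidden=%d  params=%d  weights only: %.1f MB (float32)'
          % (hidden, params, params * 4.0 / 1024 ** 2))
```

If I read that right, the batch size only affects the activation memory, while the weight matrices themselves scale with `hidden_dimension_size` – which would match what I'm seeing on the 780.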
I try to let the training run as long as possible in one go, but after 8-12 hours I have to give my GPU a rest or stop the training because I need the GPU for other projects.
But every time I restart the training from the last saved weights, it completely scrambles the previous song structure – whereas within a single run I could hear the results getting clearer from iteration to iteration.
So does continuing training from a previous point reset the progress – i.e. does it not refine the already generated song but simply start a new one?
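For reference, my resume step looks roughly like this (a sketch based on the old Keras API the project uses; `build_model` and the weights file name are placeholders for whatever train.py actually does):

```python
# Rebuild the identical architecture (same nn_params), load the last saved
# weights, recompile and keep fitting on the same data.
model = build_model(nn_params)                # placeholder for the network construction in train.py
model.load_weights('model_weights_iterXXXX')  # weights file written by the previous run (placeholder name)
model.compile(loss='mean_squared_error', optimizer='rmsprop')
model.fit(X_train, y_train, batch_size=batch_size, nb_epoch=num_epochs, verbose=1)
```

My guess is that this restores the weights but not the optimizer's internal state, which might explain why the first iterations after a restart sound worse – but I'd like to know whether that is actually what happens.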
Are there any other parameters I could crank up to speed up training or get better results after fewer iterations? And can you recommend settings for Theano (.theanorc parameters)?
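To make that last question concrete, this is the kind of `.theanorc` I mean (just a sketch using the standard flags of that Theano generation; the values are placeholders I've been experimenting with, not recommendations):

```
[global]
device = gpu0
floatX = float32
# allow_gc = False keeps intermediate buffers allocated, trading memory for speed
allow_gc = False

[lib]
# cnmem pre-allocates a fraction of VRAM up front to avoid repeated cudaMalloc calls
cnmem = 0.8

[nvcc]
fastmath = True
```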
Sorry, just one more question: do you know of any projects on GitHub that use a similar approach to yours (working with raw audio)? I know there are a lot of LSTMs for MIDI, but I'm more interested in generating raw audio.
Thanks for reading this, and I hope you find the time to answer my questions! Thanks again!