vector-quantize-pytorch
Allow for sample_codebook_temp to be annealed?
Hi! I've seen commit 1da641b (version = '1.5.16') with the message 'allow for temperature to be annealed'.
But sample_codebook_temp seems to stay constant during training. How do I turn on temperature annealing?
@Pachark hey Patrick! glad you are interested in this, and i am curious about your results if you do try it. i saw a lot of success using the stochastic sampling technique (but don't take my word for it, as i'm a poor experimentalist)

so you can actually override sample_codebook_temp on the forward call of the vq module, passing in a value from your preferred annealing schedule. i haven't thought of a good way to manage the schedule within the class itself yet, not without complicating things
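in other words, the schedule lives in your training loop, not in the module. a minimal sketch of what that could look like — `annealed_temp` and the start/end values are hypothetical choices, not part of the library; the commented-out lines assume a `VectorQuantize` instance `vq` whose forward accepts `sample_codebook_temp`:

```python
def annealed_temp(step, total_steps, start=1.0, end=0.0):
    """Linearly anneal the sampling temperature from `start` to `end`.

    At step 0 this returns `start`; at `total_steps` (and beyond) it
    returns `end`. Any schedule (cosine, exponential, ...) works the
    same way — just compute a float per step.
    """
    frac = min(step / total_steps, 1.0)
    return start + frac * (end - start)

# inside your training loop (vq is your VectorQuantize module):
# for step in range(total_steps):
#     temp = annealed_temp(step, total_steps)
#     quantized, indices, commit_loss = vq(x, sample_codebook_temp=temp)
```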