Tao Liu

Results: 3 issues by Tao Liu

Thanks for your excellent work. I have a question: why do we need to repeat the latent variable 14 times, as in `latent = latent.reshape((latent.shape[0], -1)).unsqueeze(1).repeat(1, inject_index, 1)`? Why...
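The quoted line broadcasts one latent vector across `inject_index` layers. A minimal NumPy sketch of the equivalent shape transformation (batch size, latent dimension, and `inject_index = 14` are illustrative values, not taken from the repository):

```python
import numpy as np

B, D = 2, 512          # hypothetical batch size and latent dimension
inject_index = 14      # number of layers the same latent is injected into

latent = np.random.randn(B, D)
# reshape((B, -1)) then unsqueeze(1): (B, D) -> (B, 1, D)
expanded = latent.reshape(B, -1)[:, None, :]
# repeat(1, inject_index, 1): copy the identical latent once per layer
repeated = np.repeat(expanded, inject_index, axis=1)
print(repeated.shape)  # (2, 14, 512)
```

Every slice along axis 1 is the same vector, so the repeat only duplicates the latent so that each of the 14 layers receives its own copy.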

Why is `unet_use_temporal_attention` always None or False? It seems temporal attention is not working. Does anyone know about this? Thanks!

Hi there, I noticed that the AR and NAR models in this repository are trained separately. I'm curious why this approach was taken. Is it to save memory...