slot-attention-pytorch
Added reparam trick and log-sigma
Made mu and sigma trainable by sampling the initial slots via the reparameterization trick, and replaced sigma with log(sigma) for numerical stability.
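
For context, here is a minimal sketch of what this change might look like; the class and parameter names (`SlotInit`, `slots_mu`, `slots_log_sigma`) are illustrative, not necessarily the repository's actual identifiers:

```python
import torch
import torch.nn as nn

class SlotInit(nn.Module):
    """Learnable Gaussian slot initialization (illustrative sketch)."""

    def __init__(self, num_slots, dim):
        super().__init__()
        self.num_slots = num_slots
        # mu and log(sigma) are nn.Parameters, so gradients reach them
        # through the reparameterized sample in forward().
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        self.slots_log_sigma = nn.Parameter(torch.zeros(1, 1, dim))

    def forward(self, batch_size):
        mu = self.slots_mu.expand(batch_size, self.num_slots, -1)
        # exp() keeps sigma positive for any real-valued log_sigma,
        # which is the numerical-stability motivation for the switch.
        sigma = self.slots_log_sigma.exp().expand(batch_size, self.num_slots, -1)
        # Reparameterization trick: slots = mu + sigma * eps, eps ~ N(0, I).
        # The randomness lives in eps, so mu and sigma stay differentiable.
        return mu + sigma * torch.randn_like(mu)
```

Without the trick, sampling slots directly from a distribution parameterized by mu and sigma would block gradient flow into those parameters; factoring the noise out into `eps` keeps the sample a deterministic, differentiable function of mu and log(sigma).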