slot-attention-pytorch

PyTorch implementation of the paper "Object-Centric Learning with Slot Attention"

5 slot-attention-pytorch issues (sorted by recently updated)

The gradient with respect to the `slots_mu` and `slots_sigma` variables is zero. To learn the initialization of slots, you could change line 40 of your `model.py` to `slots = torch.distributions.Normal(mu, sigma).rsample()`... with...
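
A minimal sketch of the suggested fix, assuming `slots_mu` and `slots_sigma` are learnable `nn.Parameter` tensors as in the repository's slot-initialization code (the surrounding module structure here is an assumption, not the repository's exact code):

```python
import torch
import torch.nn as nn

class SlotInit(nn.Module):
    """Learnable slot initialization via the reparameterization trick."""
    def __init__(self, num_slots, dim):
        super().__init__()
        self.num_slots = num_slots
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        self.slots_sigma = nn.Parameter(torch.rand(1, 1, dim))

    def forward(self, batch_size):
        mu = self.slots_mu.expand(batch_size, self.num_slots, -1)
        sigma = self.slots_sigma.expand(batch_size, self.num_slots, -1)
        # rsample() draws mu + sigma * eps, which autograd can differentiate,
        # so slots_mu and slots_sigma receive non-zero gradients.
        # abs() + eps keeps the scale positive, since Normal requires sigma > 0.
        return torch.distributions.Normal(mu, sigma.abs() + 1e-8).rsample()
```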

Hello, and thank you for the great work. I think the eval script is missing the pre-trained model ('./tmp/model3.ckpt'), as indicated in the error I received. Would you please show me...

Hi, just wanted to confirm whether the shape of the image is `[batch_size, num_channels, width, height]`; usually it's `[batch_size, num_channels, height, width]`. https://github.com/evelinehong/slot-attention-pytorch/blob/1518c23c312f93b961f8c071136f0093b741863a/model.py#L176 Thanks and regards
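
For reference, PyTorch's convolution layers expect NCHW input, i.e. `[batch_size, num_channels, height, width]`; a standalone sketch (not code from the repository) illustrating the convention:

```python
import torch
import torch.nn as nn

# Conv2d consumes [batch, channels, height, width] (NCHW).
x = torch.randn(4, 3, 128, 64)  # batch=4, 3 channels, H=128, W=64
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
print(conv(x).shape)  # torch.Size([4, 8, 128, 64]): H and W are preserved

# Images loaded as [H, W, C] (e.g. from numpy/PIL) need a permute first:
img_hwc = torch.randn(128, 64, 3)
img_nchw = img_hwc.permute(2, 0, 1).unsqueeze(0)  # -> [1, 3, 128, 64]
```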

Resolved the non-trainability of `mu` and `sigma` through the reparameterization trick. Also replaced `sigma` with `log(sigma)` for numerical stability.
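
A sketch of that parameterization (only the log-sigma idea comes from the PR description; the module layout and names here are assumptions): storing `log(sigma)` and exponentiating guarantees a strictly positive standard deviation without clamping.

```python
import torch
import torch.nn as nn

class SlotInitLogSigma(nn.Module):
    """Slot initialization with a log-sigma parameter for numerical stability."""
    def __init__(self, num_slots, dim):
        super().__init__()
        self.num_slots = num_slots
        self.slots_mu = nn.Parameter(torch.randn(1, 1, dim))
        # Store log(sigma); exp() keeps the standard deviation strictly positive.
        self.slots_log_sigma = nn.Parameter(torch.zeros(1, 1, dim))

    def forward(self, batch_size):
        mu = self.slots_mu.expand(batch_size, self.num_slots, -1)
        sigma = self.slots_log_sigma.exp().expand(batch_size, self.num_slots, -1)
        # Reparameterized sample: gradients reach both mu and log-sigma.
        return torch.distributions.Normal(mu, sigma).rsample()
```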

Thanks for your implementation of the Slot Attention module. However, I found that the sampling operation (line 40 in `model.py`) prevents gradients from flowing during back-propagation. During training, the gradients of...
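
A self-contained way to reproduce what this issue describes (the `torch.normal` call mirrors the repository's original line 40 per the issue text; the tensor shapes are arbitrary):

```python
import torch

def grads_after_sampling(sample_fn):
    """Backprop through one sampled tensor; return |grad| sums of mu and sigma."""
    mu = torch.zeros(2, 4, 8, requires_grad=True)
    sigma = torch.ones(2, 4, 8, requires_grad=True)
    sample_fn(mu, sigma).sum().backward()
    return mu.grad.abs().sum().item(), sigma.grad.abs().sum().item()

# Plain sampling, as in the original code: zero gradients, as the issue reports.
print(grads_after_sampling(torch.normal))
# Reparameterized sampling: non-zero gradients flow to both parameters.
print(grads_after_sampling(lambda m, s: torch.distributions.Normal(m, s).rsample()))
```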