alexaatm
Update: **manually adding 0.1 instead of `self.interpolate_offset` solved the issue**. I assume this means the pretrained checkpoint has it set to 0.0. A sketch of the change is below.
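For reference, a minimal sketch of the workaround, assuming the `interpolate_pos_encoding` layout from the public facebookresearch/dinov2 repo; the standalone function and its argument names here are illustrative, not the repo's exact signature:

```python
import math
import torch
import torch.nn.functional as F

def interpolate_pos_encoding(pos_embed, w0, h0, offset=0.1):
    """Resize patch position embeddings to a (w0, h0) patch grid.

    pos_embed: [1, 1 + N, dim] tensor (class token + N patch tokens).
    offset: small epsilon added before computing the scale factor to
        avoid floating-point rounding in bicubic interpolation; the fix
        is hard-coding 0.1 here instead of reading the (apparently 0.0)
        `self.interpolate_offset` from the loaded model.
    """
    class_pos = pos_embed[:, :1]
    patch_pos = pos_embed[:, 1:]
    N, dim = patch_pos.shape[1], patch_pos.shape[2]
    M = int(math.sqrt(N))  # pretrained grid is M x M patches

    # floor(M * sx) == w0 as long as offset is a small positive number,
    # which is exactly what the 0.1 guards.
    sx = (w0 + offset) / M
    sy = (h0 + offset) / M
    patch_pos = F.interpolate(
        patch_pos.reshape(1, M, M, dim).permute(0, 3, 1, 2),
        scale_factor=(sx, sy),
        mode="bicubic",
    )
    patch_pos = patch_pos.permute(0, 2, 3, 1).reshape(1, -1, dim)
    return torch.cat([class_pos, patch_pos], dim=1)
```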
Hi @vladchimescu @rbareja25, I faced the same issue with segmentation based on DINO vs DINOv2 features. Another look at the DINOv2 paper reveals the authors are aware of such...
Hi! I stumbled on the same issue: the output of my attentions is a tensor with 3 dimensions, `attentions.shape=torch.Size([1, 329, 384])`. I checked that this happens with both xformers and...
Update: confirmed that it happens because xformers is enabled; I must have overlooked that before. Solved now :) A sketch of what worked for me is below.
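For anyone hitting the same thing: the 3D tensor is presumably the attention *output* `[B, N, dim]`, since the memory-efficient xformers path never materializes the `[B, heads, N, N]` attention matrix. A minimal sketch of how to disable xformers and recover the full attention map, assuming the `XFORMERS_DISABLED` environment variable checked in the dinov2 layers and the hub ViT-S/14 model; the hook-based recomputation is my own illustration, not an API of the repo:

```python
import os
os.environ["XFORMERS_DISABLED"] = "1"  # must be set before dinov2 is imported

import torch

model = torch.hub.load("facebookresearch/dinov2", "dinov2_vits14")
model.eval()

attn_maps = []

def qkv_hook(module, inputs, output):
    # Recompute the attention matrix from the last block's qkv projection,
    # since the plain attention forward does not return it either.
    B, N, three_dim = output.shape
    num_heads = model.blocks[-1].attn.num_heads
    head_dim = three_dim // (3 * num_heads)
    qkv = output.reshape(B, N, 3, num_heads, head_dim).permute(2, 0, 3, 1, 4)
    q, k = qkv[0], qkv[1]
    attn = (q @ k.transpose(-2, -1)) * head_dim**-0.5
    attn_maps.append(attn.softmax(dim=-1))  # [B, num_heads, N, N]

model.blocks[-1].attn.qkv.register_forward_hook(qkv_hook)

with torch.no_grad():
    img = torch.randn(1, 3, 518, 518)  # 37x37 patches + 1 cls token = 1370
    _ = model(img)

print(attn_maps[-1].shape)  # e.g. torch.Size([1, 6, 1370, 1370])
```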