
Question about dissimilar_loss

Open Cocofeat opened this issue 2 years ago • 2 comments

Hi, thanks for your excellent work! In dissimilar_loss:

loss = -1 * torch.mean(torch.cdist(protos, protos))

What is calculated here is the distance between protos and itself, which is always 0. Doesn't this seem inconsistent with the paper?

Cocofeat avatar Nov 11 '22 14:11 Cocofeat

Thank you for your comment. torch.cdist(a, a) computes the distance between each pair of row vectors from the two collections, so it outputs 0 on the diagonal of the distance matrix, but not necessarily elsewhere.
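For illustration, a tiny check with made-up 2-D prototypes (the shapes here are hypothetical, not the ones used in the repo):

import torch

protos = torch.tensor([[0.0, 0.0],
                       [3.0, 4.0],
                       [6.0, 8.0]])   # three row vectors of dimension 2
d = torch.cdist(protos, protos)       # 3x3 matrix of pairwise Euclidean distances
print(d)
# tensor([[ 0.,  5., 10.],
#         [ 5.,  0.,  5.],
#         [10.,  5.,  0.]])           # zero only on the diagonal
print(-1 * torch.mean(d))             # dissimilar_loss on this toy input, about -4.44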

xuanlongORZ avatar Nov 11 '22 14:11 xuanlongORZ

Yeah, that's right in general, but there may still be a problem here (possibly depending on the torch version): cdist() computes distances over the last two dimensions by default, and omega is initialized with shape (1, out_features, in_features, 1, 1). Because the last two dimensions are (1, 1), each collection holds a single length-1 row vector, so the only distance computed is between that vector and itself, and the whole output is 0. I hope the authors and subsequent researchers can pay attention to this issue. self.omega = nn.Parameter(torch.Tensor(1, out_features, in_features, 1, 1))
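A minimal reproduction with made-up sizes, following the omega shape above (the workaround at the end is only a suggestion, not the authors' fix):

import torch

out_features, in_features = 4, 8
protos = torch.randn(1, out_features, in_features, 1, 1)

# cdist treats the last two dimensions as (rows, features): here each collection
# holds a single length-1 row vector, so every distance is vector-vs-itself = 0.
d = torch.cdist(protos, protos)
print(d.abs().max())                       # tensor(0.)

# Possible workaround (an assumption): compare the prototypes as out_features
# row vectors of dimension in_features before calling cdist.
protos_2d = protos.reshape(out_features, in_features)
d_2d = torch.cdist(protos_2d, protos_2d)   # (out_features, out_features)
print(d_2d.abs().max())                    # generally non-zero off the diagonal
loss = -1 * torch.mean(d_2d)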

Cocofeat avatar Nov 12 '22 04:11 Cocofeat