LDU
Question about dissimilar_loss
Hi, thanks for your excellent work! In dissimilar_loss:
loss = -1 * torch.mean(torch.cdist(protos, protos))
What is computed here is the distance between protos and itself, which is always 0. Doesn't that seem inconsistent with the paper?
Thank you for your comment. torch.cdist(a, a) computes the distance between each pair of row vectors from the two collections, so it outputs 0 on the diagonal of the output matrix, but not necessarily elsewhere.
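A minimal sketch illustrating this reply: for a 2-D tensor of row vectors, torch.cdist(a, a) is zero only on the diagonal (the shapes here are just for illustration).

import torch

protos = torch.randn(4, 8)          # 4 prototype vectors of dimension 8
d = torch.cdist(protos, protos)     # 4 x 4 pairwise-distance matrix

print(torch.diag(d))                # all zeros: distance of each row to itself
print(d)                            # off-diagonal entries are generally non-zero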
Yeah, that's right in general, but there may be a problem depending on the torch version and the tensor shape: cdist() computes pairwise distances over the last two dimensions, and omega is initialized with shape (1, out_features, in_features, 1, 1). Since the last two dimensions are both of size 1, every pairwise-distance matrix is 1 x 1 and therefore always 0. I hope the author and subsequent researchers can pay attention to this issue.
self.omega = nn.Parameter(torch.Tensor(1, out_features, in_features, 1, 1))
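A minimal sketch of the shape issue described above: torch.cdist treats the last dimension as the feature dimension and the second-to-last as the set of row vectors, with everything before that as batch dimensions. The flatten/reshape workaround at the end is only an illustration under that assumption, not necessarily the fix the authors intend.

import torch

out_features, in_features = 8, 16
omega = torch.randn(1, out_features, in_features, 1, 1)

# Last two dims are (1, 1): each batch holds a single 1-D point, so every
# pairwise-distance matrix is 1 x 1 and identically zero -> the loss is always 0.
d_broken = torch.cdist(omega, omega)
print(d_broken.shape, d_broken.abs().max())   # torch.Size([1, 8, 16, 1, 1]) tensor(0.)

# Hypothetical workaround: collapse to a 2-D set of row vectors so that cdist
# actually compares different prototypes.
protos = omega.reshape(out_features, in_features)
d_fixed = torch.cdist(protos, protos)
loss = -1 * torch.mean(d_fixed)               # now generally non-zero
print(d_fixed.shape, loss)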