scalable-pytorch-sinkhorn
Function implicitly assumes the two point clouds have the same number of points
Hi,
Thanks for the great repo.
I have a question about the choice of w_x and w_y. If these two parameters are not assigned, they are given uniform weights (as done here); however, this line then multiplies the weights for y by the ratio of the number of points in the two point clouds. If, for example, x has 1.5 times as many points as y, this test will inevitably fail.
Does this function assume the point clouds have a similar number of points? What is the motivation behind the scaling in L95?
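For concreteness, here is a minimal sketch of what I mean (the point counts and tensors are made up for illustration; the two weight lines mirror the defaults in sinkhorn.py):
import torch

# Hypothetical point clouds: x has 1.5x as many points as y.
x = torch.randn(300, 3)
y = torch.randn(200, 3)

# Uniform weights, as assigned when w_x / w_y are not passed in.
w_x = torch.ones(x.shape[0]).to(x) / x.shape[0]
w_y = torch.ones(y.shape[0]).to(x) / y.shape[0]
print(w_x.sum(), w_y.sum())  # 1.0, 1.0

# The extra scaling applied on L95:
w_y = w_y * (w_x.shape[0] / w_y.shape[0])
print(w_y.sum())  # 1.5 -- w_y is no longer a probability distribution,
                  # so the marginals of the balanced OT problem disagree.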
Thank you in advance.
same question here
Just comment out https://github.com/fwilliams/scalable-pytorch-sinkhorn/blob/main/sinkhorn.py#L95:
w_x = torch.ones(x.shape[0]).to(x) / x.shape[0]
w_y = torch.ones(y.shape[0]).to(x) / y.shape[0]
# w_y *= (w_x.shape[0] / w_y.shape[0])
The previous lines already normalize each set of weights to sum to one, so there is no need to multiply by the ratio.
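If the ratio scaling is only applied on the default path when w_y is left as None (I have not verified this), passing explicit uniform weights should also work around the issue without editing sinkhorn.py. A rough sketch, assuming the function is exposed as sinkhorn.sinkhorn:
import torch
from sinkhorn import sinkhorn  # this repo's sinkhorn.py

x = torch.randn(300, 3)
y = torch.randn(200, 3)

# Explicit uniform weights, each summing to 1, so the two marginals agree
# even though the point clouds have different sizes.
w_x = torch.ones(x.shape[0]).to(x) / x.shape[0]
w_y = torch.ones(y.shape[0]).to(y) / y.shape[0]

# w_x / w_y are the keyword names discussed above; all other arguments
# are left at their defaults since only the weight choice matters here.
result = sinkhorn(x, y, w_x=w_x, w_y=w_y)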