
For the losses in your code

Open ghost opened this issue 6 years ago • 1 comments

Thanks for sharing your great work! I read some of the questions and your answers about the losses, but I am asking again to make sure.
For the losses in your code, I see a similar form of loss, such as neg_loss = 2.0/self.alpha * torch.log(1 + torch.sum(torch.exp(self.alpha * (neg_pair - base)))), in both the SemiHard loss and the DistWeighted loss. As far as I know, those losses were linear in the distance (in your code, the similarity) according to the original papers (assuming those losses come from FaceNet and "Sampling Matters"). Are these losses your own 'Weight' loss, which assumes the data follows a mixture-of-Gaussians distribution? Thanks :D

ghost avatar Feb 13 '19 18:02 ghost

No such assumption is needed. The Weight loss is based on the idea of hard mining: to focus on harder samples, whether the pairs are positive or negative.

bnu-wangxun avatar Feb 17 '19 13:02 bnu-wangxun
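To see why this LogSumExp form performs hard mining without any distributional assumption, here is a plain-Python sketch (not the repository's code; the `base` and `alpha` values are illustrative) of the negative-pair term quoted above. The gradient of the smoothed loss with respect to each pair's similarity is a softmax-like weight, so harder negatives (higher similarity) receive exponentially more weight:

```python
import math

def smooth_neg_loss(neg_sims, base=0.5, alpha=40.0):
    """LogSumExp-smoothed loss over negative-pair similarities.
    Mirrors the quoted line:
    2/alpha * log(1 + sum(exp(alpha * (neg_pair - base))))."""
    s = sum(math.exp(alpha * (x - base)) for x in neg_sims)
    return 2.0 / alpha * math.log(1.0 + s)

def pair_weights(neg_sims, base=0.5, alpha=40.0):
    """d(loss)/d(s_i): the weight each pair contributes to the gradient.
    Harder negatives (larger similarity) get exponentially larger weight,
    which is the hard-mining behaviour -- no Gaussian assumption involved."""
    e = [math.exp(alpha * (x - base)) for x in neg_sims]
    denom = 1.0 + sum(e)
    return [2.0 * ei / denom for ei in e]

sims = [0.3, 0.5, 0.7]   # toy negative-pair similarities
w = pair_weights(sims)
# the weights increase with similarity, so the hardest pair (0.7) dominates;
# for large alpha the loss approaches 2 * max(s - base), i.e. hardest-negative mining
```

As `alpha` grows, `smooth_neg_loss` tends to `2 * max(neg_sims - base)`, so the smoothed loss interpolates between averaging over all pairs and focusing only on the single hardest one.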