Deep_Metric
For the losses in your code
Thanks for sharing your great work!
I read some of the earlier questions and your answers about the losses, but I am asking again to make sure.
For the losses in your code, I see similar expressions such as
neg_loss = 2.0/self.alpha * torch.log(1 + torch.sum(torch.exp(self.alpha * (neg_pair - base))))
in the SemiHard loss and the DistWeighted loss. As far as I know, those losses are linear in the distance (similarity, in your code) according to the original papers (assuming those losses come from FaceNet and Sampling Matters...).
Are these losses actually your own 'Weight' loss, which assumes the data follow a mixture-of-Gaussians distribution?
Thanks :D
No such assumption is needed. The Weight loss is based on the idea of hard mining: it focuses on harder samples, whether they are positive or negative pairs.
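To illustrate the hard-mining effect (this is a minimal sketch, not the repo's exact Weight loss implementation), the log-sum-exp term acts as a smooth maximum over the negative-pair similarities, so the hardest negatives dominate both the loss value and its gradient. The function name and the alpha/base values below are illustrative assumptions, not the repo's defaults.

import torch

def soft_hard_mining_loss(neg_sim, alpha=40.0, base=0.5):
    # Hypothetical sketch of the log-sum-exp form quoted above.
    # Negatives whose similarity exceeds `base` (the harder ones)
    # dominate the sum, so this behaves like a smooth version of
    # hardest-negative mining rather than a linear loss in similarity.
    return 2.0 / alpha * torch.log(1 + torch.sum(torch.exp(alpha * (neg_sim - base))))

# Toy check: one hard negative (similarity 0.9) dominates many easy ones,
# giving a loss close to 2/alpha * alpha*(0.9 - 0.5) = 0.8.
neg_sim = torch.tensor([0.1, 0.2, 0.9])
print(soft_hard_mining_loss(neg_sim))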