pytorch-metric-learning
Inconsistency in default parameters of Triplet Loss and Triplet Loss miner
Hi,
In TripletMarginLoss you have the default margin set to 0.05:
class TripletMarginLoss(BaseMetricLossFunction):
    """
    Args:
        margin: The desired difference between the anchor-positive distance and the
                anchor-negative distance.
        swap: Use the positive-negative distance instead of anchor-negative distance,
              if it violates the margin more.
        smooth_loss: Use the log-exp version of the triplet loss
    """

    def __init__(
        self,
        margin=0.05,
        swap=False,
        smooth_loss=False,
        triplets_per_anchor="all",
        **kwargs
    ):
While in TripletMarginMiner it is 0.2:
class TripletMarginMiner(BaseTupleMiner):
    """
    Returns triplets that violate the margin
    Args:
        margin
        type_of_triplets: options are "all", "hard", or "semihard".
            "all" means all triplets that violate the margin
            "hard" is a subset of "all", but the negative is closer to the anchor than the positive
            "semihard" is a subset of "all", but the negative is further from the anchor than the positive
            "easy" is all triplets that are not in "all"
    """

    def __init__(self, margin=0.2, type_of_triplets="all", **kwargs):
        super().__init__(**kwargs)
        self.margin = margin
        self.type_of_triplets = type_of_triplets
        self.add_to_recordable_attributes(list_of_names=["margin"], is_stat=False)
        self.add_to_recordable_attributes(
            list_of_names=["avg_triplet_margin", "pos_pair_dist", "neg_pair_dist"],
            is_stat=True,
        )
I think it would be better to set them to a common value, because I guess most people want to use them out of the box.
Or maybe I'm mistaken and the difference is intentional, to mine the "hardest" samples?
I think the miner margin was set based on some paper. I don't know about the loss margin. Setting them to the same value could be reasonable.
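In the meantime, a minimal sketch of how one can keep the two margins consistent by passing the same explicit value to both (the value 0.2 below is only illustrative, not an official recommendation):

import torch
from pytorch_metric_learning import losses, miners

# Use one explicit margin for both, so the miner's notion of a
# "violating" triplet matches the margin the loss actually enforces.
margin = 0.2  # illustrative choice

loss_func = losses.TripletMarginLoss(margin=margin)
miner_func = miners.TripletMarginMiner(margin=margin, type_of_triplets="all")

# Dummy batch: 8 embeddings of dimension 16 with labels from 4 classes.
embeddings = torch.randn(8, 16)
labels = torch.randint(0, 4, (8,))

triplets = miner_func(embeddings, labels)        # indices of margin-violating triplets
loss = loss_func(embeddings, labels, triplets)   # loss computed only on the mined triplets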