DIoU-pytorch-detectron

Weighting of DIoU/CIoU loss

AhmedMagdyHendawy opened this issue 4 years ago · 1 comment

Hello, could you please explain why you normalize the loss at the end by both:

  • bbox_outside_weights
  • output.size(0)?

iou_weights = bbox_inside_weights.view(-1, 4).mean(1) * bbox_outside_weights.view(-1, 4).mean(1)
iouk = ((1 - iouk) * iou_weights).sum(0) / output.size(0)
diouk = ((1 - diouk)).sum(0)
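For context, here is a minimal runnable sketch of how I read that reduction. The shapes and dummy values are my assumptions, purely for illustration; the last three statements mirror the snippet above (with the results renamed to *_loss for clarity).

import torch

# Assumed shapes, for illustration only:
#   output:  (N, 4) predicted boxes for the N sampled anchors in the mini-batch
#   iouk, diouk: (N,) per-box IoU / DIoU terms
#   bbox_inside_weights, bbox_outside_weights: (N, 4), built as in rpn.py below
N = 8
output = torch.rand(N, 4)
iouk = torch.rand(N)
diouk = torch.rand(N)
bbox_inside_weights = torch.ones(N, 4)               # 1.0 on positives, 0.0 otherwise
bbox_outside_weights = torch.full((N, 4), 1.0 / N)   # 1/num_examples per element

# Per-box scalar weight: mean over the 4 coordinates of inside * outside weights
iou_weights = bbox_inside_weights.view(-1, 4).mean(1) * bbox_outside_weights.view(-1, 4).mean(1)
# The (1 - IoU) term is weighted, summed, then divided again by output.size(0)
iouk_loss = ((1 - iouk) * iou_weights).sum(0) / output.size(0)
# The DIoU penalty term in the snippet is summed without either normalization
diouk_loss = (1 - diouk).sum(0)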

What I understand from the source code is that each component of bbox_outside_weights is 1/num_examples, as in

# Bbox regression loss has the form:
#   loss(x) = weight_outside * L(weight_inside * x)
# Inside weights allow us to set zero loss on an element-wise basis
# Bbox regression is only trained on positive examples so we set their
# weights to 1.0 (or otherwise if config is different) and 0 otherwise
bbox_inside_weights = np.zeros((num_inside, 4), dtype=np.float32)
bbox_inside_weights[labels == 1, :] = (1.0, 1.0, 1.0, 1.0)

# The bbox regression loss only averages by the number of images in the
# mini-batch, whereas we need to average by the total number of example
# anchors selected
# Outside weights are used to scale each element-wise loss so the final
# average over the mini-batch is correct
bbox_outside_weights = np.zeros((num_inside, 4), dtype=np.float32)
# uniform weighting of examples (given non-uniform sampling)
num_examples = np.sum(labels >= 0)
bbox_outside_weights[labels == 1, :] = 1.0 / num_examples
bbox_outside_weights[labels == 0, :] = 1.0 / num_examples

in /lib/roi_data/rpn.py
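As a concrete toy illustration of that construction (the label vector here is made up; 1 = positive, 0 = sampled negative, -1 = ignored):

import numpy as np

labels = np.array([1, 1, 0, -1, 1, 0, -1], dtype=np.int64)
num_inside = labels.shape[0]

bbox_inside_weights = np.zeros((num_inside, 4), dtype=np.float32)
bbox_inside_weights[labels == 1, :] = (1.0, 1.0, 1.0, 1.0)

bbox_outside_weights = np.zeros((num_inside, 4), dtype=np.float32)
num_examples = np.sum(labels >= 0)  # 5 sampled anchors (3 positives + 2 negatives)
bbox_outside_weights[labels == 1, :] = 1.0 / num_examples  # 0.2
bbox_outside_weights[labels == 0, :] = 1.0 / num_examples  # 0.2

# Every sampled anchor carries weight 1/num_examples, so summing a per-anchor
# loss scaled by these weights already yields the mean over the sampled anchors.
print(bbox_outside_weights[:, 0])  # [0.2 0.2 0.2 0.  0.2 0.2 0. ]

So, as far as I can tell, the outside weights on their own already turn a plain sum into an average over the sampled anchors.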

AhmedMagdyHendawy · Feb 10 '21

Is it another kind of scaling for the loss?

bbox_outside_weights[labels == 1, :] = 1.0 / num_examples
bbox_outside_weights[labels == 0, :] = 1.0 / num_examples
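A tiny numeric sketch of why I am asking (all numbers are assumptions, just to make the combined effect concrete): if each sampled anchor carries a 1/num_examples outside weight and the weighted sum is then also divided by output.size(0), each anchor ends up contributing 1/(num_examples * output.size(0)) to the final value.

import torch

num_examples = 4   # assumed number of sampled anchors (positives + negatives)
n_boxes = 4        # assumed output.size(0), i.e. number of rows in `output`

per_box_loss = torch.ones(n_boxes)                        # pretend (1 - iouk) == 1 everywhere
iou_weights = torch.full((n_boxes,), 1.0 / num_examples)  # outside weight per box

loss = (per_box_loss * iou_weights).sum(0) / n_boxes
print(loss)  # tensor(0.2500): the weighted sum is 1.0, then divided by n_boxes again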

AhmedMagdyHendawy · Feb 10 '21