CTPN
Actual value of the offset regression loss
Hello there! I recently implemented the side-refinement part of CTPN in TensorFlow. During training, I noticed that the regression loss of the side-refinement offset, Lo(re), is much smaller than the other loss terms. For example, Lv(re) is often 100 to 200 times bigger than Lo(re). I wonder whether my implementation of the side-refinement part is correct or not...
Hmm... in my implementation, Lv(re) is often 50 to 100 times smaller than Lo(re).
My approach is as follows:
Example: the yellow bbox is the ground truth (gt), the black box (bl) is the anchor under consideration, and I compute xside (of bl) = (x_leftside, x_rightside), with anchor width wa = 16:
- x_leftside = d1/16 (dark green arrow)
- x_rightside = d2/16 (bright blue arrow), where d1 and d2 are the x-axis distances from the center of the anchor to the left and right sides of the gt, respectively.
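The offsets described above can be sketched as follows. This is a minimal illustration, not the poster's actual code; the function name and argument names are assumptions, and it computes both side offsets per anchor as the poster describes (the CTPN paper itself regresses a single offset toward the nearest side):

```python
def side_refinement_offsets(anchor_cx, gt_x_left, gt_x_right, wa=16.0):
    """Compute side-refinement offsets for one anchor (hypothetical helper).

    d1 and d2 are the signed x-axis distances from the anchor center
    to the left and right sides of the ground-truth box; each is
    normalized by the anchor width wa (16 in CTPN).
    """
    d1 = gt_x_left - anchor_cx   # signed distance to the gt left side
    d2 = gt_x_right - anchor_cx  # signed distance to the gt right side
    return d1 / wa, d2 / wa

# Example: anchor centered at x=100, gt text line spanning x in [92, 340]
o_left, o_right = side_refinement_offsets(100.0, 92.0, 340.0)
# o_left = -0.5, o_right = 15.0
```

Note that for an anchor near a text-line side, the offset to that nearby side is a fraction of wa, so its magnitude (and hence the smooth-L1 loss on it) is typically small; this alone can make Lo(re) much smaller than the vertical-coordinate term.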