CTPN

Actual value of the offset regression loss

Open YangJiao1996 opened this issue 6 years ago • 1 comment

Hello there! I recently implemented the side-refinement part of CTPN in TensorFlow. During training, I noticed that the side-refinement offset regression loss (Lo_re) is much smaller than the other loss terms. For example, Lv_re is often 100 to 200 times bigger than Lo_re. I wonder whether my implementation of the side-refinement part is correct...
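For context, CTPN uses smooth-L1 for both regression terms, and the two losses are averaged over different anchor sets, so their magnitudes are not directly comparable. A minimal sketch of that comparison (the offset values below are hypothetical, not taken from the asker's training run):

```python
import numpy as np

def smooth_l1(pred, target):
    # Smooth-L1 (Huber) loss, used for both regression terms in CTPN.
    d = np.abs(pred - target)
    return np.where(d < 1.0, 0.5 * d ** 2, d - 0.5)

# Hypothetical predicted/target vertical offsets for a few anchors.
v_pred = np.array([0.4, -1.2, 2.0])
v_gt   = np.array([0.1, -0.9, 1.0])

# Hypothetical side-refinement offsets; these are bounded by roughly
# +/- 0.5 because they are horizontal distances divided by wa = 16 px.
o_pred = np.array([0.02, -0.05])
o_gt   = np.array([0.00, -0.03])

L_v_re = smooth_l1(v_pred, v_gt).mean()  # vertical-coordinate loss
L_o_re = smooth_l1(o_pred, o_gt).mean()  # side-refinement loss

# Small targets in [-0.5, 0.5] give small per-anchor errors, so L_o_re
# can easily be orders of magnitude below L_v_re without any bug.
print(L_v_re, L_o_re)
```

In other words, a large ratio between the two losses can simply reflect the different dynamic ranges of the regression targets rather than a broken implementation.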

YangJiao1996 avatar Apr 28 '18 09:04 YangJiao1996

Hmm, in my implementation Lv(re) is often 50 to 100 times smaller than Lo(re). My approach is as follows: [image: 55678381-e8bd7000-5922-11e9-8008-0dcbf8ec8980] Example: the yellow bbox is the ground truth (gt), the black box (bl) is the anchor under consideration, and I compute xside (of bl) = (x_leftside, x_rightside), with wa = 16:

  • x_leftside = d1/16 (dark green arrow)
  • x_rightside = d2/16 (bright blue arrow), where d1 and d2 are the x-axis distances from the anchor center to the gt's left and right sides, respectively
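The target definition above can be sketched in a few lines; the coordinates here are hypothetical, and `side_offset` is an illustrative helper name, not code from either implementation:

```python
WA = 16.0  # fixed anchor width in CTPN

def side_offset(x_side, cx_anchor, wa=WA):
    # Side-refinement target: signed x-distance from the anchor center
    # to the relevant gt side, normalized by the anchor width.
    return (x_side - cx_anchor) / wa

# Hypothetical example: gt left side at x=100, a left-side anchor
# centered at x=104, so d1 = -4 px.
x_left_off = side_offset(100.0, 104.0)   # d1 / 16 = -0.25

# gt right side at x=260, a right-side anchor centered at x=248,
# so d2 = 12 px.
x_right_off = side_offset(260.0, 248.0)  # d2 / 16 = 0.75
```

Note that only the anchors near a text line's left or right boundary contribute side-refinement targets, which is another reason this loss is computed over far fewer anchors than the vertical regression.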

hcnhatnam avatar Apr 07 '19 04:04 hcnhatnam