wing-loss
Can you share your results?
I trained a model based on your code, but my result is far worse than the paper's. Can you share your result?
I also trained with this code. Can you tell me the range of loss values you saw during training? In my case it started at 98-100 and ended at 29-30. It's odd that it reaches the minimum within 20k iterations and then just oscillates around it. Did you face a similar issue while training?
My loss is around 300 after 20k iterations, with an NME around 0.55. How about you?
@onzone I never managed to get the wing loss below 100. Which dataset did you use, and what result did you get?
I have used several datasets for different landmark localization tasks. First I trained for 5 landmarks using CelebA, then 300W for 68-landmark localization, and WFLW for 98-landmark detection. The loss value depends on the number of landmarks: for 98 landmarks it never went below 800, for 68 it was around 300, and for 5 landmarks I frankly don't remember, but it was most probably around 100.
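(Not this repo's code, just an illustrative sketch.) The wing loss from the paper is piecewise: `w * ln(1 + |x| / epsilon)` for small errors and `|x| - C` otherwise, with `C` chosen so the two pieces join at `|x| = w`. If the implementation sums over all landmark coordinates rather than averaging, the raw value scales with the number of landmarks, which would explain the ~100 / ~300 / ~800 ranges reported above. The function name, shapes, and the `w=10`, `epsilon=2` defaults below are assumptions (the defaults match the paper):

```python
import numpy as np

def wing_loss(pred, target, w=10.0, epsilon=2.0):
    """Wing loss (Feng et al., CVPR 2018), sketched with NumPy.

    pred, target: arrays of shape [num_landmarks, 2] (or any matching shape).
    Returns the summed loss, so the magnitude grows with the number of landmarks.
    """
    x = np.abs(pred - target)
    # Constant that makes the two pieces meet at |x| = w.
    c = w - w * np.log(1.0 + w / epsilon)
    loss = np.where(x < w,
                    w * np.log(1.0 + x / epsilon),
                    x - c)
    return loss.sum()
```

With a sum reduction like this, 98 points give roughly 20x the loss of 5 points for the same per-point error, so raw loss values are not comparable across datasets; a mean-reduced loss or the NME is a better basis for comparison.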
@onzone Mine is the same: on the training set the loss is around 300, and on the validation set it is around 400. It seems hard to reach the NME reported in the original paper. Did you manage to reproduce that result?
@eyiztan Hi, I'm trying to reproduce the result the paper claims, but I only get an NME of about 0.06 on 300W. I also noticed your discussion with the author of Wing loss on his GitHub page. May I ask whether you got a result closer to the paper's?
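Whether an NME of 0.06 is comparable to the paper also depends on the normalization used. Here is a minimal sketch of the metric as it is commonly computed on 300W, assuming the standard 68-point annotation and inter-ocular (outer eye corner) normalization; the indices and function name are assumptions, not taken from this repo:

```python
import numpy as np

def nme(pred, target, left_eye_idx=36, right_eye_idx=45):
    """Normalized mean error for one image, assuming the 68-point 300W annotation.

    pred, target: arrays of shape [68, 2].
    Normalizes by the inter-ocular distance (outer eye corners, points 36 and 45).
    """
    interocular = np.linalg.norm(target[left_eye_idx] - target[right_eye_idx])
    return np.mean(np.linalg.norm(pred - target, axis=1)) / interocular
```

Normalizing by inter-pupil distance instead of inter-ocular distance gives a noticeably larger number for the same predictions, so it is worth checking which convention the paper's table uses before comparing.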
@eyiztan @onzone @TropComplique Did you find out where the problem is? There is also some problem with my loss.