ssd.pytorch
About OHEM
loss_c = log_sum_exp(batch_conf) - batch_conf.gather(1, conf_t.view(-1, 1))
I'm confused by this line of code. Why not use conf_data to select hard examples? What does this loss_c stand for?
@IssacCyj I'm confused by this line of code too. Do you now understand why loss_c is used here instead of conf_data?
I'm confused about this process, too. Maybe someone can explain this.
I'm also confused.
I don't understand why the author subtracts the term below.
- batch_conf.gather(1, conf_t.view(-1, 1))
loss_c = log_sum_exp(batch_conf) - batch_conf.gather(1, conf_t.view(-1, 1))
My guess is that the author treats the log_sum_exp part as a smooth approximation to the maximum function, i.e. the highest predicted class confidence, while batch_conf.gather(1, conf_t.view(-1, 1)) just picks out the target class's confidence. Note that log Σ_j exp(x_j) - x_y is exactly -log softmax(x)_y, so loss_c is the per-box softmax cross-entropy, which is then used to rank the hard negatives. For intuition on the smooth-max reading, see the sketch below.
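A tiny illustration of that reading (torch.logsumexp is used here as a stand-in for the repo's log_sum_exp helper; the logit values are made up):

```python
import torch

# One box, three classes; one logit clearly dominates.
x = torch.tensor([[1.0, 2.0, 10.0]])

# log-sum-exp acts as a smooth maximum over the class logits...
print(torch.logsumexp(x, dim=1))         # tensor([10.0005]), close to max(x) = 10

# ...while gather picks out the target class's logit (say, class 0):
print(x.gather(1, torch.tensor([[0]])))  # tensor([[1.]])
```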
It's just my guess, but you can replace it with torch.nn.functional.cross_entropy(batch_conf, conf_t.view(-1), reduction='none')
or the like, although the resulting ranking might differ very slightly due to floating-point differences.
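To check that the two expressions really agree per box, a minimal sketch with random logits (torch.logsumexp again stands in for the repo's log_sum_exp helper; the tensor sizes are made up, e.g. 21 classes as in VOC):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
num_boxes, num_classes = 8, 21                        # hypothetical sizes
batch_conf = torch.randn(num_boxes, num_classes)      # flattened class logits
conf_t = torch.randint(0, num_classes, (num_boxes,))  # matched target labels

# The line in question, with torch.logsumexp in place of log_sum_exp:
loss_c = torch.logsumexp(batch_conf, dim=1, keepdim=True) \
         - batch_conf.gather(1, conf_t.view(-1, 1))

# Unreduced per-box softmax cross-entropy:
ce = F.cross_entropy(batch_conf, conf_t, reduction='none')

print(torch.allclose(loss_c.view(-1), ce))            # True
```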