bottom-up-attention
Why choose "SoftmaxWithLoss" for "loss_attr"
Hi @peteanderson80, thanks for sharing.
Some objects have multiple attributes, so why choose "SoftmaxWithLoss" for "loss_attr"?
Hi, did anybody figure out the reason for this loss function?
Actually, the operation of this loss function is what we want. "SoftmaxWithLoss" was rewritten. You can check the source code at this link: https://github.com/peteanderson80/bottom-up-attention/blob/master/caffe/src/caffe/layers/softmax_loss_layer.cpp
Thanks! So what they implemented is something like a softmaxBinaryCrossEntropy. Did I get it right?
Yeah, this rewritten function tries to maximize the sum of all attributes' scores.
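For anyone who wants to see the idea concretely, here is a minimal numpy sketch of that interpretation: take a softmax over the attribute logits, then minimize the negative log of the *summed* probability of all ground-truth attributes (which is equivalent to maximizing that sum). This is only an illustration of the behavior described above, not the actual Caffe layer; the function name `multi_label_softmax_loss` and the example values are hypothetical, and the real C++ implementation linked earlier may differ in details (e.g., normalization or per-label averaging).

```python
import numpy as np

def multi_label_softmax_loss(logits, gt_attr_indices):
    """Hypothetical sketch: softmax over attribute logits, then
    -log of the total probability mass on the ground-truth attributes.
    Minimizing this maximizes the sum of all attributes' scores."""
    # Numerically stable softmax over the attribute classes
    shifted = logits - logits.max()
    probs = np.exp(shifted) / np.exp(shifted).sum()
    # Sum the probabilities assigned to every ground-truth attribute
    gt_prob = probs[gt_attr_indices].sum()
    return -np.log(gt_prob)

# Example (made-up numbers): 5 attribute classes,
# one object labelled with attributes 1 and 3
logits = np.array([0.2, 1.5, -0.3, 2.0, 0.1])
print(multi_label_softmax_loss(logits, [1, 3]))
```

Note this differs from an ordinary per-label softmax cross-entropy: summing the probabilities before the log means the loss can be made small by putting mass on any subset of the ground-truth attributes, which fits the "multiple attributes per object" setting discussed above.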