
Cross-entropy loss is different from hinge loss

Open maryamag85 opened this issue 6 years ago • 1 comments

The definition for cross-entropy loss is confusing, referring to the Stanford lecture notes: http://cs231n.github.io/linear-classify/

You are calling log loss the same as cross-entropy loss.

maryamag85 avatar Jul 30 '19 16:07 maryamag85
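The two losses from the cs231n notes really are different functions of the same scores. A minimal sketch (using the toy scores `[3.2, 5.1, -1.7]` from those notes; the exact numbers are just illustrative):

```python
import math

# Unnormalized class scores for one example, correct class index y.
scores = [3.2, 5.1, -1.7]
y = 0

# Multiclass hinge (SVM) loss: sum of violated margins over incorrect classes.
delta = 1.0
hinge = sum(max(0.0, s - scores[y] + delta)
            for i, s in enumerate(scores) if i != y)

# Cross-entropy (softmax) loss: -log of the normalized probability of class y.
exps = [math.exp(s) for s in scores]
probs = [e / sum(exps) for e in exps]
cross_entropy = -math.log(probs[y])

print(hinge, cross_entropy)  # different values on the same scores
```

The hinge loss only cares whether the correct score beats the others by the margin `delta`, while cross-entropy always pushes the correct class's probability toward 1.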

You can see log loss as a special case (with only 2 classes) of cross-entropy loss.
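To make that concrete, here is a small sketch (with a made-up predicted probability `p`) showing that binary log loss and 2-class cross-entropy are the same formula:

```python
import math

# Hypothetical values: p = predicted probability of class 1, y = true label.
p, y = 0.8, 1

# Binary log loss as usually written:
log_loss = -(y * math.log(p) + (1 - y) * math.log(1 - p))

# Cross-entropy over the 2-class distribution [1-p, p] with a one-hot target:
target = [0.0, 1.0] if y == 1 else [1.0, 0.0]
pred = [1 - p, p]
cross_entropy = -sum(t * math.log(q) for t, q in zip(target, pred) if t > 0)

print(log_loss == cross_entropy)  # → True: the two formulas coincide
```

With only two classes, the cross-entropy sum collapses to exactly the two terms of the log loss formula.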

xinbinhuang avatar Feb 04 '20 07:02 xinbinhuang