
Ordinal regression


  • https://pythonhosted.org/mord/

Loss functions

As anyone who reads this might know, there's classification and regression. There's a special case of regression that some people call "ordinal regression". This involves predicting a number (regression, duh), but on a discrete, ordered scale. For instance, in the IMDb dataset, the goal is to predict a rating that is in {0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10}. In other words, ordinal regression is a mix of classification and regression.
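
To make the "mix" concrete, here's a toy comparison in plain Python (the numbers are just for illustration): a pure classification loss treats every wrong rating as equally wrong, while a regression loss at least uses the distance between levels.

```python
def zero_one(y_true, y_pred):
    """Classification view: any wrong rating costs the same."""
    return int(y_true != y_pred)

def squared(y_true, y_pred):
    """Regression view: the cost grows with the distance between ratings."""
    return (y_true - y_pred) ** 2

# True rating is 5: predicting 4 and predicting 9 are equally wrong for the
# zero-one loss, but the squared loss knows that 9 is much further off.
for y_pred in (4, 9):
    print(y_pred, zero_one(5, y_pred), squared(5, y_pred))
# 4 -> 1, 1
# 9 -> 1, 16
```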

As is explained in this well-written paper, it's possible to design loss functions that behave better than the squared loss for such a task. I found this paper by looking at mord.
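
For reference, one family of losses that fits this setting is the threshold-based one; mord's LogisticAT is an "all-threshold" variant with a logistic surrogate. Below is a minimal sketch of that idea, not the paper's exact formulation: the thresholds are fixed and equally spaced for illustration, and the function names and toy 0-10 scale are mine. In practice the thresholds would be learnt alongside the model, and the logistic surrogate can be swapped for a hinge or squared one.

```python
import math

def all_threshold_loss(y_true: int, score: float, thresholds: list[float]) -> float:
    """Sum a logistic penalty over every threshold the score lands on the wrong side of.

    With K ordinal levels 0..K-1 there are K-1 thresholds; the score should sit
    above the thresholds below y_true and below the thresholds at or above it.
    """
    loss = 0.0
    for k, theta in enumerate(thresholds):
        if k < y_true:
            # The score should be above this threshold.
            loss += math.log1p(math.exp(-(score - theta)))
        else:
            # The score should be below this threshold.
            loss += math.log1p(math.exp(score - theta))
    return loss

# Toy example on a 0-10 scale: thresholds halfway between consecutive levels.
thresholds = [k + 0.5 for k in range(10)]  # 0.5, 1.5, ..., 9.5
print(all_threshold_loss(y_true=8, score=3.0, thresholds=thresholds))
print(all_threshold_loss(y_true=4, score=3.0, thresholds=thresholds))
# A score of 3 is penalised much more heavily when the true rating is 8 than when it is 4,
# which is exactly the ordinal behaviour the squared loss only partially captures.
```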

Anyway, this might be something we want to consider. This could potentially result in a better default loss function for the models in the reco module.

MaxHalford · Nov 01 '23 16:11