DecisionTree.jl

Add support for specifying the `loss` used in random forests and AdaBoost models

Open ablaom opened this issue 2 years ago • 4 comments

As far as I can tell, the `loss` parameter is only exposed for single trees. I think it would be pretty easy to add to the ensemble models.

Issue raised at #211.
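
For reference, a minimal sketch of the current state and the proposed extension. The keyword form of `build_tree` follows what is described above; the location of the built-in split criteria in `DecisionTree.util` and the `loss` keyword on `build_forest` are assumptions, the latter being exactly what this issue proposes:

```julia
using DecisionTree

X = rand(100, 4)            # feature matrix
y = rand(["a", "b"], 100)   # classification labels

# Single classification trees already expose a `loss` keyword
# (entropy is the default; `DecisionTree.util.entropy` is assumed here):
tree = build_tree(y, X; loss = DecisionTree.util.entropy)

# The ensemble builders do not. The proposal is to forward a `loss` keyword
# to each member tree, e.g. (hypothetical, not currently supported):
# forest = build_forest(y, X, 2, 10; loss = DecisionTree.util.gini)
```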

ablaom avatar Feb 12 '23 22:02 ablaom

Also, it seems that `loss` is only available for classification trees, not regression trees.

Is it possible to repurpose the existing code for classification trees to run regression tasks? It would be convenient both for

  • regression tasks with one target and a custom loss, and

  • multi-target problems (the current implementation for regression trees does not allow for features that are not Float64 - i.e., single targets).

fipelle avatar Mar 27 '23 03:03 fipelle

> multi-target problems (the current implementation for regression trees does not allow for features that are not Float64 - i.e., single targets).

Do you mean features here or, rather, labels (aka target)?

ablaom avatar Mar 27 '23 20:03 ablaom

Labels, as in this example.
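
To make that concrete, here is a minimal sketch, assuming the behaviour described in this thread (the regression `build_tree` accepts only a `Float64` label vector, i.e. a single target, and exposes no `loss` keyword); the multi-target call is commented out because it is the unsupported case:

```julia
using DecisionTree

X = rand(100, 4)

# Regression trees accept a Float64 vector of labels (a single target) and,
# per the discussion above, no `loss` keyword:
y = rand(100)
reg_tree = build_tree(y, X)

# A matrix of targets (multi-target) has no matching regression method,
# which is the limitation being raised here:
# Y = rand(100, 2)
# build_tree(Y, X)   # assumed to fail: no method for matrix-valued labels
```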

fipelle avatar Mar 27 '23 23:03 fipelle

Right. Your interesting question is a little orthogonal to the initial post, so I'm addressing it here.

ablaom avatar Mar 29 '23 21:03 ablaom