
Obscenely slow prediction

tecosaur opened this issue 3 years ago · 6 comments

Hello,

I'd love to use DecisionTree.jl for a project I'm currently working on, as it's great in a lot of ways: speedy to train, plays nicely with AbstractTrees, etc.

Unfortunately, saying the prediction performance is "not good" is putting things mildly. I did a test run with a simplified version of one of the data sets I'm working with, and recorded the training and prediction times of DecisionTree.jl as well as a number of other common random forest implementations.

| Tool | Train time | Predict time | Predict/train ratio |
| --- | ---: | ---: | ---: |
| DecisionTree.jl | 0.6s | 175s | 292 |
| randomForest | 24.4s | 4.2s | 0.17 |
| ranger | 1.9s | 0.5s | 0.26 |
| sklearn | 63s | 1.7s | 0.03 |

The competitive training time gives me hope that DecisionTree.jl can be competitive on prediction performance too 🙂.

tecosaur · May 11, 2022

Thanks for reporting. That high prediction time is intriguing.

Could you please provide enough detail here for others to reproduce the results, using publicly available or synthetic data?

ablaom · May 11, 2022

In the example above, I ran 1000 trees on a one-dimensional binary classification data set with ~70,000 entries. It should be pretty easy to generate something like this.
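
For anyone wanting to reproduce without the original data, a minimal synthetic stand-in might look like this (a sketch; the label rule and noise level are illustrative, not the exact data set used):

```julia
using DecisionTree, Random

# ~70,000 points, a single numeric feature, noisy binary labels.
Random.seed!(1)
n = 70_000
X = randn(n, 1)
y = ifelse.(X[:, 1] .+ 0.3 .* randn(n) .> 0, "a", "b")

# 1000 trees, 1 candidate feature per split (there is only one feature).
forest = @time build_forest(y, X, 1, 1000)
@time apply_forest(forest, X)   # prediction is the slow step being benchmarked
```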

If you have a look at https://github.com/tecosaur/TreeComparison and run `julia bulkreport.jl`, a number of reports will be generated. See `forest.jl` for the code running each implementation. While the results aren't as extreme, the disparity in the predict/train time ratio is still quite apparent. For example, with the Iris data:

| Tool | Train time | Predict time | Predict/train ratio |
| --- | ---: | ---: | ---: |
| DecisionTree.jl | 0.01s | 0.32s | 32 |
| randomForest | 1.28s | 0.6s | 0.47 |
| ranger | 0.08s | 0.03s | 0.38 |
| sklearn | 1.12s | 0.1s | 0.09 |
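
For a quick self-contained check on the same data, the iris set bundled with DecisionTree.jl can be used (a sketch; the tree count here is illustrative):

```julia
using DecisionTree

# Iris ships with DecisionTree.jl; coerce to concrete element types.
features, labels = load_data("iris")
features = float.(features)
labels = string.(labels)

forest = @time build_forest(labels, features, 2, 1000)
@time apply_forest(forest, features)
```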

tecosaur · May 12, 2022

OK, I've started looking into this and have identified at least two major sub-optimalities in the design: one is the implementation of tree evaluation/prediction, the other is the design of the `Leaf` struct.

I'm currently trying to replace `Leaf` with this structure:

```julia
struct Leaf{T, N}
    features :: NTuple{N, T}   # the N distinct class labels
    majority :: Int            # index into `features` of the majority class
    values   :: NTuple{N, Int} # per-class count of samples reaching this leaf
    total    :: Int            # sum(values), cached so a probability is one division
end
```

This should make prediction with probability on a leaf O(1) instead of O(n); see the sketch below. I am unfortunately finding a few parts of the code base hard to work with, though, such as `src/classification/tree.jl`: functions with 18 positional arguments should be outlawed!
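
For concreteness, here is a sketch of the difference (the old layout below is paraphrased from the current source, where a leaf stores every training label that reached it):

```julia
# Current layout: a leaf keeps all labels that landed in it, so a class
# probability needs a full pass over `values` -- O(n) in the leaf size.
struct OldLeaf{T}
    majority :: T
    values   :: Vector{T}
end
prob(leaf::OldLeaf{T}, class::T) where {T} =
    count(==(class), leaf.values) / length(leaf.values)

# Proposed layout: counts are pre-aggregated per class, so the same
# query is one tuple index and a division -- O(1).
prob(leaf::Leaf, classindex::Int) = leaf.values[classindex] / leaf.total
```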

tecosaur · May 12, 2022

I've just done a bit more than the bare minimum, and so far the performance improvement for prediction with probability is 2-10x on a sample iris dataset and a large-ish one-dimensional data set. See: https://tecosaur.com/public/treeperf.html

tecosaur · May 13, 2022

This looks like progress to me. Do you think we could get away with marking the proposed change to the `Leaf` struct as non-breaking? As far as I can tell, this is non-public API, and in any case, we are preserving the existing properties and first type parameter.

Happy to review a PR.

ablaom · May 16, 2022

Ok, I was hoping to make further improvements, but it would probably be worth PRing the basic tree improvements I've made.

tecosaur · Jun 24, 2022