smartcore
Consider implementing XGBoost
I'm submitting a
- [x] feature request.
Expected Behaviour:
smartcore should provide an implementation of gradient boosting (XGBoost) for the most popular regression tasks:
- Some resources here.
- C++ bindings exist, but it looks like there is still no pure-Rust implementation.
- forust-ml, a gradient boosting library
Let's try to find a low-footprint, feature-rich solution to provide this great tool.
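
To make the scope concrete, here is a minimal, self-contained sketch of squared-loss gradient boosting in pure Rust, using regression stumps as base learners. It does not touch smartcore's API; `Stump`, `GradientBoosting` and all parameters below are illustrative names only. A real XGBoost-style implementation adds regularised second-order split scoring, deeper trees, column/row subsampling and much more.

```rust
/// A regression stump: split on one feature at one threshold and predict
/// the mean residual of each side. The weakest usable base learner.
struct Stump {
    feature: usize,
    threshold: f64,
    left_value: f64,
    right_value: f64,
}

impl Stump {
    /// Fit the stump that minimises squared error against `residuals`.
    fn fit(x: &[Vec<f64>], residuals: &[f64]) -> Stump {
        let n_features = x[0].len();
        // (sse, feature, threshold, left mean, right mean)
        let mut best = (f64::INFINITY, 0usize, 0.0, 0.0, 0.0);
        for f in 0..n_features {
            for candidate in x {
                let t = candidate[f];
                let (mut ls, mut lc, mut rs, mut rc) = (0.0, 0.0, 0.0, 0.0);
                for (xi, &r) in x.iter().zip(residuals) {
                    if xi[f] <= t { ls += r; lc += 1.0; } else { rs += r; rc += 1.0; }
                }
                if lc == 0.0 || rc == 0.0 {
                    continue;
                }
                let (lv, rv) = (ls / lc, rs / rc);
                let sse: f64 = x.iter().zip(residuals)
                    .map(|(xi, &r)| {
                        let p = if xi[f] <= t { lv } else { rv };
                        (r - p) * (r - p)
                    })
                    .sum();
                if sse < best.0 {
                    best = (sse, f, t, lv, rv);
                }
            }
        }
        Stump { feature: best.1, threshold: best.2, left_value: best.3, right_value: best.4 }
    }

    fn predict(&self, row: &[f64]) -> f64 {
        if row[self.feature] <= self.threshold { self.left_value } else { self.right_value }
    }
}

/// Gradient boosting with squared loss: the negative gradient is the plain
/// residual, so each round fits a stump to `y - current_prediction` and adds
/// a shrunken copy of it to the additive ensemble.
struct GradientBoosting {
    base_prediction: f64,
    learning_rate: f64,
    stumps: Vec<Stump>,
}

impl GradientBoosting {
    fn fit(x: &[Vec<f64>], y: &[f64], n_rounds: usize, learning_rate: f64) -> Self {
        let base_prediction = y.iter().sum::<f64>() / y.len() as f64;
        let mut pred = vec![base_prediction; y.len()];
        let mut stumps = Vec::with_capacity(n_rounds);
        for _ in 0..n_rounds {
            let residuals: Vec<f64> = y.iter().zip(&pred).map(|(yi, pi)| yi - pi).collect();
            let stump = Stump::fit(x, &residuals);
            for (pi, xi) in pred.iter_mut().zip(x) {
                *pi += learning_rate * stump.predict(xi);
            }
            stumps.push(stump);
        }
        GradientBoosting { base_prediction, learning_rate, stumps }
    }

    fn predict(&self, row: &[f64]) -> f64 {
        self.base_prediction
            + self.learning_rate * self.stumps.iter().map(|s| s.predict(row)).sum::<f64>()
    }
}

fn main() {
    // Tiny synthetic check: y = 2 * x0 + 1 on a single feature.
    let x: Vec<Vec<f64>> = (0..20).map(|i| vec![i as f64]).collect();
    let y: Vec<f64> = x.iter().map(|row| 2.0 * row[0] + 1.0).collect();
    let model = GradientBoosting::fit(&x, &y, 200, 0.1);
    println!("prediction at x = 5: {:.2}", model.predict(&[5.0]));
}
```

Even this toy version shows the pieces a pure-Rust port needs: a tree learner fitted to residuals, shrinkage, and an additive ensemble.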
Hi @Mec-iS, could you please share an update on this thread? Have you decided how to implement this, or should we directly use the C++ bindings?
Hi,
At the moment this is not on our priority list, as it doesn't seem that users are keen to have this feature. I would be happy to follow along if somebody decides to take this up. First of all, we need a list of use cases to define when and how users may want to use this (possibly using the existing datasets; a rough sketch is given below). We don't want to provide anything through bindings; there are already other libraries that do that pretty well. Whatever we put in the library has to meet criteria that are loosely defined in CONTRIBUTING and DEVELOPERS.
If you have any ideas, please share.
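
For concreteness, a minimal use-case sketch against one of the bundled datasets, assuming smartcore 0.2-era module paths (adjust for the version in use). The `GradientBoostingRegressor` named in the comment is hypothetical; `RandomForestRegressor` stands in here to show the fit/predict flow a boosting estimator would be expected to follow:

```rust
use smartcore::dataset::boston;
use smartcore::ensemble::random_forest_regressor::RandomForestRegressor;
use smartcore::linalg::naive::dense_matrix::DenseMatrix;
use smartcore::metrics::mean_squared_error;

fn main() {
    // Load the bundled Boston housing data and reshape the flat buffer
    // into a (num_samples x num_features) matrix.
    let data = boston::load_dataset();
    let x = DenseMatrix::from_array(data.num_samples, data.num_features, &data.data);
    let y = data.target;

    // Today: a random forest. The requested feature would plug in the same way,
    // e.g. (hypothetically) GradientBoostingRegressor::fit(&x, &y, Default::default()).
    let model = RandomForestRegressor::fit(&x, &y, Default::default()).unwrap();
    let y_hat = model.predict(&x).unwrap();
    println!("training MSE: {}", mean_squared_error(&y, &y_hat));
}
```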