
discrete_boost/learning_rate

a-taherkhani opened this issue 7 years ago • 1 comment

When the learning rate is set to a value greater than 1, the AdaBoost accuracy using the 'SAMME' algorithm is reduced compared to the original 'sklearn.ensemble.AdaBoostClassifier'. I traced the difference to the 'discrete_boost' function. The equation should be revised as follows: estimator_weight = self.learning_rate_ * (np.log((1. - estimator_error) / estimator_error) + np.log(self.n_classes_ - 1.))
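For reference, the revised line above parenthesizes the SAMME weight so the learning rate scales both the log-odds term and the multi-class correction log(K - 1). A minimal sketch of that computation, with illustrative values (the error rate, learning rate, and class count below are assumptions, not from this repo):

```python
import numpy as np

def samme_estimator_weight(estimator_error, learning_rate, n_classes):
    # SAMME estimator weight: learning_rate scales the entire
    # log-odds term, including the log(K - 1) multi-class correction.
    return learning_rate * (
        np.log((1.0 - estimator_error) / estimator_error)
        + np.log(n_classes - 1.0)
    )

def unparenthesized_weight(estimator_error, learning_rate, n_classes):
    # Variant where learning_rate scales only the log-odds term;
    # the two agree when learning_rate == 1 but diverge otherwise.
    return (learning_rate
            * np.log((1.0 - estimator_error) / estimator_error)
            + np.log(n_classes - 1.0))

# Illustrative: error 0.3, 3 classes, learning rate above 1.
print(samme_estimator_weight(0.3, 1.5, 3))
print(unparenthesized_weight(0.3, 1.5, 3))
```

With a learning rate of exactly 1 the two variants coincide, which would explain why a test at the default setting shows no difference.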

a-taherkhani avatar Jan 29 '18 15:01 a-taherkhani

You mean that `1` and `1.` give different results when the learning rate is greater than 1? In my test there is no difference, so could you please provide more details? By the way, do you use py2 or py3?

jinxin0924 avatar Jan 30 '18 02:01 jinxin0924