Alex Rogozhnikov

187 comments of Alex Rogozhnikov

Hello Dan, here is how weight prediction is implemented:

```python
In [2]: GBReweighter.predict_weights??
Signature: GBReweighter.predict_weights(self, original, original_weight=None)
Source:
    def predict_weights(self, original, original_weight=None):
        """
        Returns corrected weights. Result is computed as...
```

This is a frequent question (or rather a family of questions) from physicists who are interested in applying reweighting to one more data sample. Below I give solutions for different situations. ##...

Hi @kpedro88 Your analysis is correct: only the leaf id predicted by the tree is important, not the leaf values; the leaf values are stored separately and then used, as `(tree, leaf_values)`. So,...
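The `(tree, leaf_values)` scheme above can be sketched as follows. This is an illustrative toy, not hep_ml's actual code: a fitted tree only routes each sample to a leaf id, and the predicted values for those leaves live in a separate lookup.

```python
# Illustrative sketch of a (tree, leaf_values) pair: the tree supplies only
# leaf ids; predicted values are stored and looked up separately.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X = np.arange(10, dtype=float)[:, None]
y = (X.ravel() > 4).astype(float)

tree = DecisionTreeRegressor(max_depth=1).fit(X, y)

leaf_ids = tree.apply(X)                      # which leaf each sample lands in
# leaf values kept outside the tree, e.g. as {leaf_id: value}
leaf_values = {leaf: y[leaf_ids == leaf].mean() for leaf in np.unique(leaf_ids)}

pred = np.array([leaf_values[leaf] for leaf in leaf_ids])
```

Here `pred` coincides with `tree.predict(X)` because the stored values are the leaf means, but the point is that any values could be stored against the same leaf ids without retraining the tree.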

Hi Gino, for uBoost, convergence is something poorly defined.
- first, uBoost has no optimization target (contrary, say, to AdaBoost, GBDT, GB+FL)
- second, the way it operates quite often...

Hi Martha, negative weights aren't friendly towards ML because they drive the optimization to a non-convex, unbounded problem, so you should not expect them to work right for ML models (sometimes they do,...
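A minimal numeric illustration of the "unbounded optimization" point (numbers here are made up for the demonstration): with one negative sample weight, the weighted squared loss can be pushed down without bound, so no optimum exists.

```python
# Sketch: a single negative sample weight makes weighted MSE unbounded below.
import numpy as np

y = np.array([0.0, 1.0])
w = np.array([-2.0, 1.0])        # one negative weight

def weighted_mse(pred):
    return np.sum(w * (pred - y) ** 2)

# moving the prediction for the negatively-weighted sample further away
# keeps decreasing the loss, without bound
losses = [weighted_mse(np.array([c, 1.0])) for c in (1, 10, 100)]
```

Each step away from the negatively-weighted target lowers the loss, which is why gradient-based models can behave arbitrarily badly here.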

@alexpearce Hey Alex, I don't think it is so different for trees. Things may go arbitrarily bad in very simple situations:

```python
import numpy
from sklearn.ensemble import GradientBoostingRegressor

reg = GradientBoostingRegressor(n_estimators=100, max_depth=1).fit(
    numpy.arange(2)[:, None], numpy.arange(2), sample_weight=[-0.9999999999, 1]
)
```
...

@alexpearce Well, in such a case you should check the sum in each particular leaf of the tree (since we are aggregating over samples in a leaf). I see potential complaints like...
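The per-leaf check suggested above can be sketched like this (a toy, with illustrative names; for simplicity the tree here is fitted without weights, and the signed weights are only summed afterwards):

```python
# Sketch: sum the signed sample weights inside each leaf of a fitted tree;
# a non-positive aggregated weight flags a problematic leaf.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

X = np.arange(4, dtype=float)[:, None]
y = np.arange(4, dtype=float)
w = np.array([-0.5, 1.0, 1.0, 1.0])           # one negative weight

tree = DecisionTreeRegressor(max_depth=2).fit(X, y)

leaf_ids = tree.apply(X)                       # leaf id for every sample
leaf_weight_sums = {leaf: w[leaf_ids == leaf].sum()
                    for leaf in np.unique(leaf_ids)}
bad_leaves = [leaf for leaf, s in leaf_weight_sums.items() if s <= 0]
```

With this data the tree isolates the negatively-weighted sample in its own leaf, so exactly one leaf is flagged; in general a leaf can also mix positive and negative weights and only the aggregate matters.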

Have a look here: http://stackoverflow.com/questions/21342931/error-importing-theano If it still doesn't work, you can ask at the theano user group or on StackOverflow. Or try conda.

@Falengord did you resolve the problem?

Well, it is bad but expected behavior. Currently, weighted quantiles are computed, which doesn't work well with non-continuous columns. You can explicitly pass the edges to use for this...
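A small sketch of why weighted-quantile edges break down on a discrete column (the function here is illustrative, not the library's implementation): when most of the weight sits on a single value, several quantiles coincide and the resulting edges are degenerate, which is why passing edges explicitly helps.

```python
# Sketch: weighted-quantile bin edges collapse on a discrete column.
import numpy as np

def weighted_quantile_edges(values, weights, n_bins):
    """Inner bin edges placed at weighted quantiles (illustrative helper)."""
    order = np.argsort(values)
    values = np.asarray(values, dtype=float)[order]
    weights = np.asarray(weights, dtype=float)[order]
    cdf = np.cumsum(weights) / np.sum(weights)
    qs = np.linspace(0, 1, n_bins + 1)[1:-1]
    return np.interp(qs, cdf, values)

# discrete column: most of the mass sits on the value 0
x = np.array([0, 0, 0, 0, 0, 0, 1, 2])
w = np.ones_like(x, dtype=float)
edges = weighted_quantile_edges(x, w, 4)      # degenerate, repeated edges
# passing edges explicitly (e.g. between the discrete values) avoids this
```

On a continuous column the same helper gives sensible, distinct edges; the pathology is specific to heavy ties in the data.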