Iaroslav Zeigerman

22 comments by Iaroslav Zeigerman

Hm, good question. I briefly looked into the code and couldn't spot any obvious sources of non-determinism.

Hey @chris-smith-zocdoc! This is some great investigation and PoC 👍 There are a few things I'd like to understand better: 1. Can this just be solved with one-hot encoding? How...
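
To make the first question concrete, here is a minimal sketch of what I mean by one-hot encoding, using scikit-learn's `OneHotEncoder` (the column and category values are made up purely for illustration):

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder

# Made-up categorical column, purely for illustration.
colors = np.array([["red"], ["green"], ["blue"], ["green"]])

encoder = OneHotEncoder(handle_unknown="ignore")
encoded = encoder.fit_transform(colors).toarray()

print(encoder.categories_)  # the categories discovered during fit
print(encoded)              # one binary column per category
```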

@chris-smith-zocdoc sorry for taking so long. I'm still thinking this through. Although the proposal sounds reasonable, I'm still not sure what to do about C.

@skjerns thanks for reporting this issue! This is indeed an interesting use case, and the code generated by `m2cgen` in this scenario produces an array with class probabilities where classes...
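
For illustration, a rough sketch (not the actual generated code) of how such a probability array is usually consumed; the `score` function below is a stand-in for what `m2cgen` generates, and the numbers are made up:

```python
import numpy as np

# Stand-in for the score(input) function m2cgen generates for a
# multiclass model; the probabilities here are made up.
def score(input):
    return [0.1, 0.7, 0.2]

probabilities = np.asarray(score([1.0, 2.0, 3.0]))
predicted_class_index = int(np.argmax(probabilities))
print(predicted_class_index)  # 1
```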

Great, I'm glad it was helpful. By "manual" I meant altering the `code` variable, not editing the source file manually :) Sorry about the confusion.
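
Something along these lines (a minimal sketch; the `replace` call is just a placeholder for whatever tweak is actually needed):

```python
import m2cgen as m2c
from sklearn.linear_model import LinearRegression

model = LinearRegression().fit([[1.0], [2.0], [3.0]], [1.0, 2.0, 3.0])

# The generated code lives in a plain string; tweak it here...
code = m2c.export_to_python(model)
code = code.replace("def score(", "def predict(")  # placeholder tweak

# ...and only then write it out, instead of editing the file by hand.
with open("model.py", "w") as f:
    f.write(code)
```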

@Mohamed-Rafik-Bouguelia Yeah, I am pretty sure. What makes you think otherwise? Do you perhaps have an example that proves the opposite? **UPD** I don't see how the mentioned comment is...

Hey @chris-smith-zocdoc, thanks for reporting this! I think support for the `Booster` object is worth adding to `m2cgen`. As part of this effort I'd also suggest adding a direct `Booster`...

Thanks, @chris-smith-zocdoc! You can begin with the following lines:
- https://github.com/BayesWitnesses/m2cgen/blob/master/m2cgen/assemblers/boosting.py#L139 for LightGBM
- https://github.com/BayesWitnesses/m2cgen/blob/master/m2cgen/assemblers/boosting.py#L86 for XGBoost

This is where we're accessing the underlying `Booster` instances from the `scikit-learn`-compatible wrappers....
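
For reference, a minimal sketch of how those wrappers expose the underlying boosters (toy data, for illustration only):

```python
import numpy as np
from lightgbm import LGBMRegressor
from xgboost import XGBRegressor

X, y = np.random.rand(50, 3), np.random.rand(50)

# The scikit-learn compatible wrappers expose the underlying boosters:
lgbm_booster = LGBMRegressor(n_estimators=5).fit(X, y).booster_     # lightgbm.Booster
xgb_booster = XGBRegressor(n_estimators=5).fit(X, y).get_booster()  # xgboost.Booster

print(type(lgbm_booster), type(xgb_booster))
```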

Hey @ehoppmann! Thank you so much for reporting this issue and for your kind feedback! When you say "transformed" do you mean a probability value between 0 and 1? If...

@ehoppmann I see, thank you for sharing! The current logic in `m2cgen` implies that the sigmoid should only be applied if `XGBClassifier` was passed, which is clearly wrong based on...
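
For clarity, the transformation in question is the standard sigmoid applied to the raw margin; a minimal sketch with a made-up margin value:

```python
import math

def sigmoid(margin):
    # Standard logistic function: maps a raw margin (log-odds) to (0, 1).
    return 1.0 / (1.0 + math.exp(-margin))

raw_margin = 0.37  # made-up raw score from a binary model
print(sigmoid(raw_margin))  # probability between 0 and 1
```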