conifer
xgboost precision
Hi,
I found an issue with the xgboost example https://github.com/thesps/conifer/blob/master/examples/xgboost_to_hls.py: y_hls and y_xgb aren't close.
from scipy.special import expit  # sigmoid, to map raw scores to probabilities

y_hls = expit(model.decision_function(X_test))  # conifer HLS C-simulation output
y_xgb = bst.predict(dtest)                      # reference xgboost prediction
diff = y_xgb - y_hls
print(diff[abs(diff) > 0.05])
[-0.13502171 0.06955624 -0.1099674 -0.2427507 -0.14311438 -0.0606428
0.08703702 -0.054607 -0.41907781 -0.12813512 0.28282228 -0.21637464
0.31876776 0.26711339 -0.14989728 -0.05887845 -0.06809392 0.12303647
-0.08492118 -0.07751923 -0.05739652 -0.11599926 -0.14425865 -0.08459726
-0.12540119 -0.06227853 -0.27874367 -0.29141373 0.12563779 -0.22311496
-0.13287621 -0.17924546 -0.10041202]
Since the output is normalized to 1 (the predictions are probabilities), an absolute error of up to 0.31 seems too high for practical usage. Is this a known issue?
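To summarize how large the disagreement is, here is a quick numpy check built on the snippet above (the 0.05 tolerance is just the cutoff used in the print):

import numpy as np

diff = y_xgb - y_hls
print('max abs error:      ', np.max(np.abs(diff)))
print('fraction > 0.05:    ', np.mean(np.abs(diff) > 0.05))
print('allclose(atol=0.05):', np.allclose(y_xgb, y_hls, atol=0.05))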
Hi, these examples are not optimized at all in terms of the precision used, which can affect numerical accuracy. Did you try any tuning?
Nope, just the vanilla example out of the box.
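For anyone finding this later: the knob the tuning refers to is the fixed-point type in the backend config. A minimal sketch of what that might look like, assuming the conifer.model / vivadohls API used by this example and a 'Precision' config key holding an ap_fixed<total,integer> string (both assumptions worth checking against conifer.backends.vivadohls.auto_config() for your conifer version); bst, X_test, and dtest come from the example script:

import datetime
from scipy.special import expit
import conifer

# Start from the backend's default configuration
cfg = conifer.backends.vivadohls.auto_config()
cfg['OutputDir'] = 'prj_{}'.format(int(datetime.datetime.now().timestamp()))
# Widen the fixed-point type used for thresholds and scores
# ('Precision' key and its format are assumptions -- inspect auto_config() to confirm)
cfg['Precision'] = 'ap_fixed<32,16>'

# Re-convert, compile, and re-run the HLS C simulation with the wider precision
model = conifer.model(bst, conifer.converters.xgboost, conifer.backends.vivadohls, cfg)
model.compile()
y_hls = expit(model.decision_function(X_test))

Wider word lengths cost more FPGA resources, so the usual approach is to scan a few widths and pick the narrowest one whose C-simulation output agrees with xgboost to the tolerance you need.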