xgboost
clarification needed for model saving/loading
A model saved via XGBoosterSaveModelToBuffer can be loaded by creating a booster object with the parameter model_buffer; however, attempting to load the same model into a booster object using XGBoosterLoadModelFromBuffer gives an error such as
[14:09:10] /workspace/srcdir/xgboost/src/learner.cc:764: Check failed: mparam_.num_feature != 0 (0 vs. 0) : 0 feature is supplied. Are you using raw Booster interface?
Stack trace:
[bt] (0) /home/expandingman/.julia/artifacts/dcee79537e0e0f3f2ef6acf4b886a1dd6adcc6c8/lib/libxgboost.so(+0x660dd4) [0x7fdd0aa60dd4]
[bt] (1) /home/expandingman/.julia/artifacts/dcee79537e0e0f3f2ef6acf4b886a1dd6adcc6c8/lib/libxgboost.so(xgboost::LearnerConfiguration::ConfigureNumFeatures()+0x2af) [0x7fdd0aa74eff]
[bt] (2) /home/expandingman/.julia/artifacts/dcee79537e0e0f3f2ef6acf4b886a1dd6adcc6c8/lib/libxgboost.so(xgboost::LearnerConfiguration::Configure()+0x535) [0x7fdd0aa82605]
[bt] (3) /home/expandingman/.julia/artifacts/dcee79537e0e0f3f2ef6acf4b886a1dd6adcc6c8/lib/libxgboost.so(xgboost::LearnerImpl::Predict(std::shared_ptr<xgboost::DMatrix>, bool, xgboost::HostDeviceVector<float>*, int, int, bool, bool, bool, bool, bool)+0x69) [0x7fdd0aa7a769]
[bt] (4) /home/expandingman/.julia/artifacts/dcee79537e0e0f3f2ef6acf4b886a1dd6adcc6c8/lib/libxgboost.so(XGBoosterPredictFromDMatrix+0x2be) [0x7fdd0a77c62e]
[bt] (5) [0x7fdeda52bc64]
[bt] (6) [0x7fdeda52bcc4]
[bt] (7) /opt/julia/bin/../lib/julia/libjulia-internal.so.1.10(ijl_apply_generic+0x2ae) [0x7fdeda846a0e]
[bt] (8) [0x7fdeda50c2ae]
This may be a bug in XGBoosterLoadModelFromBuffer, but it seems more likely that I just don't understand how to use that function properly. This comment in the python wrapper would seem to suggest that this is not the intended use of XGBoosterLoadModelFromBuffer. At the very least, I find this very confusing, since XGBoosterSaveModelToBuffer and XGBoosterLoadModelFromBuffer sure sound like they are inverses of each other. I think some clarification in the docs would be useful (I didn't see any, but I might be missing it).
For context, I'm one of the maintainers of the Julia wrapper, and confusion over serialization methods has popped up a number of times, most recently here.
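For reference, here is a minimal sketch in plain C of the round trip I expect to work. It is not taken from the Julia wrapper; the toy data, the JSON config strings, and the error-checking macro are illustrative assumptions on my part, and I've only checked it against a recent C API header:

```c
/* Sketch: train a tiny model, save it to an in-memory buffer, load the
 * buffer into a fresh booster, and predict with the restored booster.
 * Toy data, config strings, and the SAFE macro are illustrative only. */
#include <stdio.h>
#include <stdlib.h>
#include <xgboost/c_api.h>

#define SAFE(call)                                      \
  do {                                                  \
    if ((call) != 0) {                                  \
      fprintf(stderr, "%s\n", XGGetLastError());        \
      exit(EXIT_FAILURE);                               \
    }                                                   \
  } while (0)

int main(void) {
  /* Toy training data: 4 rows, 2 features. */
  float X[8] = {0.f, 1.f, 1.f, 0.f, 0.f, 0.f, 1.f, 1.f};
  float y[4] = {0.f, 1.f, 0.f, 1.f};

  DMatrixHandle dtrain;
  SAFE(XGDMatrixCreateFromMat(X, 4, 2, -1.0f, &dtrain));
  SAFE(XGDMatrixSetFloatInfo(dtrain, "label", y, 4));

  /* Train a few rounds so the booster actually has features configured. */
  BoosterHandle booster;
  SAFE(XGBoosterCreate(&dtrain, 1, &booster));
  for (int iter = 0; iter < 3; ++iter) {
    SAFE(XGBoosterUpdateOneIter(booster, iter, dtrain));
  }

  /* Serialize the model to an in-memory buffer (UBJSON format here). */
  bst_ulong len = 0;
  char const *buf = NULL;
  SAFE(XGBoosterSaveModelToBuffer(booster, "{\"format\": \"ubj\"}", &len, &buf));

  /* Load the buffer back into a *fresh* booster. */
  BoosterHandle restored;
  SAFE(XGBoosterCreate(NULL, 0, &restored));
  SAFE(XGBoosterLoadModelFromBuffer(restored, buf, len));

  /* Predict with the restored booster to confirm the round trip worked. */
  bst_ulong const *out_shape = NULL;
  bst_ulong out_dim = 0;
  float const *out_result = NULL;
  SAFE(XGBoosterPredictFromDMatrix(
      restored, dtrain,
      "{\"type\": 0, \"training\": false, \"iteration_begin\": 0, "
      "\"iteration_end\": 0, \"strict_shape\": false}",
      &out_shape, &out_dim, &out_result));
  printf("first prediction from restored booster: %f\n", out_result[0]);

  SAFE(XGBoosterFree(restored));
  SAFE(XGBoosterFree(booster));
  SAFE(XGDMatrixFree(dtrain));
  return 0;
}
```

If the two functions are not meant to be inverses of each other in this sense, even a short note in the C API docs spelling that out would help downstream wrappers.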
It should be the right way. I don't know why there's an error (assuming you trained the model before serialization). I'll look into it; is there an easy way I can reproduce it?
Sorry, I seem to have gotten myself confused somehow. I thought I had reproduced the problem, and the error above was real, but when I went to reproduce it today to report back here, I couldn't. At this point I'm not sure whether this is a mistake on my end, an intermittent (or hardware-dependent) issue in libxgboost, or a bug in the Julia wrapper. Now I am kicking myself for not saving exactly what I did to get the error above. I will try to think of some other way to reproduce it and report back here if I find one.
@ExpandingMan Any updates on this issue?
Feel free to reopen if there's a way to reproduce.
Sorry for not responding. Yeah, I never ran into this again; I don't know what I did.