Support loading a LightGBM model from a file
Is your feature request related to a problem? Please describe. Our data scientists are training LightGBM models in Python, and our inference runtime is in C#. We are very interested in using ML.NET to run inference; however, loading the model from a file is not yet supported in ML.NET. Is there an obstacle to adding this binding, which is already available in the LightGBM C++ API?
Describe the solution you'd like Add an LGBM_BoosterLoadModelFromString binding to WrappedLightGbmInterface; the function is declared in https://github.com/Microsoft/LightGBM/blob/master/include/LightGBM/c_api.h
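For illustration, here is a minimal sketch of what such a P/Invoke binding could look like from C#. The library name `lib_lightgbm`, the class names `LightGbmNative` and `Example`, the calling convention, and the `LoadBooster` helper are assumptions for this sketch, not the actual ML.NET code; the native signature follows `LGBM_BoosterLoadModelFromString(const char* model_str, int* out_num_iterations, BoosterHandle* out)` from c_api.h.

```csharp
using System;
using System.Runtime.InteropServices;

internal static class LightGbmNative
{
    // Assumed name of the LightGBM shared library shipped alongside the app.
    private const string DllName = "lib_lightgbm";

    // Mirrors the C API declared in include/LightGBM/c_api.h:
    // int LGBM_BoosterLoadModelFromString(const char* model_str,
    //                                     int* out_num_iterations,
    //                                     BoosterHandle* out);
    // Cdecl is assumed here, as is typical for a plain C export.
    [DllImport(DllName, CallingConvention = CallingConvention.Cdecl)]
    public static extern int LGBM_BoosterLoadModelFromString(
        [MarshalAs(UnmanagedType.LPStr)] string modelString,
        out int outNumIterations,
        out IntPtr boosterHandle);
}

internal static class Example
{
    // Hypothetical usage: read a model saved by LightGBM's save_model in Python
    // and create a booster handle from its text representation.
    public static IntPtr LoadBooster(string modelPath)
    {
        string modelString = System.IO.File.ReadAllText(modelPath);
        int result = LightGbmNative.LGBM_BoosterLoadModelFromString(
            modelString, out int numIterations, out IntPtr handle);
        if (result != 0)
            throw new InvalidOperationException("LGBM_BoosterLoadModelFromString failed.");
        return handle;
    }
}
```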
Describe alternatives you've considered An alternative is to convert our models to ONNX and not use the ML.NET runtime.
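For reference, a rough sketch of the inference side of that ONNX alternative, assuming the LightGBM model has already been converted to ONNX (for example with onnxmltools in Python) and using the Microsoft.ML.OnnxRuntime package. The file path, the single-row float input, and the choice of the first output as the prediction are placeholders for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

internal static class OnnxAlternative
{
    public static float[] Predict(string onnxModelPath, float[] features)
    {
        using var session = new InferenceSession(onnxModelPath);

        // The real input name comes from the converter and can be
        // inspected via session.InputMetadata; here we take the first one.
        var inputName = session.InputMetadata.Keys.First();
        var tensor = new DenseTensor<float>(features, new[] { 1, features.Length });
        var inputs = new List<NamedOnnxValue>
        {
            NamedOnnxValue.CreateFromTensor(inputName, tensor)
        };

        using var results = session.Run(inputs);
        // Return the first output as a flat float array (e.g. a regression score).
        return results.First().AsEnumerable<float>().ToArray();
    }
}
```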
This isn't something we can do in the near future, as it requires us to update some of our existing APIs and move to a more recent version of our LightGBM library, but it is something we will consider going forward. Thanks for the suggestion!
Hey @prodanovic, just wanted to let you know that this is now something we are working on, and it is actually in the testing phase. If things go well, it should be in a PR pretty soon. It still targets version 2.3.1 of LightGBM; the update to the latest version will come later this year.
@michaelgsharp PR #6569 is very nice and welcome! Do you know whether it should work with LightGBM 3.3.x models, or whether it would be likely to produce errors or even incorrect results? I could not find the answer in the other comments. I am not up to date on this, but my impression is that the model format has not changed, only the code that produces the model.
Hey @torronen, I'm not sure. If the LightGBM model format itself hasn't changed, or even if things have changed but the major headers haven't, in theory it should work. That being said, I haven't tested it at all. We will be updating to the latest version of LightGBM this year though, so if things don't work out, it shouldn't be too long before you can do it.
There is one more minor thing I need to add to that PR before it can go in. I should get that done and pushed in the next few days.
@torronen Actually, if you do test with 3.3.x models after that PR goes in, will you update this thread with the results of your test?
@michaelgsharp sure!