MLServer
Improve loading of white-box explainers' models, and add catboost support
Issue
Loading of tree-based models in the new white-box explainer runtime SKLearnRuntime (added in https://github.com/SeldonIO/MLServer/pull/1279) could be improved.
In SKLearnRuntime._get_inference_model, lightgbm and sklearn models must be loaded from joblib files (since we require models exposing the scikit-learn API), whilst xgboost models can be loaded from their native bst format. We could make things more robust by using the model's respective runtime (e.g. mlserver_lightgbm) to load the model. However, this requires knowing which model library generated a given artefact, possibly from a new model_flavour parameter, or via a model-settings.json in the model directory itself.
Additionally, catboost model support was left out of https://github.com/SeldonIO/MLServer/pull/1279, and needs including.
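The model-settings.json route could look roughly like the following: inspect the file in the model directory and map its implementation field back to a flavour. The infer_model_flavour helper and the substring mapping (including catboost) are assumptions for illustration, not existing MLServer behaviour; it only assumes the file carries the usual "implementation" field (e.g. "mlserver_lightgbm.LightGBMModel").

```python
import json
import os
from typing import Optional


def infer_model_flavour(model_dir: str) -> Optional[str]:
    """Best-effort guess of the model library from a model-settings.json
    found in the model directory; returns None when no hint is available."""
    settings_path = os.path.join(model_dir, "model-settings.json")
    if not os.path.exists(settings_path):
        return None
    with open(settings_path) as f:
        implementation = json.load(f).get("implementation", "")
    # Match the runtime's module name back to a flavour, e.g.
    # "mlserver_xgboost.XGBoostModel" -> "xgboost".
    for flavour in ("xgboost", "lightgbm", "catboost", "sklearn"):
        if flavour in implementation:
            return flavour
    return None
```

The caller could then fall back to the current format-based heuristics whenever this returns None, so existing deployments without a model-settings.json keep working.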
Related
See https://github.com/SeldonIO/MLServer/pull/1279#discussion_r1252798610 for original discussion.