
An inference server for your machine learning models, including support for multiple frameworks, multi-model serving and more

Results: 304 MLServer issues

Discussion points from: Graduation of Alibi-detect runtime in MLServer

When trying to run inference on MLServer, I get the error below; I think MLServer is failing to convert the InferenceRequest into an MLflow-compatible input. `mlflow.exceptions.MlflowException: Expected input to be DataFrame...
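A common cause of this kind of error is sending raw V2 tensors without a content-type hint, so the runtime cannot tell the inputs should become DataFrame columns. Below is a minimal sketch (not the reporter's actual payload; the column names and values are illustrative) of a V2 inference request annotated with MLServer's `"content_type": "pd"` parameter, which asks the pandas codec to decode the inputs into a DataFrame:

```python
# Sketch of a V2 inference request body for MLServer's MLflow runtime.
# "content_type": "pd" is MLServer's hint to decode the inputs into a
# pandas DataFrame; each input becomes one column. Names/values are
# made up for illustration.
payload = {
    "parameters": {"content_type": "pd"},
    "inputs": [
        {
            "name": "sepal_length",
            "shape": [2, 1],
            "datatype": "FP64",
            "data": [5.1, 4.9],
        },
        {
            "name": "petal_width",
            "shape": [2, 1],
            "datatype": "FP64",
            "data": [0.2, 0.1],
        },
    ],
}
# This JSON would be POSTed to /v2/models/<model-name>/infer.
```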

The xgboost models from `model_dir` are not loading; they throw an error `Invalid model uri provided`. It works if we provide `model_uri`. The sklearn models load properly without `model_uri`. xgboost...
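As a workaround until directory-based discovery works, the model URI can be set explicitly in a `model-settings.json` for the XGBoost runtime; a sketch, assuming a local booster file (the model name and path are illustrative):

```json
{
    "name": "my-xgboost-model",
    "implementation": "mlserver_xgboost.XGBoostModel",
    "parameters": {
        "uri": "./model.bst"
    }
}
```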

In order to provide feature parity with the current implementation of the Seldon Core Python v1 servers (service orchestrator), it is required to support CloudEvents headers, which would include the standard...

Ported from https://github.com/SeldonIO/seldon-core/issues/1525

I have ported an existing seldon-core model service to MLServer but found the performance dropped a lot. Below are my load test config and results for the REST endpoint, seldon-core 2...