MLServer
An inference server for your machine learning models, including support for multiple frameworks, multi-model serving and more
If the `parameters.uri` field in `model-settings.json` refers to a directory, `mlserver.utils.get_model_uri` can search for files with given names (via the `wellknown_filenames` kwarg). Since these filenames must be explicitly named, only...
Starting the MLServer container with the wrong GID may cause permission errors accessing mounted file systems.
Hi, I was trying to parallelize model inference off a Kafka topic with multiple server instances. I couldn't get it working until I modified MLServer to receive configuration that sets...
When load testing an MLServer (deployed on AWS EKS with SC-V2) with [this setup](https://gist.github.com/edfincham/4b7e33a2685e8350e74dae3be54937bb) I get the following error whenever the size of the batches in my load tests exceeds...
I've been trying to deploy an ML model locally with MLServer. My model is saved to an MLflow model registry, so I start up my server as follows - ``` export...
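For context, serving a registered MLflow model with MLServer is typically driven by a `model-settings.json` pointing the MLflow runtime at the registry URI. The model name and version below are placeholders, not values from the issue:

```json
{
  "name": "my-mlflow-model",
  "implementation": "mlserver_mlflow.MLflowRuntime",
  "parameters": {
    "uri": "models:/my-model/1"
  }
}
```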
Hi all, I'm using `mlflow` with `mlserver`, and a model with a `pandas.DataFrame` input schema that has a column of `string` type. However, it doesn't seem to be supported by `mlserver.codecs.pandas`: ``` [nav]...
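The underlying difficulty is that the V2 inference protocol has no dedicated string datatype, so string columns are commonly carried as `BYTES`. A minimal sketch of this type mapping (illustrative only, not MLServer's actual codec):

```python
def encode_column(name: str, values: list) -> dict:
    """Map a column of Python values to a V2-protocol-style tensor dict.

    Strings have no native V2 datatype, so they fall back to BYTES.
    """
    # Check bool before int: bool is a subclass of int in Python.
    if all(isinstance(v, bool) for v in values):
        datatype = "BOOL"
    elif all(isinstance(v, int) for v in values):
        datatype = "INT64"
    elif all(isinstance(v, float) for v in values):
        datatype = "FP64"
    else:
        datatype = "BYTES"
        values = [str(v) for v in values]
    return {
        "name": name,
        "datatype": datatype,
        "shape": [len(values)],
        "data": values,
    }
```

A codec that lacks this string-to-BYTES fallback would reject DataFrames with string columns, which matches the behaviour reported here.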
Some models on the HuggingFace hub need `trust_remote_code` set to `True` in order to run. For example, trying to run this model with MLServer https://huggingface.co/tiiuae/falcon-7b-instruct we obtain...
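What the issue is asking for would presumably look something like the settings below. The `trust_remote_code` key is the *proposed* addition, not a documented MLServer option at the time of the issue; the surrounding `extra`/`task`/`pretrained_model` structure follows the `mlserver-huggingface` runtime's conventions:

```json
{
  "name": "falcon-7b-instruct",
  "implementation": "mlserver_huggingface.HuggingFaceRuntime",
  "parameters": {
    "extra": {
      "task": "text-generation",
      "pretrained_model": "tiiuae/falcon-7b-instruct",
      "trust_remote_code": true
    }
  }
}
```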
This PR introduces a new tracing provider, enabling dynamic attachment of native (BPF, Systemtap) probes at runtime. The goal of the provider is to allow correlation between MLServer-specific events and...
Building custom runtimes using `mlserver build` is fantastic, but it also leverages conda, and some of our builds have been known to take up to 8 hours due to the conda...
## Issue

Loading of tree-based models in the new white-box explainer runtime `SKLearnRuntime` (added in https://github.com/SeldonIO/MLServer/pull/1279) could be improved. In `SKLearnRuntime._get_inference_model`, `lightgbm` and `sklearn` models must be loaded from...