
Python client library for mlserver

Open idlefella opened this issue 1 year ago • 1 comment

Hello everyone,

I was exploring using mlserver to deploy ML models as a REST service. I noticed an issue: if you want to use mlserver's codecs (such as the numpy or pandas codecs) from Python client code, you must add mlserver itself as a dependency. That pulls in many transitive dependencies, such as fastapi, aiokafka, uvicorn, etc., which significantly inflates the dependency footprint. Would it not be more practical to have a separate mlserver-client package that exclusively contains the codecs and types?

Or how do you currently integrate mlserver with another microservice? Do you manually create the Open Inference Protocol v2 JSON?
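For reference, the "manual" approach mentioned above can be fairly compact. Below is a minimal sketch of encoding a numpy array into an Open Inference Protocol v2 request body using only plain dicts, with no mlserver dependency. The input name `input-0`, the model name `my-model`, and the endpoint URL are hypothetical placeholders, not anything defined by mlserver itself:

```python
import json

import numpy as np


def numpy_to_v2_request(name: str, array: np.ndarray) -> dict:
    """Encode a float32 numpy array as an Open Inference Protocol v2 payload.

    This mirrors what a numpy codec does, but builds the JSON structure
    by hand so the client needs no mlserver dependency.
    """
    return {
        "inputs": [
            {
                "name": name,  # hypothetical tensor name
                "shape": list(array.shape),
                "datatype": "FP32",  # V2 datatype string for float32
                "data": array.flatten().tolist(),
            }
        ]
    }


payload = numpy_to_v2_request("input-0", np.zeros((2, 3), dtype=np.float32))
body = json.dumps(payload)
# The body would then be POSTed to the server's V2 inference endpoint,
# e.g. http://localhost:8080/v2/models/my-model/infer
```

This works for simple tensors, but it is exactly the kind of boilerplate (plus datatype mapping and decoding of responses) that a slim official client package could take care of.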

idlefella avatar Aug 22 '24 11:08 idlefella

Has this ever been explored since? We are currently maintaining an internal package that contains just the few codecs, settings, and errors we use, but an official modularization would be much nicer.

Joelius300 avatar Jun 04 '25 12:06 Joelius300