MLServer installation fails on macOS due to tritonclient 2.61+ perf-analyzer dependency
## Problem

MLServer 1.7.1 cannot be installed on macOS when poetry/pip resolves to `tritonclient >= 2.61.0`, because tritonclient 2.61+ introduced `perf-analyzer` as a hard dependency, and perf-analyzer only provides Linux wheels (no macOS support).
## Error

```
RuntimeError: Unable to find installation candidates for perf-analyzer (2.59.1)
```
## Root Cause

- MLServer requires `tritonclient >= 2.42` with `[http]` extras
- tritonclient 2.61.0+ added `perf-analyzer` as a mandatory dependency
- perf-analyzer only distributes `manylinux` wheels (no macOS/Darwin wheels available)
- This blocks all macOS users from installing MLServer with default dependency resolution
## Impact
- Prevents local development on macOS
- Forces developers to use Docker/Linux VMs for basic MLServer development
- Affects the developer experience for a significant portion of the ML community
## Current Workaround

Pin tritonclient to pre-2.61 versions:

```toml
tritonclient = {version = ">=2.42,<2.61", extras = ["http"]}
```
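After applying the pin, it can be useful to sanity-check which tritonclient version the resolver actually installed. The helper below is my own sketch (not part of MLServer or tritonclient); it just tests a version string against the `>=2.42,<2.61` range that still ships macOS-compatible wheels:

```python
from importlib.metadata import version  # stdlib, Python 3.8+

def in_safe_range(ver: str) -> bool:
    """True if a tritonclient version falls in the >=2.42,<2.61 range
    that still provides macOS-compatible wheels."""
    major, minor = (int(part) for part in ver.split(".")[:2])
    return (2, 42) <= (major, minor) < (2, 61)

# Example usage against the installed package (uncomment in a real env):
# print(in_safe_range(version("tritonclient")))
```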
## Environment
- OS: macOS (Darwin)
- MLServer version: 1.7.1
- tritonclient version: 2.61.0
- Python: 3.11
Sharing in case someone else hits the same issue installing mlserver on a Mac.

Considering that mlserver's tritonclient dependency is already constrained because of issues installing triton on macOS:

```toml
# add a min version to tritonclient due to https://github.com/triton-inference-server/server/issues/6246
tritonclient = {version = ">=2.42", extras = ["http"]}
```

does it make sense to add the `>=2.42,<2.61` constraint?

If it sounds good to you, I can create a PR.
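If a blanket upper bound feels too restrictive for Linux users, one variant would be to scope the pin to macOS only. This is a hedged sketch using Poetry's multiple-constraints syntax (platform markers), not something I have tested against MLServer's actual pyproject:

```toml
tritonclient = [
    {version = ">=2.42", extras = ["http"], markers = "sys_platform == 'linux'"},
    {version = ">=2.42,<2.61", extras = ["http"], markers = "sys_platform == 'darwin'"},
]
```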
@yesid-lopez we have also experienced this. If you can find a fix, please do! We were installing mlserver 1.7.1 on Python 3.12 using pdm and hit the same error. For now, we are testing whether a Linux-based Docker image works for local development.
And I can confirm that installing the very same requirements works fine in Docker from a Mac host machine.
And here is the discussion on getting that requirement removed or made an optional extra in tritonclient: https://github.com/triton-inference-server/client/issues/856
And an open PR to remove it entirely: https://github.com/triton-inference-server/client/pull/857