az ml model package dependency conflict in service-principal-managed Docker build
Describe the bug
We are trying to package a model registered in our ML workspace and are hitting an error in the prepare_image job created by the service principal (managed on your end). The error occurs on step 4/4 of your Docker build. I have tried all sorts of variations of our base environment (image) but have now reduced it to the bare bones shown in the snippet below. We are using the parent image mcr.microsoft.com/azureml/openmpi5.0-ubuntu24.04, but we have tried several others with the same result.
I have also tried to explicitly pin the azureml-inference-server-http version in our conda.yml file, but that was unsuccessful: the version is set in your Dockerfile and apparently cannot be overridden by anything we specify in our file. Could you please look into this inference-server versioning? I believe you should be using a much more recent version, as this one is quite out of date (2022). The bigger problem is that I don't think Python 3.12 is supported by this version of azureml-inference-server-http; according to PyPI, only Python 3.10 is mentioned.
```yaml
channels:
  - conda-forge
dependencies:
  - python=3.12
  - pip=24.2
name: ubuntu_ae_model_base
```
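For completeness, the explicit pin I mentioned trying looked roughly like the fragment below (the version number is illustrative only; the pin had no effect because the generated Dockerfile installs its own `~=0.8.0` pin afterwards):

```yaml
channels:
  - conda-forge
dependencies:
  - python=3.12
  - pip=24.2
  - pip:
      # Illustrative pin; overridden by the pin in the generated Dockerfile
      - azureml-inference-server-http==1.0.0
name: ubuntu_ae_model_base
```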
Related command
```shell
az ml model package \
  --name $(model_name) \
  --version $(model_version) \
  --file "config.yml" \
  --resource-group "resource_group_name" \
  --workspace-name "workspace_name" \
  --debug
```
Errors
```
ERROR: process "/bin/sh -c pip install azureml-inference-server-http~=0.8.0" did not complete successfully: exit code: 1
```
Issue script & Debug output
```
2025-05-02T09:45:31: #7 [4/4] RUN pip install azureml-inference-server-http~=0.8.0
2025-05-02T09:45:31: #7 0.639 Collecting azureml-inference-server-http~=0.8.0
2025-05-02T09:45:32: #7 0.660 Downloading azureml_inference_server_http-0.8.4.2-py3-none-any.whl.metadata (12 kB)
2025-05-02T09:45:32: #7 0.695 Collecting flask<2.3.0 (from azureml-inference-server-http~=0.8.0)
2025-05-02T09:45:32: #7 0.698 Downloading Flask-2.2.5-py3-none-any.whl.metadata (3.9 kB)
2025-05-02T09:45:32: #7 0.725 Collecting flask-cors~=3.0.1 (from azureml-inference-server-http~=0.8.0)
2025-05-02T09:45:32: #7 0.728 Downloading Flask_Cors-3.0.10-py2.py3-none-any.whl.metadata (5.4 kB)
2025-05-02T09:45:32: #7 0.751 INFO: pip is looking at multiple versions of azureml-inference-server-http to determine which version is compatible with other requirements. This could take a while.
2025-05-02T09:45:32: #7 0.752 Collecting azureml-inference-server-http~=0.8.0
2025-05-02T09:45:32: #7 0.755 Downloading azureml_inference_server_http-0.8.4.1-py3-none-any.whl.metadata (12 kB)
2025-05-02T09:45:32: #7 0.777 Downloading azureml_inference_server_http-0.8.4-py3-none-any.whl.metadata (12 kB)
2025-05-02T09:45:32: #7 0.792 Downloading azureml_inference_server_http-0.8.3-py3-none-any.whl.metadata (12 kB)
2025-05-02T09:45:32: #7 0.805 Downloading azureml_inference_server_http-0.8.2-py3-none-any.whl.metadata (12 kB)
2025-05-02T09:45:32: #7 0.818 Downloading azureml_inference_server_http-0.8.1-py3-none-any.whl.metadata (11 kB)
2025-05-02T09:45:32: #7 0.830 Downloading azureml_inference_server_http-0.8.0-py3-none-any.whl.metadata (9.9 kB)
2025-05-02T09:45:32: #7 0.841 ERROR: Cannot install azureml-inference-server-http==0.8.0, azureml-inference-server-http==0.8.1, azureml-inference-server-http==0.8.2, azureml-inference-server-http==0.8.3, azureml-inference-server-http==0.8.4, azureml-inference-server-http==0.8.4.1 and azureml-inference-server-http==0.8.4.2 because these package versions have conflicting dependencies.
2025-05-02T09:45:32: #7 0.841
2025-05-02T09:45:32: #7 0.841 The conflict is caused by:
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.4.2 depends on inference-schema~=1.7.0
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.4.1 depends on inference-schema~=1.5.0
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.4 depends on inference-schema~=1.5.0
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.3 depends on inference-schema~=1.5.0
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.2 depends on inference-schema~=1.5.0
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.1 depends on inference-schema~=1.5.0
2025-05-02T09:45:32: #7 0.841     azureml-inference-server-http 0.8.0 depends on inference-schema~=1.5.0
2025-05-02T09:45:32: #7 0.841
2025-05-02T09:45:32: #7 0.841 To fix this you could try to:
2025-05-02T09:45:32: #7 0.841 1. loosen the range of package versions you've specified
2025-05-02T09:45:32: #7 0.841 2. remove package versions to allow pip to attempt to solve the dependency conflict
2025-05-02T09:45:32: #7 0.841
2025-05-02T09:45:32: #7 0.915 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
2025-05-02T09:45:32: #7 ERROR: process "/bin/sh -c pip install azureml-inference-server-http~=0.8.0" did not complete successfully: exit code: 1
```
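For what it's worth, the candidate set pip tries in the log follows directly from PEP 440 compatible-release semantics: `~=0.8.0` is shorthand for `>=0.8.0, ==0.8.*`, so only the 0.8.x line is ever considered and newer server releases are excluded up front. A minimal stand-alone sketch (the helper function is mine, simplified to release segments only; the 0.8.x versions are taken from the log, and the newer version numbers are shown purely for contrast):

```python
def satisfies_compatible_0_8_0(version: str) -> bool:
    """Simplified PEP 440 check: "~=0.8.0" means ">=0.8.0, ==0.8.*".

    Handles plain release segments only (no pre/post/dev suffixes).
    """
    parts = [int(p) for p in version.split(".")]
    return parts[:2] == [0, 8] and parts >= [0, 8, 0]

# The 0.8.x versions pip downloaded in the log, plus newer numbers for contrast
# (check PyPI for the actual current release list).
candidates = ["0.8.0", "0.8.1", "0.8.2", "0.8.3", "0.8.4",
              "0.8.4.1", "0.8.4.2", "0.9.4", "1.4.0"]
allowed = [v for v in candidates if satisfies_compatible_0_8_0(v)]
print(allowed)  # only the 0.8.x line survives; 0.9.4 and 1.4.0 are excluded
```

So even if a newer, Python 3.12-compatible server release exists, the generated `~=0.8.0` pin prevents pip from ever considering it.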
Expected behavior
This step should succeed, using a version of azureml-inference-server-http that is compatible with your other dependencies.
Environment Summary
```
azure-cli            2.69.0 *
core                 2.69.0 *
telemetry            1.1.0

Extensions:
azure-devops         1.0.1
ml                   2.36.5

Dependencies:
msal                 1.31.2b1
azure-mgmt-resource  23.1.1

Python location '/opt/az/bin/python3'
Config directory '/home/AzDevOps/.azure'
Extensions directory '/opt/az/azcliextensions'

Python (Linux) 3.12.8 (main, Feb 5 2025, 06:39:23) [GCC 11.4.0]
```
Additional context
No response
Thank you for opening this issue, we will look into it.
Thanks for the feedback! We are routing this to the appropriate team for follow-up. cc @azureml-github.
Any update on this? It seems the default docker-tools image you use for prepare_image has Python 3.10 hard-coded in its Dockerfile. Could that be what determines the library versions pinned in the Dockerfile generated for model packaging?
Do you have any updates on this? An option to specify the version of the inference server used in the package would be great.