Custom images launched in SageMaker Studio use Python installation not present in uploaded image
Expected Behavior
- I attach a custom docker container to my SageMaker Studio session (example Dockerfile)
- I launch a SageMaker Studio notebook using a Jupyter Kernel from the image
- The kernel launches the system Python installation at /usr/local/bin, referencing packages installed at /usr/local/lib/python3.8/site-packages
Observed Behavior
The SageMaker Studio notebook launches a kernel session using a Python installation at /opt/.sagemakerinternal/conda/bin/python, which is not present in the Docker container uploaded to ECR.
This happens with Notebook and Console kernel sessions launched from the image.
Attempted Debugging
I can confirm that the Notebook and Console sessions are launching the correct image because I can find my installed packages in /usr/local/lib/python3.8/site-packages from within the Notebook and the Console.
I can confirm this doesn't happen if I launch an Image Terminal from the Studio session. When I do this, the Docker container is launched as expected and executing python points to the expected installation in /usr/local/bin.
I think this has something to do with how Jupyter sessions are launched within SageMaker Studio. When I look at the logs on CloudWatch, I see this:

I have a few concerns:
- Why is SageMaker using conda here when my image doesn't have conda installed? Can I override this? I would have thought the main purpose of custom images was to be able to use a different version of Python than the standard SageMaker images. Why override this when importing custom images?
- What does it mean that this warning is getting raised:
+ echo This is not a DLC/Studio image (found Custom), so not adding Python3 kernel.
This should be a SageMaker image; I made it by copying the echo_kernel example Dockerfile from this repo. SageMaker has no issues launching the Notebook session from the console.
So I did find a workaround: include a custom kernelspec in the Docker container that points at the system Python installation at /usr/local/bin/python. The default kernelspec just invokes python by name, which resolves to the conda installation when the kernel is launched by SageMaker.
The new kernelspec looks like:
{
  "argv": [
    "/usr/local/bin/python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "display_name": "Python 3",
  "language": "python"
}
I wish this behavior was better documented in the SageMaker custom image tutorials.
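For reference, a minimal Dockerfile sketch of this approach (the base image and Python version are illustrative assumptions, not the exact image from this issue) just installs ipykernel for the system interpreter and copies a kernel.json like the one above into one of Jupyter's system-wide kernelspec directories:

```dockerfile
# Minimal sketch of the kernelspec workaround -- base image and versions are
# illustrative assumptions, not the exact image from this issue.
FROM python:3.8-slim

# ipykernel must be importable by the interpreter the kernelspec points at.
RUN /usr/local/bin/python -m pip install --no-cache-dir ipykernel

# Install the kernelspec shown above into a system-wide Jupyter kernel path,
# pinning argv[0] to /usr/local/bin/python so the launcher cannot fall back
# to another interpreter.
COPY kernel.json /usr/local/share/jupyter/kernels/python3/kernel.json
```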
I have this same issue, and I'm happy there's a workaround, but this is quite annoying. I'm not sure why it's necessary for SageMaker to install conda when I'm using a custom image? :thinking:
I'm noticing similar issues. For me the KernelGatewayApp picks the correct kernel (python3), but it also keeps complaining that conda is not found :). I added nb_conda_kernels to make it happy and the conda-related errors went away. It appears that the KernelGatewayApp keeps looking for conda even though the custom image doesn't need it. I was able to make it work by including conda for environment management and then adding my preferred package manager, poetry, to pull the custom packages from AWS CodeArtifact. Bringing in conda (unnecessary for my use case, I only need poetry) causes other issues, because I now have to make sure all the packages are managed inside the conda environment. To make it work in a Studio notebook, I had to add workarounds to ensure the conda environment is activated on the remote kernel :(
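A rough Dockerfile sketch of the setup described above (the base image, environment name, and versions are assumptions, and the CodeArtifact repository configuration is omitted) might look like:

```dockerfile
# Rough sketch of the conda + poetry arrangement described above -- base image,
# env name, and versions are assumptions; CodeArtifact configuration omitted.
FROM continuumio/miniconda3:latest

# nb_conda_kernels in the base env lets the kernel gateway discover kernels in
# other conda envs; the "studio" env carries ipykernel and the project packages.
RUN conda install -y -n base -c conda-forge nb_conda_kernels && \
    conda create -y -n studio -c conda-forge python=3.8 ipykernel

# poetry (run inside the conda env) manages the actual project dependencies.
WORKDIR /app
COPY pyproject.toml poetry.lock ./
RUN conda run -n studio pip install poetry && \
    conda run -n studio poetry install --no-root
```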
I am using the poetry image and am facing a similar issue: my image is not starting (it fails with an unhandled file error), and here are my kernel logs:
My python is installed in:
/root/.pyenv/shims/python
and my kernel is installed in
/root/.pyenv/versions/3.7.12/share/jupyter/kernels/python3
and this is my kernel.json:
{
  "argv": [
    "/root/.pyenv/versions/3.7.12/bin/python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "display_name": "Python 3",
  "language": "python"
}
Can someone tell me how you pointed to the system Python installation at /usr/local/bin/python3 and how you added the custom kernelspec? It would be great if you could share the Dockerfile.
did you solve this problem? I have trapped by it since 3 weeks ago. I will be very grateful if you share your solution with me.
> So I did find a workaround by including a custom kernelspec in the Docker container that specifies the system python installation at /usr/local/bin/python. [...]
Could you share the Dockerfile?