
HELP: Running on JetPack 6.2

Open turowicz opened this issue 4 months ago • 7 comments

My use case is to run TensorRT-compiled YOLOv8 models (ultralytics + onnx + onnxslim) on server GPUs (Ubuntu Server 24.04) and Jetson GPUs (JetPack 6.2.1). For the servers we simply use nvcr.io/nvidia/tritonserver:25.01-py3, but on Jetson it doesn't work: no GPU is detected. I have tried older tags all the way down to 24.01-py3, with and without the -igpu suffix. The GPU does work on Jetson with nvcr.io/nvidia/l4t-jetpack:r36.4.0, but the tritonserver tarball from the release notes doesn't run on it. I have also tried older versions with no luck; there always seems to be a compatibility problem.

Clearly we are doing exactly what Jetson is prescribed for: edge video processing with Triton as part of a k8s cluster, where the Jetsons are agent-role nodes exposing just the inference API. We've had no luck with Docker or containerd, with or without k8s. We are using k3s, but I don't think that is relevant at all.

Can someone please tell me how we are supposed to do this with JP 6.2.1?

Which versions of the tarball are meant for which JetPack?
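
For reference, the server-side setup amounts to something like this (a minimal sketch; the model repository path and port mappings are illustrative, not copied from our deployment):

  docker run --rm --gpus all \
    -p 8000:8000 -p 8001:8001 -p 8002:8002 \
    -v /path/to/model_repository:/models \
    nvcr.io/nvidia/tritonserver:25.01-py3 \
    tritonserver --model-repository=/models

On Jetson, an equivalent run is where no GPU ends up being detected.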

turowicz · Aug 28 '25

cc @deadeyegoodwin @eshcheglov @CoderHam. Similar to: https://github.com/triton-inference-server/server/issues/2361, https://github.com/triton-inference-server/server/issues/1468, https://github.com/triton-inference-server/server/issues/8183

turowicz · Aug 28 '25

JP 6.2.1 + nvcr.io/nvidia/l4t-jetpack:r36.4.0 + tritonserver2.52.0-igpu.tar crashes with:

tritonserver: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.36' not found (required by tritonserver)
tritonserver: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.38' not found (required by tritonserver)
tritonserver: /usr/lib/aarch64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.32' not found (required by tritonserver)
tritonserver: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.38' not found (required by /opt/tritonserver/lib/libtritonserver.so)
tritonserver: /usr/lib/aarch64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.32' not found (required by /opt/tritonserver/lib/libtritonserver.so)
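
The tarball appears to be built against a newer Ubuntu userspace than the Ubuntu 22.04 base of l4t-jetpack:r36.4.0. A quick way to confirm the mismatch from inside the container (a minimal check; strings comes from binutils, so install that if it's missing):

  # glibc shipped by the base image (Ubuntu 22.04 ships 2.35, below the required 2.36/2.38)
  ldd --version | head -n 1
  # newest GLIBCXX symbol version exported by the image's libstdc++ (22.04 tops out below 3.4.32)
  strings /usr/lib/aarch64-linux-gnu/libstdc++.so.6 | grep GLIBCXX | sort -V | tail -n 1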

turowicz · Aug 28 '25

Here's what ChatGPT 5.0 thinks:

[screenshot of ChatGPT's answer, not reproduced here]

turowicz · Aug 28 '25

OK, it seems like the last precompiled Triton release for JetPack was 23.06, which is two years behind current.

It looks like NVIDIA has abandoned any and all support of developers who are deploying their devices to clients for edge AI processing.

turowicz · Sep 03 '25

It looks like NVIDIA has abandoned any and all support of developers who are deploying their devices to clients for edge AI processing.

Yep.

frieddeeu · Oct 02 '25

I have the same driver problem. It happened when I upgraded to JetPack 6.2.1 on a Jetson AGX Orin.

NVIDIA Release 24.08 (build 107631420) Triton Server Version 2.49.0

ERROR: This container was built for NVIDIA Driver Release 560.35 or later, but version 540.4.0 was detected and compatibility mode is UNAVAILABLE.
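
JetPack 6.2.x ships a 540.x driver on the Orin, while that 24.08 image expects 560.35 or later, hence the message. A quick way to check what is actually installed on the device (a hedged sketch; file and package names may differ across JetPack releases):

  # L4T release string, which maps to a specific JetPack version
  cat /etc/nv_tegra_release
  # installed L4T core package version
  dpkg -l nvidia-l4t-core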


Have you solved the problem?

sdurmustalipoglu1 · Nov 23 '25

The solution was to use the following versions:

Triton as an API:

  • amd64 (server): FROM nvcr.io/nvidia/tritonserver:25.01-py3
  • arm64 (Jetson): FROM nvcr.io/nvidia/tritonserver:25.01-py3-igpu

Model Converter as an InitContainer (a conversion sketch follows the list):

  • amd64 (server): FROM nvcr.io/nvidia/tensorrt:25.01-py3
  • arm64 (Jetson): FROM nvcr.io/nvidia/tensorrt:25.01-py3-igpu
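
For context, the conversion step the InitContainer runs boils down to a trtexec call along these lines (a sketch; the model name and Triton repository layout are illustrative, and trtexec ships with the TensorRT container, typically under /usr/src/tensorrt/bin):

  trtexec --onnx=/models/yolov8.onnx \
          --saveEngine=/models/yolov8/1/model.plan \
          --fp16

Running the conversion as an InitContainer on each node matters because TensorRT engines are not portable across GPU architectures, so the amd64 servers and the Jetsons each build their own engine.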

Inside the Dockerfiles, make sure to use the matching Python package index for the GPU Python libraries:

For 25.01 it's https://pypi.jetson-ai-lab.io/jp6/cu128
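
Concretely, the Jetson-side Dockerfile ends up looking something like this (a sketch; the package shown is illustrative, only the index URL comes from the comment above):

  FROM nvcr.io/nvidia/tritonserver:25.01-py3-igpu
  # pull CUDA-enabled wheels built for JetPack 6 / CUDA 12.8 instead of the generic PyPI ones
  RUN pip install --extra-index-url https://pypi.jetson-ai-lab.io/jp6/cu128 \
      onnxruntime-gpu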

turowicz · Nov 24 '25