ImportError: /usr/lib/aarch64-linux-gnu/libcublas.so.10: file too short
JetPack version: 4.5.2 (L4T R32.5.2), Board: Jetson NX, Kernel: 4.9.201-tegra
Errors:
nvidia@nvidia-desktop:~$ docker run --gpus all -it dustynv/ros:foxy-ros-base-l4t-r32.5.0
sourcing /opt/ros/foxy/install/setup.bash
ROS_ROOT /opt/ros/foxy
ROS_DISTRO foxy
root@xxxx:/# python3 -c "import cv2"
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/local/lib/python3.6/dist-packages/cv2/__init__.py", line 96, in <module>
bootstrap()
File "/usr/local/lib/python3.6/dist-packages/cv2/__init__.py", line 86, in bootstrap
import cv2
ImportError: /usr/lib/aarch64-linux-gnu/libcublas.so.10: file too short
root@xxxx:/#
I pulled the dustynv/ros:foxy-ros-base-l4t-r32.5.0 image and tried to manually check whether it was installed correctly. The OpenCV library seems to be installed incorrectly, or perhaps the package wasn't linked? Any help would be appreciated.
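(In case it helps: my guess is that "file too short" means the .so file exists inside the container but is empty or truncated, rather than missing entirely. I assume a quick check like this inside the container would confirm that:
root@xxxx:/# ls -l /usr/lib/aarch64-linux-gnu/libcublas.so.10*
If that shows a 0-byte file, then presumably the real host library was never mounted over it.)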
Also, I tried to build an image locally using the provided script, and a similar CUDA-related error occurred when it tried to test the OpenCV (GPU-enabled) build.
nvidia@nvidia-desktop:~/jetson-containers$ ./scripts/docker_test_ros.sh foxy
reading L4T version from /etc/nv_tegra_release
L4T BSP Version: L4T R32.5.2
l4t-base image: nvcr.io/nvidia/l4t-base:r32.5.0
testing container ros:foxy-ros-base-l4t-r32.5.2 => ros_version
[sudo] password for nvidia:
localuser:root being added to access control list
xhost: must be on local machine to add or remove hosts.
xauth: file /tmp/.docker.xauth does not exist
sourcing /opt/ros/foxy/install/setup.bash
ROS_ROOT /opt/ros/foxy
ROS_DISTRO foxy
getting ROS version -
foxy
done testing container ros:foxy-ros-base-l4t-r32.5.2 => ros_version
testing container ros:foxy-ros-base-l4t-r32.5.2 => OpenCV
localuser:root being added to access control list
xhost: must be on local machine to add or remove hosts.
sourcing /opt/ros/foxy/install/setup.bash
ROS_ROOT /opt/ros/foxy
ROS_DISTRO foxy
testing OpenCV...
Traceback (most recent call last):
File "test/test_opencv.py", line 4, in <module>
import cv2
File "/usr/local/lib/python3.6/dist-packages/cv2/__init__.py", line 96, in <module>
bootstrap()
File "/usr/local/lib/python3.6/dist-packages/cv2/__init__.py", line 86, in bootstrap
import cv2
ImportError: libcublas.so.10: cannot open shared object file: No such file or directory
Hi @Rainerino, do you have JetPack/CUDA/cuDNN/etc. installed on your Jetson? How did you install Docker and the NVIDIA Container Runtime? It should have come with JetPack.
Do you have these files?
$ ls /etc/nvidia-container-runtime/host-files-for-container.d/
cuda.csv cudnn.csv l4t.csv opencv.csv tensorrt.csv visionworks.csv
On Jetson, CUDA/etc. currently gets mounted into the containers at runtime. It appears you are either missing libcublas on your Jetson, or the CSV mounting files are missing.
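For example, something along these lines on the host should show whether libcublas is present and whether any of the CSV mount lists reference it (paths here assume JetPack 4.x / CUDA 10.2, adjust as needed):
$ ls -l /usr/local/cuda-10.2/targets/aarch64-linux/lib/libcublas.so.10*
$ grep -ril cublas /etc/nvidia-container-runtime/host-files-for-container.d/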
@dusty-nv Thank you for your timely reply :)
(I am really new to the Jetson.) I was given the Jetson in the lab, and I believe it was preinstalled from the SDK Manager. There is only nvcc for some reason, but no nvidia-smi. Docker came with the installation, and I installed nvidia-docker according to this guide: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html (because I wasn't sure whether it was already installed; maybe it was). There is CUDA 10.2, but no cuDNN.
nvidia@nvidia-desktop:~$ nvcc -V
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2019 NVIDIA Corporation
Built on Wed_Oct_23_21:14:42_PDT_2019
Cuda compilation tools, release 10.2, V10.2.89
nvidia@nvidia-desktop:~$ ls /etc/nvidia-container-runtime/host-files-for-container.d/
l4t.csv
nvidia@nvidia-desktop:~$ ls /usr/local/cuda
cuda/ cuda-10.2/
So I guess the Jetson installation was faulty somehow and a lot of stuff is missing. I will try to reinstall everything now and get back to you on a fresh installation. I assume that with a correct installation, cuda.csv, cudnn.csv, l4t.csv, opencv.csv, tensorrt.csv, and visionworks.csv will be in said location?
> There is only nvcc for some reason, but no nvidia-smi
Hi @Rainerino, it's normal for there to be no nvidia-smi, as nvidia-smi isn't supported on Jetson. But yeah, you want to get all the JetPack components working on your device so you can use the container properly.
You can try doing a sudo apt-get install nvidia-jetpack and see if that helps things. If not, then I would just re-flash it with the latest JetPack and get everything installed in working order.
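Roughly, the sequence would look something like this (nvidia-jetpack is the standard JetPack meta-package; the exact set of CSV files can vary by JetPack release):
$ sudo apt-get update
$ sudo apt-get install nvidia-jetpack
$ ls /etc/nvidia-container-runtime/host-files-for-container.d/
cuda.csv  cudnn.csv  l4t.csv  opencv.csv  tensorrt.csv  visionworks.csv
If you still only see l4t.csv after that, re-flashing with SDK Manager is the more reliable route.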
Hi, OpenCV is already installed on my device, but when I build the container, it throws this error:
arun@arun:~/Documents/jetson-inference$ jetson_release
- NVIDIA Jetson Xavier NX (Developer Kit Version)
* Jetpack 4.6 [L4T 32.6.1]
* NV Power Mode: MODE_20W_6CORE - Type: 8
* jetson_stats.service: active
- Libraries:
* CUDA: 10.2.300
* cuDNN: 8.2.1.32
* TensorRT: 8.0.1.6
* Visionworks: 1.6.0.501
* OpenCV: 4.6.0 compiled CUDA: YES
* VPI: ii libnvvpi1 1.1.15 arm64 NVIDIA Vision Programming Interface library
* Vulkan: 1.2.70
arun@arun:~/Documents/jetson-inference$
+ echo 'testing cv2 module under python...'
testing cv2 module under python...
+ python3 -c 'import cv2; print('\''OpenCV version:'\'', str(cv2.__version__)); print(cv2.getBuildInformation())'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/local/lib/python3.6/dist-packages/cv2/__init__.py", line 96, in <module>
bootstrap()
File "/usr/local/lib/python3.6/dist-packages/cv2/__init__.py", line 86, in bootstrap
import cv2
ImportError: libcublas.so.10: cannot open shared object file: No such file or directory
When I run locate on the CUDA .so files, I am able to find them on the device:
arun@arun:~/Documents/jetson-inference$ locate libcublas.so.10
/usr/local/cuda-10.2/targets/aarch64-linux/lib/libcublas.so.10
/usr/local/cuda-10.2/targets/aarch64-linux/lib/libcublas.so.10.2.3.300
arun@arun:~/Documents/jetson-inference$
Could you please help with how the Docker image should be built against the system OpenCV?
Thanks. Kind regards, Arun
How are you running the container? You need to use --runtime nvidia if you aren't already (the scripts/docker_run.sh script from this repo already does that).
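For example, the run command from the top of this thread would become something like this (same image tag as above; the key change is --runtime nvidia in place of --gpus all):
$ docker run --runtime nvidia -it dustynv/ros:foxy-ros-base-l4t-r32.5.0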
Sorry, I destroyed the setup. I need time to replicate it; please close this for now.