balena-intel
Release nvidia GPU drivers as a hostapp extension
Customers with Nvidia GPUs are having trouble building the drivers as external modules. We should prepare a hostapp extension with not only the drivers but also the libraries required to use the GPU for CUDA, etc.
This depends on hostapp extension support being completed.
Just wanted to keep this issue tracked.
Any idea how long it would take to fix this? :+1:
hi @openedhardware, we are currently working on the same support for the Nvidia Jetson device types. Once that is done, we will start working on this.
Hi @alexgg, any update here?
hi @openedhardware, unfortunately the scope of this work keeps growing and we don't have a clear ETA at the moment.
[pdcastro] This issue has attached support thread https://jel.ly.fish/251a8861-56c8-4cc0-9a9c-e54babb01861
Hello @alexgg, is there any update on this?
I am pulling FROM nvcr.io/nvidia/l4t-pytorch:r32.5.0-pth1.7-py3
in my Dockerfile to use the GPU-based PyTorch required by the model I need to run on an Nvidia Jetson Nano SD. However, I cannot run my model via balena, although the model runs without issue directly on the device with its own OS. This is the error log the Balena Dashboard shows for my device:
Traceback (most recent call last):
  File "/app/api.py", line 6, in <module>
    import torch
  File "/usr/local/lib/python3.6/dist-packages/torch/__init__.py", line 189, in <module>
    _load_global_deps()
  File "/usr/local/lib/python3.6/dist-packages/torch/__init__.py", line 142, in _load_global_deps
    ctypes.CDLL(lib_path, mode=ctypes.RTLD_GLOBAL)
  File "/usr/lib/python3.6/ctypes/__init__.py", line 348, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: libcurand.so.10: cannot open shared object file: No such file or directory
Service exited 'sha256:f1efbdcc9730f1324e3b56834ed02db0e1e52f91e9d99eda26671d3d3d606ccd'
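For reference, a quick way to see which CUDA runtime libraries are actually resolvable inside the container is to try loading them with ctypes before importing torch, the same way torch does in the traceback above. This is just a diagnostic sketch; the library names below are my assumption of the usual CUDA 10 / cuDNN runtime sonames on the l4t images, so adjust them to your base image:

```python
import ctypes

# Common CUDA 10 / cuDNN runtime library sonames expected on the
# l4t-pytorch base image (an assumption; adjust for your image).
CUDA_LIBS = [
    "libcudart.so.10.2",
    "libcublas.so.10",
    "libcurand.so.10",   # the library the traceback above fails to load
    "libcudnn.so.8",
]

for name in CUDA_LIBS:
    try:
        # Same call torch makes internally when loading its global deps
        ctypes.CDLL(name, mode=ctypes.RTLD_GLOBAL)
        print("OK      ", name)
    except OSError as exc:
        print("MISSING ", name, "-", exc)
```

If these show up as missing, that would line up with the CUDA user-space libraries not being available under balenaOS, which as far as I understand is exactly what the hostapp extension proposed above is meant to address.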
I guess this issue is related to the one reported here. What do you guys think?
I too would like to know whether any work has gone into this lately. It would be great to have!