
Install CUDA, PyTorch or any DNN Library on the duckiebot itself?

Open bishoyroufael opened this issue 2 years ago • 2 comments

I have a DB21M assembled and I'm confused about how to run neural networks on the bot itself. I tried SSHing into the bot but didn't find any NVIDIA drivers or CUDA installed. Is there an easy way to do that? I can't find anything useful about this in the docs.

According to the NVIDIA docs, there should be a way to install JetPack and run some of the example neural networks presented here. I want to use that together with ROS on the Duckiebot.

bishoyroufael avatar Apr 24 '22 02:04 bishoyroufael

Can this please be explained, @tanij? This would help a lot.

FelixMildon avatar May 18 '22 19:05 FelixMildon

After a lot of research and pain, I was able to make something work here as part of my thesis project. Feel free to use the Dockerfile in your own project.

It uses dustynv/jetson-inference as the base image, which ships with CUDA, PyTorch and several pretrained DNN models ready to use straight away. ROS Melodic is installed on top of that, so you can point ROS_MASTER_URI at the Duckiebot's IP for communication to work.
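
A minimal sketch of what such a Dockerfile can look like (this is not the actual Dockerfile from the repo linked above; the image tag, the ROS Melodic installation steps, and the IP/hostname values are placeholders you have to adapt to your own JetPack release and network):

```Dockerfile
# Sketch only, not the full thesis Dockerfile.
# The tag is an example; it must match the L4T/JetPack release on your Jetson
# (check it on the bot with `cat /etc/nv_tegra_release`).
FROM dustynv/jetson-inference:r32.7.1

# In the real setup, ROS Melodic is installed on top of this base image
# (installation steps omitted here). Point ROS at the roscore running on
# the Duckiebot; both values below are placeholders for your own network.
ENV ROS_MASTER_URI=http://duckiebot.local:11311
ENV ROS_IP=192.168.1.50

# Your own nodes and scripts go on top of the base image.
COPY . /workspace
WORKDIR /workspace
```

Build it on the bot (or as an ARM image) and run it with the NVIDIA container runtime, e.g. `docker run --runtime nvidia --network host <image>`, so the GPU and the ROS network are reachable from inside the container.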

If you just want PyTorch, you can instead derive from l4t-pytorch. More details about that here.
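
A correspondingly minimal sketch for the PyTorch-only case (again, the tag is only an example and has to match your JetPack release):

```Dockerfile
# PyTorch-only sketch; pick the l4t-pytorch tag that matches your L4T release.
FROM nvcr.io/nvidia/l4t-pytorch:r32.7.1-pth1.10-py3

# Sanity check that CUDA is visible inside the container.
# Run with `docker run --runtime nvidia <image>` so the GPU is exposed.
CMD ["python3", "-c", "import torch; print('CUDA available:', torch.cuda.is_available())"]
```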

bishoyroufael avatar May 18 '22 20:05 bishoyroufael