Question on GPU
Hello! First of all, thank you for this great project! I would like to know: does TensorFlow use the GPU on the Raspberry Pi? Thank you!
Thanks for the question! Unfortunately, TensorFlow isn't currently compatible with the GPU on the Raspberry Pi, as TensorFlow's GPU support only covers NVIDIA CUDA graphics cards.
One day (hopefully), a group of people will find a reasonable way to map OpenCL/CUDA calls onto the RPi GPU!
In the meantime, if you'd like to create custom deep learning code for the Raspberry Pi's GPU, a good place to get started is this post by Pete Warden.
Presumably XLA is part of the answer.
https://twitter.com/danbri/status/839431030201266177 -> https://developers.googleblog.com/2017/03/xla-tensorflow-compiled.html
XLA (Accelerated Linear Algebra) is a compiler for TensorFlow. XLA uses JIT compilation techniques to analyze the TensorFlow graph created by the user at runtime, specializes it for the actual runtime dimensions and types, fuses multiple ops together, and emits efficient native machine code for them, targeting devices like CPUs, GPUs, and custom accelerators (e.g. Google's TPU).
-> https://www.tensorflow.org/performance/xla/
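To make "fuse multiple ops together" concrete, here's a toy sketch of the idea (a hypothetical plain-Python illustration of op fusion as a concept, not XLA's actual implementation): the unfused version materializes an intermediate buffer for each op, while the fused version does the whole computation in a single pass.

```python
def unfused(x, y, z):
    # Graph (x * y) + z executed op by op: the mul result is
    # written out to a full intermediate buffer before the add runs.
    t = [a * b for a, b in zip(x, y)]      # intermediate buffer
    return [a + b for a, b in zip(t, z)]   # second pass over memory

def fused(x, y, z):
    # The same graph after fusion: one traversal, one output buffer,
    # no intermediate allocation -- this is the win XLA is after.
    return [a * b + c for a, b, c in zip(x, y, z)]

x, y, z = [1.0, 2.0], [3.0, 4.0], [5.0, 6.0]
assert unfused(x, y, z) == fused(x, y, z)  # [8.0, 14.0]
```

On a real accelerator the fused kernel also avoids extra round trips to memory, which is usually the dominant cost.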
That's a pretty fun idea! I can imagine it being a good exercise, and useful if it's implemented correctly. For me personally, it'll have to be a back-burner idea, but I'll leave some links that might be useful moving forward with that idea:
- Mirror of the official Videocore IV architecture reference guide
- Unofficial Broadcom Videocore IV documentation and samples
- More unofficial samples and documentation
- Implementation of GEMM matrix multiplication on the RPi
- Assembler/disassembler for the RPi QPU
- First of several blog posts by Pete Warden on deep learning on the RPi
- Second (and lengthier) post by Pete Warden
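For anyone new to the GEMM link above: GEMM is just dense matrix multiplication, C = A × B, which is the core operation those QPU kernels accelerate. A minimal reference version in plain Python (a sketch of the math only, not the QPU assembly implementation) looks like this:

```python
def gemm(A, B):
    """Naive reference GEMM: C[i][j] = sum_p A[i][p] * B[p][j]."""
    n, k = len(A), len(B)
    m = len(B[0])
    C = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += A[i][p] * B[p][j]
            C[i][j] = s
    return C

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
assert gemm(A, B) == [[19.0, 22.0], [43.0, 50.0]]
```

The QPU implementations get their speedup by tiling this triple loop and running the inner products across the GPU's 16-way SIMD lanes; the arithmetic is the same.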
I'll reopen this thread to increase discoverability.
I hate to see NVIDIA having a monopoly on TF GPU support - I'd like to explore helping break that - any links or resources are appreciated.
@samjabrahams py-videocore is also an interesting RPi GPGPU library supporting Python.