raspberry-pi-pcie-devices
Add Pineboards Hat AI! Dual Edge Coral TPU Bundle
Pineboards offers a Hat AI! Dual Edge Coral TPU Bundle for the Raspberry Pi 5, which connects a Dual Edge Coral TPU (for AI/ML inference) to the Pi 5 through a PCIe switch, so each of the two TPUs gets its own PCIe lane.
Most other E-key PCIe HATs only support one PCIe lane, so if you installed a Dual Edge TPU, you would only have access to one of the two TPUs.
Pineboards also includes a Dual Edge TPU with this bundle, so you don't need to source your own from another vendor.
See existing issue about Coral M.2 Dual Edge TPU support on Pi: https://github.com/geerlingguy/raspberry-pi-pcie-devices/issues/318
I'm getting one of these in the next few days. I'm excited to test if it's possible to host a local AI server using something like CodeProject.AI on a Pi5. I have a Blue Iris server running on my network and having a central location to send AI requests to (instead of having to use a power hungry GPU) would be fantastic. That paired with using one of the Wave Share POE hats would make it a super easy and compact solution....if it works ;)
I've been running this hat alongside a Wave Share POE hat for a few days now and it's been working without issue. I have a Blue Iris server running on a Win11 machine sending AI requests over network to it.
Previously I was running GPU AI detection, which takes a ton of power on an RTX 3060. I tried to move to the Coral dual edge TPU using a PCIe adapter ( https://www.makerfabs.com/dual-edge-tpu-adapter.html ) but Windows was never fully happy with it and it often crashed, reverting back to CPU/GPU AI detection and again using a ton of power.
Here's a quick setup guide if anyone wants to try to set up CodeProject.AI with this hat and a Coral Dual Edge TPU.
Note: This guide is for the rpi64 version, which is mostly limited to just the Coral TPU. If you would like to use more modules or have more customization, simply replace "rpi64" with "arm64" in the docker download/start lines.
Much of this guide is condensed from Pineboards' and CodeProject.AI's instructions. (And thanks to Jeff for the PCIe Gen 3.0 code ;) )
Start with a fresh install of PiOS Lite 64-bit and connect using your SSH program of choice.
First boot update:
sudo apt update && sudo apt upgrade -y
Update the kernel to the latest version:
sudo rpi-update
Install docker:
sudo curl -sSL https://get.docker.com | sh
Add your user to the docker group (log out and back in afterwards so the group change takes effect):
sudo usermod -aG docker $USER
Open the Pi's config file in Nano:
sudo nano /boot/firmware/config.txt
Add the following lines to the bottom of the file:
# Enable the external PCIe connector
dtparam=pciex1
kernel=kernel8.img
# Enable the Pineboards Hat AI
dtoverlay=pineboards-hat-ai
# Upgrade the link to PCIe Gen 3.0
dtparam=pciex1_gen=3
Save and close the file by pressing CTRL+X, Y to confirm, and Enter to exit.
Reboot the Pi:
sudo reboot
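To confirm the switch and both TPUs came up after the reboot, you can check the PCIe bus. The Coral Edge TPU enumerates under PCI vendor ID 1ac1 (Global Unichip Corp.); the exact bus addresses will vary per system:

```shell
# List PCIe devices and filter for the Coral Edge TPU (vendor ID 1ac1).
# With the Dual Edge TPU behind the Pineboards PCIe switch you should
# see two matching entries, one per TPU.
lspci -nn | grep -i '1ac1' || echo "No Edge TPU found on the PCIe bus"
```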
Install rpi-source, then use it to fetch the kernel headers:
sudo apt install git bc bison flex libssl-dev make libncurses5-dev && sudo wget https://raw.githubusercontent.com/jgartrel/rpi-source/master/rpi-source -O /usr/bin/rpi-source && sudo chmod +x /usr/bin/rpi-source && rpi-source --tag-update && rpi-source --default-config
Add the Google Coral Edge TPU package repository and import the GPG key:
echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list && curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -
Update your package list:
sudo apt-get update
Install the necessary packages:
sudo apt-get install cmake libedgetpu1-std devscripts debhelper dkms dh-dkms
Clone the Gasket Driver repo:
git clone https://github.com/google/gasket-driver.git
Change into the directory and build the driver:
cd gasket-driver && sudo debuild -us -uc -tc -b
Go back to the parent directory and install the built package:
cd .. && sudo dpkg -i gasket-dkms_1.0-18_all.deb
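If you want to sanity-check the driver install before continuing, dkms can tell you whether the gasket module registered and built against the running kernel (the exact version string in the output will differ on your system):

```shell
# The gasket driver should show as installed for the current kernel.
dkms status | grep gasket || echo "gasket not registered with dkms"
```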
Add a udev rule to manage device permissions:
sudo sh -c "echo 'SUBSYSTEM==\"apex\", MODE=\"0660\", GROUP=\"apex\"' >> /etc/udev/rules.d/65-apex.rules"
Create a new group and add your user to it:
sudo groupadd apex && sudo adduser $USER apex
Reboot your Pi:
sudo reboot
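After this reboot, the driver should have created one character device per TPU. A small helper like the sketch below (check_tpus is just an illustrative name, not part of any tool) counts them; with a Dual Edge TPU on this hat you should see two, /dev/apex_0 and /dev/apex_1:

```shell
# check_tpus DIR: count apex_* device nodes under DIR (normally /dev).
check_tpus() {
    ls "$1"/apex_* 2>/dev/null | wc -l
}

# On a working setup this prints "Edge TPUs visible: 2".
echo "Edge TPUs visible: $(check_tpus /dev)"
```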
Download the latest rpi64 version of the CodeProject.AI Server:
docker pull codeproject/ai-server:rpi64
Start the Docker container and set it to run on boot:
docker run --restart=always --name CodeProject.AI -d -p 32168:32168 \
--privileged -v /dev/bus/usb:/dev/bus/usb codeproject/ai-server:rpi64
Open the web interface of the newly set up CodeProject.AI server (replace piaddress with your Pi's IP address or hostname):
http://piaddress:32168
The dashboard should now show "Started Multi-TPU (TF-Lite)" and you can send AI requests.
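If you'd rather test from the command line than the dashboard, you can post an image to the server's object detection endpoint. This is a sketch assuming the standard CodeProject.AI REST route /v1/vision/detection with an image form field; IMG and SERVER are placeholders to replace with your own image path and Pi address:

```shell
# Send a test image to the object detection endpoint and print the JSON
# response. IMG and SERVER are placeholders for your own image and Pi.
IMG="test.jpg"
SERVER="http://piaddress:32168"

if [ -f "$IMG" ]; then
    curl -s -X POST -F "image=@$IMG" "$SERVER/v1/vision/detection"
else
    echo "usage: put a test image at $IMG first"
fi
```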
@MidnightLink awesome work! That's great to see it's been validated on the Pi 5. I have also broken mine out earlier today, and it was able to recognize both TPUs, so I think we can mark this as fixed/working, and if anyone has issues or further questions, feel free to add them here!
@MidnightLink - I just ran through your instructions on a fresh Pi OS 12 install, and it worked like a charm, thanks!
Follow-up question: Any idea if the CodeProject.AI Server will be adding support for the Hailo-8 / Hailo-8L?
@geerlingguy Funny enough I actually just posted in their forum asking the same exact question after seeing your latest video :) They do support adding third party modules already, but I'm hoping that they'll be able to get something integrated natively soon since I also just ordered one of the new Hailo Pi kits. There's a pretty detailed write up listed in their docs on how to do so ( https://www.codeproject.com/ai/docs/devguide/module_examples/adding_new_modules.html ) but I haven't really had the need to add anything as of yet
Running a Dual Edge TPU and one USB TPU with the Pi 5 and 9 camera streams. However, no hardware acceleration is working due to deprecated support. Has anyone had any success with hardware acceleration?
When I install the Pineboards hat this way, I can't use the ALPR module anymore. I get these errors:
21:24:53: Started License Plate Reader module
21:24:53: ALPR_adapter.py: Traceback (most recent call last):
21:24:53: ALPR_adapter.py:   File "/app/modules/ALPR/ALPR_adapter.py", line 11, in <module>
21:24:53: ALPR_adapter.py:     from ALPR import init_detect_platenumber, detect_platenumber
21:24:53: ALPR_adapter.py:   File "/app/modules/ALPR/ALPR.py", line 17, in <module>
21:24:53: ALPR_adapter.py:     from paddleocr import PaddleOCR
21:24:53: ALPR_adapter.py:   File "/app/modules/ALPR/bin/linux/python38/venv/lib/python3.8/site-packages/paddleocr/__init__.py", line 14, in <module>
21:24:53: ALPR_adapter.py:     from .paddleocr import *
21:24:53: ALPR_adapter.py:   File "/app/modules/ALPR/bin/linux/python38/venv/lib/python3.8/site-packages/paddleocr/paddleocr.py", line 21, in <module>
21:24:53: ALPR_adapter.py:     import paddle
21:24:53: ALPR_adapter.py: ModuleNotFoundError: No module named 'paddle'
Does anyone have similar problems?
Thanks for all the advice, @geerlingguy & @MidnightLink! I was able to get my Coral TPU running just fine! Although I'm not so sure about API support, and I'm a bit disappointed that I couldn't replicate this in my own code, I couldn't have done it without your instructions. Thanks a lot!
Are you using the one for the Rockchip NPUs? If so, I'm not sure it will work!
I think I found your issue! https://github.com/codeproject/CodeProject.AI-ALPR/issues/3
I updated to CodeProject 2.9.7 and now I get these errors:
20:42:03:CPAI_CORAL_MULTI_TPU = true
20:42:03:MODELS_DIR =
The system starts and shows Multi-TPU, but as soon as an object is detected it switches back to CPU. What can I do to fix it?
I fixed it. The problem was version 2.9.5. I removed everything and started with a clean installation. I'm now using version 2.6.5 and everything is fine.
