coral-pi-rest-server
Perform inference with TensorFlow Lite models on a Raspberry Pi, with acceleration from a Coral USB stick
As title. References:
- https://www.tensorflow.org/tfx/serving/api_rest
- https://cloud.google.com/community/tutorials/standalone-tensorflow-raspberry-pi
Use `if flask.request.method != "POST" or not flask.request.files.get("image"):`, i.e. negate the condition and return early. This saves a level of indentation. The same applies to `if not predictions`. See https://refactoring.com/catalog/replaceNestedConditionalWithGuardClauses.html
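A minimal sketch of the suggested guard-clause refactor, assuming the handler roughly follows the repo's Flask route; the route path, response fields, and `run_inference` helper are illustrative stand-ins, not the project's exact code:

```python
import flask

app = flask.Flask(__name__)

def run_inference(image_bytes):
    """Hypothetical stand-in for the actual Edge TPU inference call."""
    return []

@app.route("/v1/vision/detection", methods=["POST"])  # assumed route
def predict():
    data = {"success": False}

    # Guard clause: bail out early instead of nesting the happy path.
    if flask.request.method != "POST" or not flask.request.files.get("image"):
        return flask.jsonify(data)

    predictions = run_inference(flask.request.files["image"].read())

    # Same pattern for the empty-result case.
    if not predictions:
        return flask.jsonify(data)

    data["predictions"] = predictions
    data["success"] = True
    return flask.jsonify(data)
```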
This seems to be the most popular request, but it requires some preprocessing of the images (see the sketch after the references). **References**
- https://thedatamage.com/face-recognition-tensorflow-tutorial/
- https://machinelearningmastery.com/how-to-develop-a-face-recognition-system-using-facenet-in-keras-and-an-svm-classifier/
- https://coral.withgoogle.com/examples/detect-image/#perform-face-detection
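A rough sketch of the kind of preprocessing the face-recognition references describe: crop the detected face, resize it to the model's input size, and standardise the pixels. The 160x160 input size and per-image standardisation are assumptions taken from the linked Keras FaceNet tutorial, not something this repo implements:

```python
import numpy as np
from PIL import Image

def preprocess_face(image_path, box, size=(160, 160)):
    """Crop a face given a (left, top, right, bottom) box and normalise it."""
    image = Image.open(image_path).convert("RGB")
    face = image.crop(box).resize(size)
    pixels = np.asarray(face, dtype="float32")
    # Per-image standardisation, as used in the FaceNet tutorial.
    pixels = (pixels - pixels.mean()) / pixels.std()
    return np.expand_dims(pixels, axis=0)  # add batch dimension
```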
I tried posting a completely blank (black) image to see if this would crash the server, but it returned the predictions from a previous image (containing people).
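A small script to reproduce this report: generate an all-black JPEG and POST it to the server. The host, port, and endpoint path are assumptions; adjust them to however your instance is configured:

```python
import io

import requests
from PIL import Image

# Create a solid black 640x480 JPEG in memory.
buf = io.BytesIO()
Image.new("RGB", (640, 480)).save(buf, format="JPEG")
buf.seek(0)

response = requests.post(
    "http://localhost:5000/v1/vision/detection",  # assumed URL and route
    files={"image": buf},
)
# Expectation: an empty prediction list, not results from a previously posted image.
print(response.json())
```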
Can we support Coral on a Mac via Docker? It seems we should be able to, but I'm not sure how we would mount the USB device.
Hi, thanks for your work; it works great with the [Home Assistant addon](https://github.com/grinco/HASS-coral-rest-api) by u/grinco, serving Blue Iris running in a different VM. I have a suggestion, though: can you also...
So I have been testing this out to see if I can get it to work for my security cameras. I took one of my alert images with a car...
https://docs.ultralytics.com/guides/coral-edge-tpu-on-raspberry-pi/#what-should-i-do-if-tensorflow-is-already-installed-on-my-raspberry-pi-but-i-want-to-use-tflite-runtime-instead
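A hedged sketch of the pattern the linked Ultralytics guide is about: prefer the lightweight `tflite_runtime` interpreter when it is installed, and fall back to the interpreter bundled with full TensorFlow otherwise. The model path and delegate library name are examples (the Edge TPU delegate is `libedgetpu.so.1` on Linux; other platforms differ):

```python
try:
    # Lightweight runtime, recommended on a Raspberry Pi.
    from tflite_runtime.interpreter import Interpreter, load_delegate
except ImportError:
    # Fall back to the interpreter shipped with full TensorFlow.
    import tensorflow as tf
    Interpreter = tf.lite.Interpreter
    load_delegate = tf.lite.experimental.load_delegate

# Load an Edge TPU compiled model via the Coral delegate (paths are examples).
interpreter = Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()
```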