gst-inference
A GStreamer Deep Learning Inference Framework
add another caps (width, height) to mobilenetv2ssd element
I tried to build GstInference on the Coral Dev Kit. I followed the instructions on the wiki and built and installed R2Inference successfully. After I cloned the repo and ran the...
I am a bit concerned about the use of the GstInference methods. If you check other elements, you will see that GstInference provides functions for printing, building the metadata blocks,...
Hi all, I want to know: if we write the inference plugin for deep learning models in Python or C++, what is the difference in speed-up? As we all know, the...
Hi everyone, suppose I want to use Python inference for a deep model and run the inference inside a plugin. One solution is to copy all of the regular Python inference code into do_transform_ip...
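A minimal sketch of that idea, assuming PyGObject with the GStreamer overrides is installed; the element name `pyinference`, the class `PyInference`, and the `run_model` placeholder are illustrative and not part of GstInference:

```python
# Sketch: a Python GStreamer element that calls regular Python inference
# code from inside do_transform_ip. run_model is a hypothetical stand-in
# for whatever framework call (TensorFlow, PyTorch, ...) you already have.
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstBase", "1.0")
from gi.repository import Gst, GstBase, GObject

Gst.init(None)

def run_model(data):
    # Placeholder: preprocessing, model call, postprocessing go here.
    pass

class PyInference(GstBase.BaseTransform):
    __gstmetadata__ = ("pyinference", "Filter",
                       "Runs Python inference in-place on each buffer",
                       "example")
    __gsttemplates__ = (
        Gst.PadTemplate.new("sink", Gst.PadDirection.SINK,
                            Gst.PadPresence.ALWAYS, Gst.Caps.new_any()),
        Gst.PadTemplate.new("src", Gst.PadDirection.SRC,
                            Gst.PadPresence.ALWAYS, Gst.Caps.new_any()),
    )

    def do_transform_ip(self, buf):
        # Map the buffer read-only, hand the bytes to the Python model,
        # and pass the buffer downstream unchanged.
        ok, info = buf.map(Gst.MapFlags.READ)
        if not ok:
            return Gst.FlowReturn.ERROR
        try:
            run_model(info.data)
        finally:
            buf.unmap(info)
        return Gst.FlowReturn.OK

GObject.type_register(PyInference)
Gst.Element.register(None, "pyinference", Gst.Rank.NONE, PyInference)
```

With the element registered this way it can be placed in a pipeline like any other filter; the main caveat raised in this thread is that every buffer crosses into the Python interpreter, so the GIL and per-buffer copies bound the achievable throughput compared to a C/C++ element.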
The other [hardware options](https://github.com/microsoft/onnxruntime/blob/master/BUILD.md#openvino) for the OpenVINO backend are not working with the current models provided by RidgeRun. Here is a sample pipeline to reproduce the issue: ``` gst-launch-1.0 filesrc location=Test_benchmark_video.mp4...
- [x] Intel OpenVINO
- [x] Intel Movidius - Myriad
- [x] Arm Compute Library
- [ ] Google Coral - EdgeTPU
- [x] TensorFlow
- [x] TFLite