
What's the difference between a Python and a C++ plugin for inference?

PythonImageDeveloper opened this issue 5 years ago · 2 comments

Hi all, I want to know: if we write an inference plugin for deep learning models in Python instead of C/C++, what is the difference in speed? We all know that C/C++ is faster than Python, but that holds when implementing an algorithm from scratch. Here, the plugin would only call the inference code of a framework that has a C/C++ backend with a Python interface (like TensorFlow): the GStreamer plugin is written in Python, but the main computation runs in the C/C++ backend. How different are the speeds of a C/C++ plugin and a Python plugin in that case?

PythonImageDeveloper · Sep 20 '20 13:09

Hi @PythonImageDeveloper. I do not have concrete numbers to give you, but I agree with your statement: if the algorithms are written in C/C++ underneath, having a Python interface shouldn't be a significant bottleneck. This holds as long as you can transparently share memory between C/C++ and Python. As a reference, we have some elements written in Python that perform inference underneath (PyTorch in our case), and they work fine.
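A minimal sketch of why this is true. Here `zlib.crc32` (implemented in C inside CPython) stands in for an inference backend, and a pure-Python loop stands in for reimplementing the same per-byte work in Python: the cost of one Python call into a C backend is negligible compared to running the inner loop in the interpreter. This is a generic illustration, not GstInference code.

```python
import time
import zlib

frame = bytes(4 * 1024 * 1024)  # a fake 4 MB video "frame"

def checksum_python(buf):
    # Pure-Python byte crunching: every iteration pays interpreter overhead.
    acc = 0
    for b in buf:
        acc = (acc + b) & 0xFFFFFFFF
    return acc

start = time.perf_counter()
checksum_python(frame)
python_time = time.perf_counter() - start

start = time.perf_counter()
zlib.crc32(frame)  # one Python call; the loop itself runs in C
c_backend_time = time.perf_counter() - start

print(f"pure Python: {python_time:.4f}s, C backend: {c_backend_time:.4f}s")
```

The same reasoning applies to a Python GStreamer element that hands each buffer to PyTorch or TensorFlow: one boundary crossing per frame, with the heavy lifting done in native code.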

In the specific case of GstInference, we prefer to stay at a lower level for portability. If the element is written in C/C++, it is easy for a user to write an app around it in Python, Rust, or C#; the reverse is not true.
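The portability point can be sketched with `ctypes`: a core written in C is directly callable from Python (and, with similar FFI mechanisms, from Rust, C#, etc.), whereas exposing a Python core to C requires embedding an interpreter. Here the system C math library stands in for any C/C++ core such as a GStreamer element; nothing below is GstInference-specific, and it assumes a POSIX system where `libm` can be located.

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm on Linux/macOS).
libm_path = ctypes.util.find_library("m")
libm = ctypes.CDLL(libm_path)

# Declare the C signature: double sqrt(double).
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # calls straight into C
```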

michaelgruner · Sep 21 '20 15:09

Hi @michaelgruner, thanks for your suggestion.

PythonImageDeveloper · Oct 12 '20 08:10