gst-inference
Custom batch size and multiple stream input.
Hi, how can I change the batch size and the number of input streams used for inference? Please let me know if there is a way to customize these. Also, what is the current batch size for inference? Thanks!
Hi, the only supported batch size at the moment is one. We can add support for a configurable batch size, but it is not currently a priority. The only way to process multiple streams right now is to add one inference element per stream to the pipeline.
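
To illustrate what "one inference element per stream" could look like, here is a minimal Python/PyGObject sketch (not from the original thread) that builds a separate inference branch for each input file. The `tinyyolov2` element, the `model-location`/`backend` properties, the `sink_model`/`src_model` pad names, and the file names are placeholders based on typical GstInference examples; adjust them to your installed version, model, and backend (additional backend properties such as input/output layer names may also be required).

```python
#!/usr/bin/env python3
# Sketch only: one gst-inference element per input stream, since batching
# is not supported. Element name, pad names and properties are placeholders
# taken from typical GstInference examples; adapt them to your setup.
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

STREAMS = ["video0.mp4", "video1.mp4"]            # placeholder input files
MODEL = "graph_tinyyolov2_tensorflow.pb"          # placeholder model file

# Build one independent branch per stream, each with its own inference element.
branches = []
for i, location in enumerate(STREAMS):
    branches.append(
        f"filesrc location={location} ! decodebin ! videoconvert ! videoscale "
        f"! net{i}.sink_model "
        f"tinyyolov2 name=net{i} model-location={MODEL} backend=tensorflow "
        f"net{i}.src_model ! fakesink"
    )

pipeline = Gst.parse_launch(" ".join(branches))
pipeline.set_state(Gst.State.PLAYING)

# Run until end-of-stream or error on any branch.
loop = GLib.MainLoop()
bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message::eos", lambda *_: loop.quit())
bus.connect("message::error", lambda *_: loop.quit())

try:
    loop.run()
finally:
    pipeline.set_state(Gst.State.NULL)
```

Note that with this approach each inference element instance loads its own copy of the model, so memory and compute usage grow with the number of streams.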