
[onert] Support multi-batch input inference for single batch model

Open hseok-oh opened this issue 10 months ago • 2 comments

Currently, onert supports multi-batch input inference for single-batch models in the shape inference routine. This feature is used for trix backend multi-batch inference and for training inputs.

However, shape inference was originally designed for a fixed batch size with varying width and height. As a result, it produces unexpected shape inference results for some operators (e.g. Reshape).
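To illustrate why Reshape is problematic, here is a minimal sketch (not onert code; the function name and shapes are made up for illustration) of naive shape inference where Reshape always returns its fixed target shape. It works for the batch the model was built with, but fails as soon as the input batch grows:

```python
def infer_reshape(input_shape, target_shape):
    """Naive shape inference: Reshape always yields its fixed target shape,
    after checking that the element counts match."""
    in_elems = 1
    for d in input_shape:
        in_elems *= d
    out_elems = 1
    for d in target_shape:
        out_elems *= d
    if in_elems != out_elems:
        raise ValueError(f"cannot reshape {input_shape} into {target_shape}")
    return list(target_shape)

# Single-batch model: [1, 28, 28] -> Reshape -> [1, 784]. Works as expected.
print(infer_reshape([1, 28, 28], [1, 784]))  # [1, 784]

# Multi-batch input [4, 28, 28] against the same fixed target [1, 784]:
# element counts no longer match, so naive inference raises an error
# instead of producing the intended [4, 784].
try:
    infer_reshape([4, 28, 28], [1, 784])
except ValueError as e:
    print(e)
```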

To support multi-batch input for single-batch models, we need to revise shape inference or introduce a new feature.
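One possible direction (an illustration of the idea, not an agreed design; the function name and the batch-scaling rule are assumptions) would be a batch-aware pass that scales the leading dimension of a fixed Reshape target by the ratio of the runtime batch to the model's original batch:

```python
def infer_reshape_batch_aware(input_shape, target_shape, model_batch=1):
    """Hypothetical batch-aware Reshape inference: scale the leading
    (batch) dimension of the fixed target shape by the ratio of the
    runtime batch to the batch the model was built with."""
    runtime_batch = input_shape[0]
    if runtime_batch % model_batch != 0:
        raise ValueError("runtime batch must be a multiple of the model batch")
    scale = runtime_batch // model_batch
    return [target_shape[0] * scale] + list(target_shape[1:])

# A [4, 28, 28] input into a Reshape whose single-batch target was
# [1, 784] now infers [4, 784] instead of failing.
print(infer_reshape_batch_aware([4, 28, 28], [1, 784]))  # [4, 784]
```

This only covers the simple case where the batch maps onto the target's leading dimension; operators whose target shapes fold the batch into other axes would still need special handling.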

hseok-oh Apr 11 '24 08:04