ONE
[onert] Support multi-batch input inference for single batch model
Currently we support multi-batch input inference for single-batch models in the shape inference routine. This feature is used for trix backend multi-batch inference and for training input.
However, shape inference was originally designed for a fixed batch size with varying width and height, so it produces unexpected results for some operators (e.g. Reshape).
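A minimal sketch of why Reshape misbehaves (illustrative Python, not onert code; `infer_reshape` is a hypothetical helper): the Reshape target shape is baked into the single-batch model, so when only the input's batch dimension grows, the element counts no longer match and naive inference fails.

```python
import math

def infer_reshape(input_shape, target_shape):
    """Naive shape inference for Reshape: resolve a single -1 wildcard,
    then require that element counts match."""
    in_elems = math.prod(input_shape)
    if -1 in target_shape:
        known = math.prod(d for d in target_shape if d != -1)
        out = [in_elems // known if d == -1 else d for d in target_shape]
    else:
        out = list(target_shape)
    if math.prod(out) != in_elems:
        raise ValueError(f"cannot reshape {input_shape} into {target_shape}")
    return out

# Model exported with batch 1: Reshape [1, 4] -> [1, 2, 2] is fine.
print(infer_reshape([1, 4], [1, 2, 2]))   # [1, 2, 2]

# Feeding batch 3 changes only the input shape; the target shape
# is still the batch-1 constant from the model, so inference fails.
try:
    infer_reshape([3, 4], [1, 2, 2])
except ValueError as e:
    print(e)
```

A revised routine would have to recognize the batch dimension and rewrite such fixed target shapes (or treat them as `-1` along the batch axis), which is the kind of change this issue proposes.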
To support multi-batch input for a single-batch model, we need to either revise shape inference or introduce a new mechanism.