BentoML
feature: ONNX service with multiple inputs
Feature request
I exported an ONNX model that accepts multiple inputs ("input_ids", "input_mask", "input_seg").
The BentoML docs for ONNX only give a simple single-input example, runner.run.run(test_input).
However, when I wrapped these inputs in a dict, an error occurred:
TypeError: run of ONNXRunnable only takes numpy.ndarray or pd.DataFrame, tf.Tensor, or torch.Tensor as input parameters
I wonder if I am using it the wrong way?
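For reference, here is a minimal sketch of the calling convention I would expect: passing each input as a separate positional argument rather than wrapping them in a dict (which triggers the TypeError above, since a dict is not an accepted input type). The run function below is only a stand-in that mimics the runner's type check, not BentoML's actual implementation; whether the real runner maps positional arguments to the ONNX graph inputs in declared order is an assumption to be confirmed.

```python
import numpy as np

def run(*args):
    """Stand-in for runner.run.run: accepts one ndarray per ONNX graph
    input, mirroring the type check in the error message (hypothetical)."""
    for a in args:
        # The runnable rejects anything that is not an ndarray / DataFrame /
        # tf.Tensor / torch.Tensor, so a dict of arrays fails this check.
        if not isinstance(a, np.ndarray):
            raise TypeError(
                "run of ONNXRunnable only takes numpy.ndarray or "
                "pd.DataFrame, tf.Tensor, or torch.Tensor as input parameters"
            )
    return [a.shape for a in args]

# One array per model input, passed positionally in declared order.
input_ids = np.zeros((1, 128), dtype=np.int64)
input_mask = np.ones((1, 128), dtype=np.int64)
input_seg = np.zeros((1, 128), dtype=np.int64)

shapes = run(input_ids, input_mask, input_seg)  # accepted
```

With a dict, run({"input_ids": input_ids, ...}) would raise the TypeError quoted above; if positional multi-input calls are the intended usage, documenting that would resolve this issue.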
Motivation
I think ONNX is widely used for AI inference, so could you expand the docs on serving ONNX models with BentoML, in particular for models with multiple inputs?
Other
No response