sklearn-onnx
Can't create an onnx model with OnnxMatMulInteger
I'm trying to create an ONNX model that uses MatMulInteger. However, when I do, I get an "Unsupported data_Type 'int8'" error. Perhaps I'm not setting things up correctly; I'd appreciate it if someone could help me identify what I'm doing wrong.
I've set the output to an int32 tensor because the ONNX operator docs state that the result of MatMulInteger can overflow int8 and is therefore accumulated in int32, and there was no int8 tensor type available.
Here's the code I'm trying to run:
from skl2onnx.algebra.onnx_ops import OnnxMatMulInteger
from skl2onnx.common.data_types import Int32TensorType
import numpy as np

# Build a graph with a single MatMulInteger node whose output is named 'Z'
op = OnnxMatMulInteger('X', 'Y', output_names='Z')
X = np.ones((320, 320), dtype=np.int8)
Y = np.ones((320,), dtype=np.int8)
onx = op.to_onnx({'X': X, 'Y': Y}, outputs=[('Z', Int32TensorType())])
with open('int32matmul.onnx', 'wb') as f:
    f.write(onx.SerializeToString())
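As an aside, the overflow concern mentioned above can be illustrated with plain NumPy (this is just a hypothetical illustration, not part of the ONNX model):

```python
import numpy as np

X = np.ones((320, 320), dtype=np.int8)
Y = np.ones((320,), dtype=np.int8)

# Each output element accumulates 320 int8 ones, giving 320, which exceeds
# the int8 maximum of 127 -- so the result must be held in a wider type.
Z = np.matmul(X.astype(np.int32), Y.astype(np.int32))
print(Z[0])                    # 320, representable in int32 but not in int8
print(np.iinfo(np.int8).max)  # 127
```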
int8 is not very common in standard machine learning. There is nothing preventing us from supporting it, except that the converting library does not have an Int8TensorType. It needs to be added to make your code work.
Support for Int8 was added in the latest release. Could you try again?
The following code works now.
from skl2onnx.algebra.onnx_ops import OnnxMatMulInteger
from skl2onnx.common.data_types import Int8TensorType
import numpy as np

# Build a graph with a single MatMulInteger node whose output is named 'Z'
op = OnnxMatMulInteger('X', 'Y', output_names='Z')
X = np.ones((320, 320), dtype=np.int8)
Y = np.ones((320,), dtype=np.int8)
onx = op.to_onnx({'X': X, 'Y': Y}, outputs=[('Z', Int8TensorType())])
with open('int32matmul.onnx', 'wb') as f:
    f.write(onx.SerializeToString())