
Can't create an onnx model with OnnxMatMulInteger

Open lszinv opened this issue 4 years ago • 3 comments

I'm trying to create an ONNX model that uses MatMulInteger. However, when I do, I get an "Unsupported data_Type 'int8'" error. Perhaps I'm not creating things correctly; I would appreciate it if someone could help me identify what I'm doing wrong.

I've set the output to an int32 tensor because the ONNX operator documentation states that the result accumulates into int32 (to avoid overflow), and there was no int8 tensor type function available anyway.

Here's the code I'm trying to run:

from skl2onnx.algebra.onnx_ops import OnnxMatMulInteger
from skl2onnx.common.data_types import Int32TensorType
import numpy as np

op = OnnxMatMulInteger('X', 'Y', output_names='Z')
X = np.ones((320, 320), dtype=np.int8)
Y = np.ones((320,), dtype=np.int8)
onx = op.to_onnx({'X':X, 'Y':Y}, outputs=[('Z', Int32TensorType())])
with open('int32matmul.onnx', 'wb') as f:
    f.write(onx.SerializeToString())
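For reference, here is a minimal NumPy sketch (my own illustration, not from the thread) of what MatMulInteger computes when zero points are omitted: the int8 inputs are widened and the products accumulate in int32, which is why the output tensor type is int32.

```python
import numpy as np

# MatMulInteger semantics without zero points: int8 inputs,
# int32 accumulation (e.g. 127 * 127 = 16129 does not fit in int8).
X = np.ones((320, 320), dtype=np.int8)
Y = np.ones((320,), dtype=np.int8)

# Widen to int32 before multiplying, mirroring the operator's behavior.
Z = X.astype(np.int32) @ Y.astype(np.int32)
print(Z.dtype, Z[0])  # each entry is the sum of 320 ones, i.e. 320
```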

lszinv avatar Aug 12 '20 07:08 lszinv

int8 is not very common in standard machine learning. There is nothing preventing us from supporting it, except that the converting library does not have an Int8TensorType. It needs to be added to make your code work.

xadupre avatar Aug 21 '20 16:08 xadupre

Support for Int8 was added in the latest release. Could you try again?

xadupre avatar Aug 19 '21 12:08 xadupre

The following code works now.

from skl2onnx.algebra.onnx_ops import OnnxMatMulInteger
from skl2onnx.common.data_types import Int32TensorType
import numpy as np

# int8 inputs are now supported; the output stays int32,
# as required by the MatMulInteger specification.
op = OnnxMatMulInteger('X', 'Y', output_names='Z')
X = np.ones((320, 320), dtype=np.int8)
Y = np.ones((320,), dtype=np.int8)
onx = op.to_onnx({'X': X, 'Y': Y}, outputs=[('Z', Int32TensorType())])
with open('int32matmul.onnx', 'wb') as f:
    f.write(onx.SerializeToString())

xadupre avatar Aug 23 '21 16:08 xadupre