onnx-mlir
MLPerf 0.7 ONNX models support status
Source: https://github.com/mlperf/inference/tree/r0.7#mlperf-inference-v07-submission-9182020
I tried to compile the following ONNX models. All of them failed.
| model | onnx ready | dataset | onnx-mlir |
|---|---|---|---|
| resnet50-v1.5 | Yes | imagenet2012 | Failed |
| ssd-mobilenet 300x300 | Yes | coco resized to 300x300 | Failed |
| ssd-resnet34 1200x1200 | Yes | coco resized to 1200x1200 | Failed |
| bert | Yes | squad-1.1 | Failed |
| 3d-unet | Yes | BraTS 2019 | Failed |
Is there any experience with, or workaround for, tuning tf2onnx?
Error output: mlperf_models_output 20201015.pdf
- ONNX File: ./mlperf_models/224_224_160.onnx
  - Group: Expected Error
  - Source: Shape inference failed
  - Category: Operations couldn't be inferred
  - Reason: error: Shape inference failed, 75 operations couldn't be inferred
  - Return Code: 1
- ONNX File: ./mlperf_models/224_224_160_dyanmic_bs.onnx
  - Group: Expected Error
  - Source: Shape inference failed
  - Category: Operations couldn't be inferred
  - Reason: error: Shape inference failed, 75 operations couldn't be inferred
  - Return Code: 1
- ONNX File: ./mlperf_models/bert_large_v1_1_fake_quant.onnx
  - Group: mlir Failure
  - Source: Attributes.cpp
  - Category: Attributes.cpp 'isValidIntOrFloat' assertion failure
  - Reason: onnx-mlir: /build/llvm-project/mlir/lib/IR/Attributes.cpp:1113: static mlir::DenseElementsAttr mlir::DenseIntOrFPElementsAttr::getRawIntOrFloat(mlir::ShapedType, llvm::ArrayRef<char>, int64_t, bool, bool): Assertion `::isValidIntOrFloat(type.getElementType(), dataEltSize, isInt, isSigned)' failed.
  - Return Code: 1
- ONNX File: ./mlperf_models/resnet34-ssd1200.onnx
  - Group: mlir Failure
  - Source: ConstProp.cpp
  - Category: ConstProp.cpp Assertion `rhsFreeRank == 0 && "expect both to recurse to zero at the same time"' failed.
  - Reason: onnx-mlir: /build/onnx-mlir/src/Transform/ONNX/ConstProp.cpp:162: void {anonymous}::RecurseConstPropElementwiseBinary(mlir::PatternRewriter&, std::vector<mlir::Attribute, std::allocator<mlir::Attribute> >&, mlir::DenseElementsAttr, mlir::DenseElementsAttr, llvm::SmallVector<long unsigned int, 4>&, llvm::SmallVector<long unsigned int, 4>&, int, int) [with ElementwiseBinaryOp = mlir::ONNXMulOp]: Assertion `rhsFreeRank == 0 && "expect both to recurse to zero at the same time"' failed.
  - Return Code: 1
- ONNX File: ./mlperf_models/resnet50_v1.onnx
  - Group: Expected Error
  - Source: Shape inference failed
  - Category: Operations couldn't be inferred
  - Reason: error: Shape inference failed, 2 operations couldn't be inferred
  - Return Code: 1
- ONNX File: ./mlperf_models/ssd_mobilenet_v1_coco_2018_01_28.onnx
  - Group: Others
  - Source: FrontendDialectHelper.cpp
  - Category: UNREACHABLE executed at
  - Reason: UNREACHABLE executed at /build/onnx-mlir/src/Builder/FrontendDialectHelper.cpp:215!
  - Return Code: 1
The error report showed two types of shape-inference errors: not implemented, or unranked tensor. Is there any way to know for which operations the error occurred? Perhaps we should change the code to include the operation name in the error message. We are adding support for dynamic tensors (like `<?xi32>`) but there is no plan to support unranked tensors (like `<*xi32>`) yet. I hope that all the unranked tensors will disappear once the missing shape inferences are added.
> The error report showed two types of shape-inference errors: not implemented, or unranked tensor. Is there any way to know for which operations the error occurred? Perhaps we should change the code to include the operation name in the error message.
It would be very nice to have verbose error information to diagnose these compilation errors. One manual approach I used is gdb and the VS Code debugger to get the variables from the stack.
> We are adding support for dynamic tensors (like `<?xi32>`) but there is no plan to support unranked tensors (like `<*xi32>`) yet. I hope that all the unranked tensors will disappear once the missing shape inferences are added.
Cool.
@chenqiny My output already includes the operation information on the line after the error message when emitError is used. Is that the case for your output? emitError is a member function of Operation.
@chentong319 I attached a file to the issue with the complete error output; it is just too long to paste inline.
And the following is the error output I got from the Docker image updated on 10.18 (image creation: 2020-10-15T16:18:22.789492633Z).
Are you using the head of the git repository?
- ONNX File: ./mlperf_models/224_224_160.onnx
- Group: Expected Error
- Source: Shape inference failed
- Category: Operations couldn't be inferred
- Reason: error: Shape inference failed, 75 operations couldn't be inferred
- Return Code: 1
- Error output:
error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input 
tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Shape inference failed, 75 operations couldn't be inferred
- ONNX File: ./mlperf_models/224_224_160_dyanmic_bs.onnx
- Group: Expected Error
- Source: Shape inference failed
- Category: Operations couldn't be inferred
- Reason: error: Shape inference failed, 75 operations couldn't be inferred
- Return Code: 1
- Error output:
error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input 
tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: Input tensor(s) not ranked error: shape inference failed error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: unable to infer shape of operation without shape inference interface error: Input tensor not ranked error: shape inference failed error: Shape inference failed, 75 operations couldn't be inferred
- ONNX File: ./mlperf_models/bert_large_v1_1_fake_quant.onnx
- Group: mlir Failure
- Source: Attributes.cpp
- Category: Attributes.cpp 'isValidIntOrFloat' assertion failure
- Reason: onnx-mlir: /build/llvm-project/mlir/lib/IR/Attributes.cpp:1113: static mlir::DenseElementsAttr mlir::DenseIntOrFPElementsAttr::getRawIntOrFloat(mlir::ShapedType, llvm::ArrayRef<char>, int64_t, bool, bool): Assertion `::isValidIntOrFloat(type.getElementType(), dataEltSize, isInt, isSigned)' failed.
- Return Code: 1
- Error output:
onnx-mlir: /build/llvm-project/mlir/lib/IR/Attributes.cpp:1113: static mlir::DenseElementsAttr mlir::DenseIntOrFPElementsAttr::getRawIntOrFloat(mlir::ShapedType, llvm::ArrayRef<char>, int64_t, bool, bool): Assertion `::isValidIntOrFloat(type.getElementType(), dataEltSize, isInt, isSigned)' failed.
- ONNX File: ./mlperf_models/resnet34-ssd1200.onnx
- Group: mlir Failure
- Source: ConstProp.cpp
- Category: ConstProp.cpp Assertion `rhsFreeRank == 0 && "expect both to recurse to zero at the same time"' failed.
- Reason: onnx-mlir: /build/onnx-mlir/src/Transform/ONNX/ConstProp.cpp:162: void {anonymous}::RecurseConstPropElementwiseBinary(mlir::PatternRewriter&, std::vector<mlir::Attribute, std::allocator<mlir::Attribute> >&, mlir::DenseElementsAttr, mlir::DenseElementsAttr, llvm::SmallVector<long unsigned int, 4>&, llvm::SmallVector<long unsigned int, 4>&, int, int) [with ElementwiseBinaryOp = mlir::ONNXMulOp]: Assertion `rhsFreeRank == 0 && "expect both to recurse to zero at the same time"' failed.
- Return Code: 1
- Error output:
onnx-mlir: /build/onnx-mlir/src/Transform/ONNX/ConstProp.cpp:162: void {anonymous}::RecurseConstPropElementwiseBinary(mlir::PatternRewriter&, std::vector<mlir::Attribute, std::allocator<mlir::Attribute> >&, mlir::DenseElementsAttr, mlir::DenseElementsAttr, llvm::SmallVector<long unsigned int, 4>&, llvm::SmallVector<long unsigned int, 4>&, int, int) [with ElementwiseBinaryOp = mlir::ONNXMulOp]: Assertion `rhsFreeRank == 0 && "expect both to recurse to zero at the same time"' failed.
- ONNX File: ./mlperf_models/resnet50_v1.onnx
- Group: Expected Error
- Source: Shape inference failed
- Category: Operations couldn't be inferred
- Reason: error: Shape inference failed, 2 operations couldn't be inferred
- Return Code: 1
- Error output:
error: unable to infer shape of operation without shape inference interface error: Shape inference failed, 2 operations couldn't be inferred
- ONNX File: ./mlperf_models/ssd_mobilenet_v1_coco_2018_01_28.onnx
- Group: Others
- Source: FrontendDialectHelper.cpp
- Category: UNREACHABLE executed at
- Reason: UNREACHABLE executed at /build/onnx-mlir/src/Builder/FrontendDialectHelper.cpp:215!
- Return Code: 1
- Error output:
Failed to import ONNX TensorProto due to unsupported data types. UNREACHABLE executed at /build/onnx-mlir/src/Builder/FrontendDialectHelper.cpp:215!
Can we have a list of which ops failed shape inference?
Can we also have the data types that we are missing?
Can you re-run the models again, please? Some operations have been added; maybe they are the ones that were missing.
I wrote a script to compile the ONNX model zoo and MLPerf workloads daily with crontab. There were no changes yesterday.
Would it be possible to provide a debug option to print the following things?
- Which pass failed? Print the error together with the pass it occurred in.
- Would it be possible to include operation information in the error? Example: [Pass ABC] [Operation] Error: ... Then I could produce a summary report.
:)
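Until such a debug option exists, the repetitive stderr can be collapsed into a summary report with a few lines of Python (pure standard library; `summarize` is a hypothetical helper, and the sample log mimics the output above):

```python
# Sketch: count distinct onnx-mlir error messages in a stderr dump.
import re
from collections import Counter

def summarize(log: str) -> Counter:
    """Split a stderr dump on 'error:' markers and count distinct messages."""
    msgs = [m.strip() for m in re.split(r"\berror:\s*", log) if m.strip()]
    return Counter(msgs)

# Small sample mimicking the repetitive output above.
log = ("error: Input tensor not ranked error: shape inference failed "
       "error: Input tensor not ranked error: shape inference failed "
       "error: unable to infer shape of operation without shape inference interface")
for msg, n in summarize(log).most_common():
    print(f"{n:3d}  {msg}")
```

With [Pass ABC] [Operation] prefixes in place, the same counter could be keyed by (pass, op) pairs instead of raw message text.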
@chenqiny:
> Would it be possible to provide a debug option to print the following things?
Hi friends,
Is there any bert.onnx -> bert.mlir sample code? When I went to test the ONNX model with
`onnx-mlir --EmitLib bertsquad-10.onnx`
I got:
error: onnx.OneHot: inferShapes() not implemented
error: shape inference failed