transformers.js
(fp16 conversion) Attempt to slim model if onnx check model fails
There appears to be a bug in onnx.checker.check_model that raises an error when performing shape inference on models that nonetheless run correctly in onnxruntime:
InferenceError: [ShapeInferenceError] Inference error(s): (op_type:If, node name: optimum::if): [ShapeInferenceError] Inference error(s): (op_type:Add, node name: /model/decoder/embed_positions/Add): [ShapeInferenceError] Inferred shape and existing shape differ in rank: (1) vs (0)
Passing the model through onnxslim seems to resolve the issue.