`auto_convert_mixed_precision` error: "two nodes with same node name" occurred during conversion
I want to convert a model into an AMP (automatic mixed precision) model. This is my code:
```python
def convert_float32_to_mixed_precision(fp32_model_path, mixed_precision_model_path):
    import onnx
    import numpy as np
    from onnxconverter_common import auto_mixed_precision

    model = onnx.load(fp32_model_path)

    # Fixed seed so the validation input is reproducible.
    np.random.seed(123)
    test_data = {
        "image": 2 * np.random.rand(1, 3, 640, 640).astype(np.float32) - 1.0,
        "scale_factor": [[1, 1]],
    }

    def validate(res1, res2):
        # Accept the fp16 conversion only if every output stays close to fp32.
        for r1, r2 in zip(res1, res2):
            if not np.allclose(r1, r2, rtol=0.01, atol=0.001):
                return False
        return True

    model_fp16 = auto_mixed_precision.auto_convert_mixed_precision(
        model, test_data, validate, keep_io_types=True
    )
    onnx.save(model_fp16, mixed_precision_model_path)

fp32_model_path = 'F32.onnx'
mixed_precision_model_path = 'AMP.onnx'
print("Convert to mixed precision starts...")
convert_float32_to_mixed_precision(fp32_model_path, mixed_precision_model_path)
print("Conversion finished.")
```
Or, alternatively:
```python
import onnx
import numpy as np
from onnxconverter_common import auto_mixed_precision

test_data = {
    "image": 2 * np.random.rand(1, 3, 640, 640).astype(np.float32) - 1.0,
    "scale_factor": [[1, 1]],
}

model = onnx.load("float32.onnx")
# Pass tolerances directly instead of a custom validate function.
model_fp16 = auto_mixed_precision.auto_convert_mixed_precision(
    model, test_data, rtol=0.01, atol=0.001, keep_io_types=True
)
onnx.save(model_fp16, "AMP-float32.onnx")
```
Running this code on my model produced the following error: `1 : FAIL : This is an invalid model. Error: two nodes with same node name (_output_cast0)`. But my model does not have duplicate node names. This is my model: Google Drive
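Since the clashing name `_output_cast0` looks like one of the Cast nodes that the converter itself inserts, it may help to first confirm that the source model really has no pre-existing duplicate node names. A minimal sketch (the `onnx` part is commented out; `F32.onnx` is the path from the question above):

```python
from collections import Counter

def find_duplicate_names(names):
    """Return, sorted, the names that appear more than once."""
    return sorted(n for n, c in Counter(names).items() if c > 1)

# Applying it to the model's graph (assumes `onnx` is installed):
# import onnx
# model = onnx.load("F32.onnx")
# print(find_duplicate_names(node.name for node in model.graph.node))
```

If this prints an empty list for the original model, the duplicate is being created during conversion rather than existing beforehand.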
@KunMengcode Can I have your model to repro?
@xiaowuhu I am using the ONNX FP32 model exported from the PPYOLOE model in the PaddleDetection library. I think it should be structurally equivalent to PPYOLOE_pytorch. [Baidu Yun link]: https://pan.baidu.com/s/1BYw-tC3wqA5zpV90XeYvFQ?pwd=el24, extraction code: el24
Did you fix this issue? I am trying to fix the same problem.
I haven't solved this problem yet.
What is your situation, @itapty-ily?