
ONNXRuntime failure on models post FP16 conversion through onnxmltools

Open ramkrishna2910 opened this issue 1 year ago • 3 comments

A PyTorch model converted to ONNX using the script below executes successfully through ONNX Runtime. However, after FP16 conversion, the model fails with:

[ONNXRuntimeError] : 1 : FAIL : Type Error: Type parameter (T) of Optype (Add) bound to different types (tensor(float16) and tensor(float) in node ()

The same failure is observed with other models as well. The FP16 version of the model is attached: cg_graph_convolutions_43350e0e-op14-opt-f16.onnx.zip

Config: Ubuntu 18.04, onnxruntime 1.14.1, onnxmltools 1.11.2
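
For reference, the failure is hit as soon as an inference session is created over the converted model; a minimal sketch (the filename of the unzipped attachment is an assumption here):

import onnxruntime as ort

# Raises: [ONNXRuntimeError] : 1 : FAIL : Type Error: Type parameter (T) of
# Optype (Add) bound to different types (tensor(float16) and tensor(float)
sess = ort.InferenceSession("cg_graph_convolutions_43350e0e-op14-opt-f16.onnx")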

import torch
from torch_geometric.datasets import Planetoid
from torch_geometric.nn import CGConv

# Load the Cora citation dataset.
dataset = Planetoid(root=".", name="Cora")
data = dataset[0]
edge_index_rows = 2

# Single CGConv layer with dummy inputs matching the dataset's shapes.
model = CGConv(dataset.num_features)
inputs = {
    "x": torch.zeros(data.num_nodes, data.num_features, dtype=torch.float),
    "edge_index": torch.zeros(edge_index_rows, data.num_nodes, dtype=torch.int64),
}
model.eval()

# Export to ONNX (opset 14); this float32 model runs fine in ONNX Runtime.
torch.onnx.export(
    model,
    inputs,
    "cg.onnx",
    opset_version=14,
    do_constant_folding=True,
    input_names=["input"],
    output_names=["output"],
)
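
The FP16 conversion step is not included in the script above; a minimal sketch of how it was done through onnxmltools, assuming the default converter settings and an output filename chosen here for illustration, is:

import onnxmltools
from onnxmltools.utils.float16_converter import convert_float_to_float16

# Load the exported float32 model and convert float tensors to float16.
onnx_model = onnxmltools.utils.load_model("cg.onnx")
fp16_model = convert_float_to_float16(onnx_model)
onnxmltools.utils.save_model(fp16_model, "cg-f16.onnx")

The reported error suggests that after this conversion one input of an Add node remains float32 while the other becomes float16, which is what ONNX Runtime rejects when the session is created.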

ramkrishna2910 avatar Mar 27 '23 22:03 ramkrishna2910

This issue is related to the converter from torch to onnx; it should be posted on that converter's repository.

xadupre avatar Apr 17 '23 13:04 xadupre

Can confirm - I'm seeing the same issue with the latest versions.

dbelenko avatar Jul 14 '23 01:07 dbelenko