DDRNet.pytorch
onnxruntime error for ONNX-exported model
I am unable to create an onnxruntime inference session on GPU from an ONNX-exported DDRNet-23-slim model. Can you provide some support with this?
I get the following error:
InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Invalid model. Node input '449' is not a graph input, initializer, or output of a previous node.
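For reference, here is a minimal diagnostic sketch of what I am running. The filename "ddrnet23_slim.onnx" is just a placeholder for my exported model; the checker reports the same structural problem as onnxruntime, and the loop prints whichever node references the dangling tensor '449'.

import onnx
import onnxruntime as ort

model = onnx.load("ddrnet23_slim.onnx")

# Structural validation; this raises the same "not a graph input,
# initializer, or output of a previous node" message for a broken graph.
try:
    onnx.checker.check_model(model)
except onnx.checker.ValidationError as e:
    print("checker failed:", e)

# Locate the node(s) that consume the dangling tensor '449'.
for node in model.graph.node:
    if "449" in node.input:
        print(node.op_type, node.name, list(node.input), list(node.output))

# Session creation on GPU; falls back to CPU if the CUDA provider is unavailable.
sess = ort.InferenceSession(
    "ddrnet23_slim.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)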