stevenlix

Results: 5 comments by stevenlix

The original float16.py code only picks up the float data type and converts it to float16. For your case, you may need to pick up the int64 type and do your conversion....
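As a rough illustration of that idea, here is a minimal sketch that walks a model's initializers and casts int64 tensors, assuming int32 is the desired target type (the function name and the target type are placeholders, not part of float16.py):

```python
import numpy as np
import onnx
from onnx import TensorProto, numpy_helper

def convert_int64_initializers(model_path, output_path):
    """Cast every int64 initializer in the graph to int32 (sketch only)."""
    model = onnx.load(model_path)
    for i, init in enumerate(model.graph.initializer):
        if init.data_type == TensorProto.INT64:
            arr = numpy_helper.to_array(init).astype(np.int32)
            model.graph.initializer[i].CopyFrom(numpy_helper.from_array(arr, init.name))
    onnx.save(model, output_path)
```

Unlike the full pass in float16.py, this sketch only touches initializers; graph inputs/outputs, value_info entries, and ops that require int64 inputs (e.g. Reshape's shape input) would need additional handling or Cast nodes.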

You may use OnnxRuntime for ONNX model inferencing: https://github.com/microsoft/onnxruntime. It provides various execution providers such as CPU, CUDA, and TensorRT.
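A minimal sketch of running a model this way (the model path, input name, and shape are placeholders):

```python
import numpy as np
import onnxruntime as ort

# Providers are tried in order; drop the GPU one if only CPU is available.
sess = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
input_name = sess.get_inputs()[0].name
data = np.random.rand(1, 3, 224, 224).astype(np.float32)  # placeholder shape
outputs = sess.run(None, {input_name: data})
```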

The error message indicates your input shape doesn't match the model's input. Please check your input data on 000_net.
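One way to compare the expected and fed shapes is to query the session's input metadata; a sketch below, where "000_net" stands in for the input name from the error and the model path is a placeholder:

```python
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx")
for inp in sess.get_inputs():
    # e.g. name='000_net', shape=[1, 3, 416, 416], type='tensor(float)'
    print(inp.name, inp.shape, inp.type)
# The array fed for "000_net" must match that shape exactly, except for
# symbolic/dynamic dimensions (often shown as None or a dimension name).
```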

TensorRT EP can achieve performance parity with native TensorRT. One of the benefits of using TensorRT EP is the ability to run models that can't run in native TensorRT if there are...
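A sketch of how that fallback is typically configured: list TensorRT first so that any subgraph it can't handle falls back to the next provider in the list (model path is a placeholder):

```python
import onnxruntime as ort

# Order matters: nodes TensorRT can't handle fall back to CUDA, then CPU.
sess = ort.InferenceSession(
    "model.onnx",
    providers=[
        "TensorrtExecutionProvider",
        "CUDAExecutionProvider",
        "CPUExecutionProvider",
    ],
)
print(sess.get_providers())  # confirms which providers were actually registered
```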

One of our internal models failed with the PR changes. The error message is:

  File "symbolic_shape_infer.py", line 32, in is_sequence
    assert cls_type in ["tensor_type", "sequence_type"]
AssertionError
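For context, a rough reconstruction (not the exact source) of the check that trips: it inspects the oneof field of a value's TypeProto, so a value whose type is something other than a plain tensor or a sequence (e.g. a map, optional, or sparse tensor type) would hit the assertion:

```python
from onnx import TypeProto, TensorProto, helper

def is_sequence(type_proto: TypeProto) -> bool:
    # Roughly what symbolic_shape_infer.py asserts: the TypeProto oneof must be
    # either a plain tensor or a sequence; anything else raises AssertionError.
    cls_type = type_proto.WhichOneof("value")
    assert cls_type in ["tensor_type", "sequence_type"]
    return cls_type == "sequence_type"

# A plain tensor value_info passes the check:
tensor_vi = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 3])
print(is_sequence(tensor_vi.type))  # False
```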