Could not decode serialized type: np.ndarray. This could be because a required module is missing
Description
I'm trying to generate a calibration cache file for post-training quantization using Polygraphy. I created a custom input JSON file following https://github.com/NVIDIA/TensorRT/blob/main/tools/Polygraphy/how-to/use_custom_input_data.md. The model's input shape is (1, 3, 384, 640).
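For reference, the linked how-to serializes input data with Polygraphy's own JSON helper rather than the stdlib `json` module. A minimal sketch of that approach, assuming an input named `input` (a placeholder; the model's real input name should be used):

```python
import numpy as np

# Placeholder input name "input" and random data; shape matches the
# model's input shape of (1, 3, 384, 640).
calib_data = [{"input": np.random.rand(1, 3, 384, 640).astype(np.float32)}]

try:
    # save_json is the serializer used in the Polygraphy how-to; it knows
    # how to round-trip np.ndarray, unlike the stdlib json encoder.
    from polygraphy.json import save_json

    save_json(calib_data, "custom.json", description="calibration input data")
except ImportError:
    # Polygraphy is not installed in this environment; the data structure
    # above still shows the expected list-of-feed-dicts layout.
    pass
```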
The command used is below:

polygraphy convert model.onnx --int8 --load-inputs custom.json --calibration-cache custom_calib.cache -o model_trt.engine

It fails with:

[I] Loading input data from custom.json
[!] Could not decode serialized type: np.ndarray. This could be because a required module is missing.
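For context, `np.ndarray` is not serializable by the stdlib `json` module at all, which is why the how-to relies on Polygraphy's own serializer; a file written (or post-processed) with plain `json` will not carry the type information Polygraphy needs to decode the arrays back. A quick demonstration of the stdlib limitation:

```python
import json

import numpy as np

arr = np.zeros((1, 3, 384, 640), dtype=np.float32)

# The stdlib encoder rejects ndarrays outright, so a feed dict containing
# one cannot be dumped with plain json.dumps.
stdlib_failed = False
try:
    json.dumps({"input": arr})
except TypeError:
    stdlib_failed = True

print("ndarray serializable with stdlib json:", not stdlib_failed)
```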
Environment
TensorRT Version: 10.0.1.6-1
NVIDIA GPU: Tesla T4
NVIDIA Driver Version: 470.239.06
CUDA Version: 11.4
CUDNN Version:
Operating System: Ubuntu 20.04.6 LTS
Python Version (if applicable): 3.8.10
Tensorflow Version (if applicable):
PyTorch Version (if applicable):
Baremetal or Container (if so, version):
Relevant Files
Model link:
Steps To Reproduce
Commands or scripts: polygraphy convert model.onnx --int8 --load-inputs custom.json --calibration-cache custom_calib.cache -o model_trt.engine
Have you tried the latest release?:
Can this model run on other frameworks? For example run ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt): Yes