tensorrt-utils
Is inference for INT8 the same as for FP16?
Hello, thank you for your work!
Is there anything wrong with running INT8 engine inference this way? I loaded the engine file directly without using the calibration table, and I'm a bit unsure whether that is correct.