tensorrt_inference
INT8
Have you tried INT8 with yolov5?
I tried onnx -> trt, but with the yolov5 INT8 engine I can't detect any objects. The yolov5 FP16 engine detects objects fine.
Do you have any idea?
@alicera I have not tried it yet. INT8 inference needs a calibration dataset; building the engine alone is not enough.
Hi, when I did INT8 quantization of my yolov5 ONNX model, I did supply calibration data, but it still produces no detection boxes. Could you give it a try?
I meet the same problem.
I tried INT8 inference with https://github.com/linghu8812/tensorrt_inference/tree/master/Yolov4 and the results were correct, just not accurate enough. I recently read some material on TRT INT8 quantization; the large accuracy drop may be related to the activation function used during training. My guess is that TRT only applies 8-bit quantization to the positive half-axis. YOLOv4 uses the Mish activation, so its feature maps contain negative values. YOLOv4 could replace Leaky-ReLU with ReLU; I think the accuracy loss from that replacement is acceptable compared with the INT8 accuracy drop. One more thought: if TRT really only quantizes the positive half-axis, the sigmoid activation may need to be handled in the decode/parsing layer. So yolov5 INT8 should be fine.
@DaChaoXc TensorRT uses symmetric quantization, and even asymmetric quantization has a zero point; there is no such thing as quantizing only the positive half-axis as you describe.
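To illustrate the point above, here is a minimal sketch of symmetric INT8 quantization (not TensorRT's actual implementation; the helper names and the sample feature-map values are made up for illustration). With a single scale covering [-max|x|, +max|x|] and zero point fixed at 0, negative activations such as Mish outputs map to negative INT8 codes and survive the round trip:

```python
def quantize_symmetric(x, num_bits=8):
    """Quantize a list of floats to signed INT8 with one scale, zero_point = 0."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for INT8
    scale = max(abs(v) for v in x) / qmax   # symmetric range around zero
    q = [max(-qmax - 1, min(qmax, round(v / scale))) for v in x]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

# A Mish-like feature map slice containing negative values (made-up data).
fm = [-0.31, -0.05, 0.0, 0.8, 2.4]
q, scale = quantize_symmetric(fm)
print(q)                     # [-16, -3, 0, 42, 127] -- negatives preserved
print(dequantize(q, scale))  # close to the original, including the negatives
```

So a large INT8 accuracy drop is more likely a calibration issue (dataset, preprocessing, or calibrator choice) than the quantizer discarding negative values.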