MNN
Error converting a QAT model with MNN

When I convert a QAT model with mnnconvert, the following error appears:
Start to Optimize the MNN Net...
**Tensor shape**: 3, 1, 640, 640,
Error for compute convolution shape, inputCount:3, outputCount:16, KH:3, KW:3, group:1
inputChannel: 1, batch:3, width:640, height:640. Input data channel may be mismatch with filter channel count
**Tensor shape**: 3, 1, 640, 640,
Error for compute convolution shape, inputCount:3, outputCount:16, KH:3, KW:3, group:1
inputChannel: 1, batch:3, width:640, height:640. Input data channel may be mismatch with filter channel count
inputTensors : [ images, ]
outputTensors: [ outputs, ]
Converted Success!
Although the conversion reports success, the same error appears again at runtime:
The device supports: i8sdot:1, fp16:1, i8mm: 0, sve2: 0, sme2: 0
**Tensor shape**: 3, 1, 640, 640,
Error for compute convolution shape, inputCount:3, outputCount:16, KH:3, KW:3, group:1
inputChannel: 1, batch:3, width:640, height:640. Input data channel may be mismatch with filter channel count
Compute Shape Error for /backbone/stem/rbr_reparam/Conv_output_0
Can't run session because not resized
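For context, the log reads the input tensor in NCHW order, so `3, 1, 640, 640` is interpreted as batch 3 and channel 1, while the first convolution's weights expect 3 input channels. A minimal, library-free sketch of that consistency check (the function and field names are assumptions based on the log, not MNN source):

```python
def check_conv_input(tensor_shape, weight_input_channels, group=1):
    """Interpret tensor_shape as NCHW and compare its channel dim
    against the input channel count expected by the conv weights."""
    batch, channels, height, width = tensor_shape
    ok = channels == weight_input_channels * group
    return ok, {"batch": batch, "inputChannel": channels,
                "height": height, "width": width}

# Shape from the log, read as batch=3, channel=1 -> mismatch (1 != 3)
ok, info = check_conv_input((3, 1, 640, 640), weight_input_channels=3)
print(ok, info)

# Likely intended layout for a 640x640 RGB image: batch=1, channel=3
ok2, _ = check_conv_input((1, 3, 640, 640), weight_input_channels=3)
print(ok2)
```

If the exported ONNX input was declared as `3x640x640` (CHW with no batch dimension), the converter would read the channel count 3 as the batch, which matches the numbers in the error; exporting with an explicit batch dimension of 1 (`1x3x640x640`) would make the check pass.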
What is the original model, and how was the QAT done?
The original model is yolov6-n, and the QAT was done with NVIDIA's pytorch_quantization toolkit.
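For readers unfamiliar with QAT: during training the model runs with "fake quantization" (quantize then immediately dequantize in float), so the weights learn to tolerate the rounding error. Below is a library-free sketch of the symmetric per-tensor int8 fake-quant step that toolkits such as pytorch_quantization insert into the forward pass; `amax` (the calibrated absolute maximum of the tensor) is the only parameter, and the function name is illustrative, not from any library:

```python
def fake_quant_int8(values, amax):
    """Symmetric per-tensor int8 fake quantization:
    map floats to the integer grid [-127, 127], then back to float."""
    scale = amax / 127.0
    out = []
    for v in values:
        q = round(v / scale)        # quantize to the nearest integer step
        q = max(-127, min(127, q))  # clamp to the int8 range
        out.append(q * scale)       # dequantize back to float
    return out

# Values beyond amax saturate; small values snap to the nearest grid point
print(fake_quant_int8([0.5, -1.0, 0.013, 2.0], amax=1.0))
```

The QAT export then replaces these fake-quant ops with real quantize/dequantize nodes in the ONNX graph, which is what mnnconvert consumes.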
@afox666 Hi! I've run into the same problem. Has it been resolved?
Marking as stale. No activity in 60 days.