
CPU inference results are wrong, but with OPENCL set as the backend (which falls back to CPU) the inference results are correct

Open cythamubis opened this issue 1 year ago • 5 comments

Platform (if cross-compiling, please also attach the cross-compilation target platform):

Windows x86_64

MNN version: 2.7.1

Using the Module API

testMNNFromOnnx.py output:

ONNX Model opset version: 12
Start to Optimize the MNN Net...
inputTensors : [ image, ]
outputTensors: [ embedding, ]
Converted Success!
Check convert result by onnx, thredhold is 0.01
image
output: embedding
embedding: (1, 256, 64, 64, )
TESTERROR embedding value error : absMaxV:0.839136 - DiffMax 0.562396
Error for output embedding
Save mnn result to .error director

Setting the CPU backend directly gives wrong inference results. (screenshot: CPU)

Using OPENCL acceleration falls back to CPU, but the inference results are then correct. (screenshot: OPENCL)

The device support i8sdot:0, support fp16:0, support i8mm: 0
model have been loaded...
[256 x 256] 1 1 ,256, 256 , 3
openCL not support interp type:3, fallback to cpu
openCL not support interp type:3, fallback to cpu
openCL not support interp type:3, fallback to cpu
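For reference, a minimal sketch of the two backend configurations being compared above, assuming the standard MNNForwardType values; the makeConfig helper and useOpenCL flag are hypothetical, for illustration only:

#include <MNN/Interpreter.hpp>   // MNN::ScheduleConfig, MNNForwardType

// Build the ScheduleConfig for each of the two runs compared above:
// plain CPU (wrong results) vs. OpenCL, where the unsupported interp
// type 3 (bicubic) falls back to CPU (correct results).
MNN::ScheduleConfig makeConfig(bool useOpenCL) {
    MNN::ScheduleConfig config;
    config.type = useOpenCL ? MNN_FORWARD_OPENCL : MNN_FORWARD_CPU;
    config.numThread = 4;
    return config;
}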

测试模型为efficientvit的image encoder efficientvit

模型下载链接:https://drive.google.com/file/d/16gvLzB9aZ8Zk8TKiJoLW0UgwsWg-r0kc/view?usp=sharing

cythamubis avatar Apr 09 '24 09:04 cythamubis

Is the machine x64-avx2? Is the result correct on the latest version?

jxt1234 avatar Apr 09 '24 11:04 jxt1234

Is the machine x64-avx2? Is the result correct on the latest version?

The processor is an AMD R7 6800H. How can I tell whether MNN is using the AVX2 instruction set? The result on the latest version is the same as the current situation.
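As a quick host-level check (this only confirms that the CPU exposes AVX2, not which kernel path MNN actually dispatched to), a minimal sketch for MSVC on Windows x86_64 using the standard __cpuidex intrinsic:

#include <intrin.h>
#include <cstdio>

int main() {
    // CPUID leaf 7, subleaf 0: the AVX2 feature flag is bit 5 of EBX.
    int regs[4] = {0};
    __cpuidex(regs, 7, 0);
    bool avx2 = (regs[1] & (1 << 5)) != 0;
    printf("Host CPU supports AVX2: %s\n", avx2 ? "yes" : "no");
    return 0;
}

The R7 6800H (Zen 3+) does support AVX2, so an AVX2 kernel path being selected is plausible here.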

cythamubis avatar Apr 09 '24 12:04 cythamubis

@jxt1234 Running testMNNFromONNX with debug enabled shows First Error Node is : Resize_709; the UpsampleLayer in the model uses bicubic interpolation.

Part of the cpp code is as follows:

// Configuration
MNN::BackendConfig backendConfig;
backendConfig.precision = MNN::BackendConfig::Precision_Low;
backendConfig.memory = MNN::BackendConfig::Memory_High;
backendConfig.power = MNN::BackendConfig::Power_Normal;

MNN::ScheduleConfig config;
config.numThread = 4;
config.backendConfig = &backendConfig;
config.type = typeengine; // MNN_FORWARD_CPU or MNN_FORWARD_OPENCL

exe->setGlobalExecutorConfig(typeengine, backendConfig, 4);

rtmgr = std::shared_ptr<Executor::RuntimeManager>(Executor::RuntimeManager::createRuntimeManager(config));

encoder.reset(Module::load(std::vector<std::string>{}, std::vector<std::string>{}, encoder_path, rtmgr));
sam.reset(Module::load(sam_input, sam_output, decoder_path, rtmgr));

rtmgr->setCache(".cachefile");

// Inference
auto input = MNN::Express::_Input({1, inference_size, inference_size, 3}, MNN::Express::NHWC, halide_type_of<float>());

// Copy the preprocessed HWC float image into the input tensor.
::memcpy(input->writeMap<float>(), processed_image.data, inference_size * inference_size * 3 * sizeof(float));

input = _Convert(input, MNN::Express::NCHW);

auto st = std::chrono::system_clock::now();
auto outputs = encoder->onForward({ input });
auto et = std::chrono::system_clock::now();
auto duration = std::chrono::duration_cast<std::chrono::microseconds>(et - st);
printf("# 1. embedding times: %f ms\n", duration.count() * 1e-3);

image_embedding = _Convert(outputs[0], NCHW);

cythamubis avatar Apr 10 '24 02:04 cythamubis

After changing bicubic to bilinear the results are correct; there was a similar earlier issue: #1837 (see the sketch below for a possible isolated repro).
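A minimal standalone sketch for isolating the interp path, assuming the Express _Interp op takes (inputs, widthScale, heightScale, outputWidth, outputHeight, resizeType, alignCorners) with resizeType 2 = bilinear and 3 = cubic; shapes and scales here are illustrative only:

#include <MNN/expr/Expr.hpp>
#include <MNN/expr/ExprCreator.hpp>
using namespace MNN::Express;

// Hypothetical repro: upsample the same input 2x with bilinear (type 2)
// and cubic (type 3) interpolation on the CPU backend, then compare both
// results against a reference such as ONNX Runtime.
void compareInterp() {
    VARP x = _Input({1, 256, 64, 64}, NCHW, halide_type_of<float>());
    // ... fill x->writeMap<float>() with the same data fed to the ONNX model ...
    VARP bilinear = _Interp({x}, 2.0f, 2.0f, 128, 128, /*resizeType*/ 2, /*alignCorners*/ false);
    VARP cubic    = _Interp({x}, 2.0f, 2.0f, 128, 128, /*resizeType*/ 3, /*alignCorners*/ false);
    (void)bilinear;
    (void)cubic;
}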

cythamubis avatar Apr 10 '24 08:04 cythamubis

It is probably a bug in the bicubic implementation of the AVX2 backend; we will check.

jxt1234 avatar Apr 15 '24 12:04 jxt1234

Marking as stale. No activity in 60 days.

github-actions[bot] avatar Jun 15 '24 09:06 github-actions[bot]