
nn: add test_nn_gemm

Open fengyuentau opened this issue 1 year ago • 0 comments

This is the test case I used when doing the Gemm integration into OpenCV DNN. However, I found that the result from ficus nn differs from NumPy's. I double-checked with the latest OpenCV DNN (with the Gemm integration patch applied, of course), and its result matches NumPy's, so something is wrong in ficus. Note that in the failing output below, the incorrect entries differ from the expected values by exactly the corresponding entries of C, which suggests the bias is not being added for part of the output.

$ ./bin/ficus -run test/test_all.fx
Ficus version: 1.0.0-alpha (git commit: <noinfo>)
Platform: Darwin 22.6.0 arm64
C/C++ Compiler: Apple LLVM 15.0.0 (clang-1500.0.40.1)
[ RUN      ] NN.Gemm.basic
Unexpected result of comparison <Actual>  <Expected>:
Actual: [108.0, 90.0, 135.0;
 80.0, 62.0, 130.0]
Expected: [108.0, 90.0, 135.0;
 80.0, 69.0, 139.0]
[     FAIL ] NN.Gemm.basic (1 ms)

[==========] 1 test(s) ran (1 ms)
[  PASSED  ] 0 test(s)
[  FAILED  ] 1 test(s):
NN.Gemm.basic
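To make the pattern in the failure concrete, the element-wise difference between the expected and actual matrices can be checked directly (a quick NumPy sketch using the values printed above; the interpretation that the bias is dropped for part of the second row is my reading, not something confirmed in the ficus code):

```python
import numpy as np

# Values copied from the failing test output above.
actual   = np.array([[108.0, 90.0, 135.0], [80.0, 62.0, 130.0]], dtype=np.float32)
expected = np.array([[108.0, 90.0, 135.0], [80.0, 69.0, 139.0]], dtype=np.float32)
C        = np.array([[5.0, 0.0, 1.0], [8.0, 7.0, 9.0]], dtype=np.float32)

diff = expected - actual
print(diff)  # nonzero only at [1,1] and [1,2], where it equals C[1,1] and C[1,2]
```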

NumPy code:

import numpy as np

M = 2
N = 3
K = 4

A = np.array([7.0, 0.0, 8.0, 4.0, 2.0, 2.0, 6.0, 8.0], dtype=np.float32).reshape(M, K)
B = np.array([5.0, 6.0, 6.0, 3.0, 2.0, 7.0, 8.0, 5.0, 8.0, 1.0, 2.0, 7.0], dtype=np.float32).reshape(K, N)
C = np.array([5.0, 0.0, 1.0, 8.0, 7.0, 9.0], dtype=np.float32).reshape(M, N)

print(np.matmul(A, B) + C)
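For reference, the full ONNX-style Gemm (Y = alpha * op(A) @ op(B) + beta * C) reduces to the matmul above for the parameters used in this test (alpha = beta = 1, no transposes). A hedged reference sketch; `gemm_ref` is my own helper, not part of ficus or OpenCV:

```python
import numpy as np

def gemm_ref(A, B, C=None, alpha=1.0, beta=1.0, trans_a=False, trans_b=False):
    # Reference Gemm per ONNX semantics: Y = alpha * op(A) @ op(B) + beta * C
    if trans_a:
        A = A.T
    if trans_b:
        B = B.T
    Y = alpha * (A @ B)
    if C is not None:
        Y = Y + beta * C  # C broadcasts up to (M, N) if lower-dimensional
    return Y

M, N, K = 2, 3, 4
A = np.array([7, 0, 8, 4, 2, 2, 6, 8], dtype=np.float32).reshape(M, K)
B = np.array([5, 6, 6, 3, 2, 7, 8, 5, 8, 1, 2, 7], dtype=np.float32).reshape(K, N)
C = np.array([5, 0, 1, 8, 7, 9], dtype=np.float32).reshape(M, N)

print(gemm_ref(A, B, C))  # [[108. 90. 135.] [80. 69. 139.]]
```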

OpenCV DNN code in test_onnx_importer:

TEST_P(Test_ONNX_layers, Gemm00) {
    bool trans_a = false;
    bool trans_b = false;
    float alpha = 1.f;
    float beta = 1.f;

    int M = 2, N = 3, K = 4;

    std::vector<float> A_data{7.0, 0.0, 8.0, 4.0, 2.0, 2.0, 6.0, 8.0};
    Mat A(std::vector<int>{M, K}, CV_32FC1, A_data.data());
    std::vector<float> B_data{5.0, 6.0, 6.0, 3.0, 2.0, 7.0, 8.0, 5.0, 8.0, 1.0, 2.0, 7.0};
    Mat B(std::vector<int>{K, N}, CV_32FC1, B_data.data());
    std::vector<float> C_data{5.0, 0.0, 1.0, 8.0, 7.0, 9.0};
    Mat C(std::vector<int>{M, N}, CV_32FC1, C_data.data());

    LayerParams lp;
    lp.type = "Gemm";
    lp.name = "testLayer";
    lp.set("transA", trans_a);
    lp.set("transB", trans_b);
    lp.set("alpha", alpha);
    lp.set("beta", beta);
    lp.set("real_ndims_C", static_cast<int>(2));

    lp.set("constB", true);
    lp.blobs.push_back(B);
    lp.set("have_bias", true);
    lp.set("constC", true);
    lp.blobs.push_back(C);

    Net net;
    int id = net.addLayerToPrev(lp.name, lp.type, lp);
    net.connect(0, 0, id, 0);
    net.setPreferableBackend(DNN_BACKEND_OPENCV);
    net.setPreferableTarget(DNN_TARGET_CPU);

    net.setInput(A);
    Mat out = net.forward();
    std::cout << "out=";
    for (int i = 0; i < out.total(); i++) {
        std::cout << out.at<float>(i) << " ";
    }
    std::cout << std::endl;
}
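When localizing where the bias goes missing in the ficus kernel, a plain triple-loop reference over flat row-major buffers (no BLAS, no broadcasting) may be the easiest thing to diff against. A minimal sketch, mirroring the test data above; this is a debugging aid I wrote, not code from either project:

```python
def naive_gemm(A, B, C, M, N, K):
    # Flat row-major buffers: Y[i*N+j] = sum_k A[i*K+k] * B[k*N+j] + C[i*N+j]
    Y = [0.0] * (M * N)
    for i in range(M):
        for j in range(N):
            acc = 0.0
            for k in range(K):
                acc += A[i * K + k] * B[k * N + j]
            Y[i * N + j] = acc + C[i * N + j]
    return Y

A = [7.0, 0.0, 8.0, 4.0, 2.0, 2.0, 6.0, 8.0]
B = [5.0, 6.0, 6.0, 3.0, 2.0, 7.0, 8.0, 5.0, 8.0, 1.0, 2.0, 7.0]
C = [5.0, 0.0, 1.0, 8.0, 7.0, 9.0]
print(naive_gemm(A, B, C, 2, 3, 4))  # matches NumPy: 108 90 135 80 69 139
```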

fengyuentau · Sep 24 '23 09:09