keras2ncnn
Activation type _hard_swish is not supported yet.
Activation _hard_swish in MobileNetV3_Small.
It's a segmentation model. The model is here!
Cool, let me have a look. It should not be very hard to support hard swish.
Hi. I am not able to find any code related to the _hard_swish activation in Keras/TF. Since hard swish needs parameters to describe its shape, and none of them are described in the layer itself, can you provide the code that generates the model, or at least the code for the activation?
Here is a workaround: you can change it to a relu and manually edit the resulting param like this: https://github.com/Tencent/ncnn/blob/b93775a27273618501a15a235355738cda102a38/benchmark/mobilenet_v3.param#L55
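For reference, a throwaway sketch of that manual edit. It assumes the converter emitted a plain `ReLU` row in the `.param` file and that ncnn's HardSwish layer takes alpha as `0=` and beta as `1=` (the values that appear in the linked mobilenet_v3.param); the layer and blob names below are made up:

```python
def relu_to_hardswish(param_line):
    """Rewrite one ReLU layer row of an ncnn .param file into HardSwish.

    ncnn .param rows are whitespace-separated:
    LayerType LayerName input_count output_count blobs... key=value...
    """
    fields = param_line.split()
    assert fields[0] == "ReLU", "expected a ReLU row"
    fields[0] = "HardSwish"
    # hard_swish(x) = x * clamp(alpha*x + beta, 0, 1) with alpha=1/6, beta=0.5
    fields += ["0=1.666667e-01", "1=5.000000e-01"]
    return " ".join(fields)

# hypothetical row produced by keras2ncnn for the activation layer
line = "ReLU activation_1 1 1 bn1_blob act1_blob"
print(relu_to_hardswish(line))
# -> HardSwish activation_1 1 1 bn1_blob act1_blob 0=1.666667e-01 1=5.000000e-01
```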
def _hard_swish(self, x):
    """Hard swish
    """
    return x * K.relu(x + 3.0, max_value=6.0) / 6.0
I see... yes, I think you are right. I checked the original formula, and I am not able to link the hard swish formula to ncnn's implementation. Let me ask nihui. https://github.com/Tencent/ncnn/blob/master/src/layer/hardswish.cpp
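The link is actually algebraic: x * relu6(x + 3) / 6 equals x * clamp(x/6 + 0.5, 0, 1), which is the x * clamp(alpha*x + beta, 0, 1) form in hardswish.cpp with alpha = 1/6, beta = 0.5. A quick numeric check in plain Python (no Keras needed):

```python
def keras_hard_swish(x):
    # x * relu6(x + 3) / 6, as in the model code above
    relu6 = max(0.0, min(6.0, x + 3.0))
    return x * relu6 / 6.0

def ncnn_hardswish(x, alpha=1.0 / 6.0, beta=0.5):
    # x * clamp(alpha * x + beta, 0, 1), the form used by ncnn's HardSwish layer
    return x * max(0.0, min(1.0, alpha * x + beta))

# identical across the saturated and linear regions
for x in [-4.0, -3.0, -1.5, 0.0, 2.0, 3.0, 5.0]:
    assert abs(keras_hard_swish(x) - ncnn_hardswish(x)) < 1e-12
```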
It should be supported as of d7d158e, but be aware, as also mentioned here: https://github.com/Tencent/ncnn/issues/2826#issue-853650801. Keras and PyTorch have different definitions of hard sigmoid. Keras uses +/- 2.5, but PyTorch uses +/- 3. keras2ncnn defaults to +/- 2.5.
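Concretely, the two definitions only agree at 0 and in the saturated regions; a plain-Python sketch of the difference described in the linked issue:

```python
def keras_hard_sigmoid(x):
    # Keras: clip(0.2 * x + 0.5, 0, 1), saturating at +/- 2.5
    return max(0.0, min(1.0, 0.2 * x + 0.5))

def torch_hard_sigmoid(x):
    # PyTorch: clip(x / 6 + 0.5, 0, 1), saturating at +/- 3
    return max(0.0, min(1.0, x / 6.0 + 0.5))

for x in [-3.0, -2.5, 0.0, 1.0, 2.5, 3.0]:
    print(f"{x:5.1f}  keras={keras_hard_sigmoid(x):.4f}  torch={torch_hard_sigmoid(x):.4f}")
```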
A new problem: how can user-defined functions be supported, similar to how Keras's load function has custom_objects?
    model = k.models.load_model(model_path, compile=False, custom_objects={ 'my_fun': my_fun } )
The model conversion now works fine, but C++ inference hangs outright and then dies (other models with a similar structure convert with your code and run inference successfully), so could something be wrong with this hard_swish?
A new problem: how can user-defined functions be supported, similar to how Keras's load function has custom_objects?
model = k.models.load_model(model_path, compile=False, custom_objects={ 'my_fun': my_fun } )
There is no way to support custom objects, since they are purely Python code.
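The reason is the deserialization mechanism itself: custom_objects is just a name-to-callable lookup consulted at load time, so the function body lives only in the user's Python process and never in the saved model file. A minimal sketch of that mechanism (not Keras's actual code, just an illustration of why a converter has nothing to translate):

```python
# built-in activations a loader (and thus a converter) knows how to map
BUILTINS = {
    "relu": lambda x: max(0.0, x),
    "linear": lambda x: x,
}

def resolve_activation(name, custom_objects=None):
    """Mimic how load_model resolves an activation name from the saved config."""
    if custom_objects and name in custom_objects:
        # arbitrary Python: only the *name* is stored in the model file,
        # so there is no graph or parameters to translate into an ncnn layer
        return custom_objects[name]
    return BUILTINS[name]

my_fun = lambda x: x * 0.5
act = resolve_activation("my_fun", custom_objects={"my_fun": my_fun})
print(act(4.0))                       # 2.0, supplied by the caller's process
print(resolve_activation("relu")(-1.0))  # 0.0, a known built-in
```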
Split                    activation_19_Split                0.00ms    |
Pooling                  global_average_pooling2d_7         0.06ms    |     [  1,   9, 288 *1] -> [ 36 *8]
InnerProduct             dense_13                           0.05ms    |               [288 *1] -> [ 36 *8]
InnerProduct             dense_14                           0.04ms    |               [288 *1] -> [ 36 *8]
HardSigmoid              dense_14_HardSigmoid             106.86ms    |               [288 *1] -> [ 36 *8]
Reshape                  reshape_7                         25.04ms    |               [288 *1] -> [  1,  36 *8]
Eltwise                  multiply_7                        12.54ms    |
Convolution              conv2d_19                          0.75ms    |          [  1, 288 *1] -> [  1,  36,  96 *1]         kernel: 1 x 1     stride: 1 x 1
[Thread 0x7fffe8847700 (LWP 2234353) exited]
[Thread 0x7fffe9048700 (LWP 2234352) exited]
[Thread 0x7fffe9849700 (LWP 2234351) exited]
[Thread 0x7fffea04a700 (LWP 2234350) exited]
[Thread 0x7fffea84b700 (LWP 2234349) exited]
[Thread 0x7fffeb04c700 (LWP 2234348) exited]
[Thread 0x7fffeb84d700 (LWP 2234347) exited]
[Thread 0x7fffec04e700 (LWP 2234346) exited]
[Thread 0x7fffec84f700 (LWP 2234345) exited]
[Thread 0x7fffed050700 (LWP 2234344) exited]
[Thread 0x7fffed851700 (LWP 2234343) exited]
[Thread 0x7fffee052700 (LWP 2234342) exited]
[Thread 0x7fffee853700 (LWP 2234341) exited]
[Thread 0x7fffef054700 (LWP 2234340) exited]
[Thread 0x7fffef855700 (LWP 2234339) exited]
[Thread 0x7ffff0056700 (LWP 2234338) exited]
[Thread 0x7ffff0857700 (LWP 2234337) exited]
[Thread 0x7ffff1058700 (LWP 2234336) exited]
[Thread 0x7ffff1859700 (LWP 2234335) exited]
[Thread 0x7ffff205a700 (LWP 2234334) exited]
[Thread 0x7ffff285b700 (LWP 2234333) exited]
[Thread 0x7ffff305c700 (LWP 2234332) exited]
[Thread 0x7ffff385d700 (LWP 2234331) exited]
[Thread 0x7ffff485f700 (LWP 2234329) exited]
[Thread 0x7ffff5060700 (LWP 2234328) exited]
[Thread 0x7ffff5861700 (LWP 2234327) exited]
[Thread 0x7ffff7865700 (LWP 2234323) exited]
[Thread 0x7ffff405e700 (LWP 2234330) exited]
[Thread 0x7ffff7064700 (LWP 2234324) exited]
[Thread 0x7ffff6863700 (LWP 2234325) exited]
[Thread 0x7ffff6062700 (LWP 2234326) exited]
--Type <RET> for more, q to quit, c to continue without paging--
Program terminated with signal SIGKILL, Killed.
The program no longer exists.
Yes... I think it may be ncnn's issue? Let me ask nihui.
I find a stupid bug in Mul operator, fixed it in e3f90c4. Now I am able to run the model.
Convolution              conv2d_1                           7.63ms    |     [160, 160,   3 *1] -> [ 80,  80,   2 *8]         kernel: 3 x 3     stride: 2 x 2
BatchNorm                batch_normalization_1              0.07ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   2 *8]    
HardSwish                activation_1                       0.39ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   2 *8]    
Convolution              conv2d_2                           1.46ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   2 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_2              0.77ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   2 *8]    
Clip                     activation_2_Clip                  1.31ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   2 *8]    
ReLU                     activation_2                       0.84ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   2 *8]    
find_blob_index_by_name activation_2_Split_blob failed
Split                    activation_2_Split                 0.00ms    |
ConvolutionDepthWise     depthwise_conv2d_1                 0.39ms    |     [ 80,  80,  16 *1] -> [ 40,  40,   2 *8]         kernel: 3 x 3     stride: 2 x 2
BatchNorm                batch_normalization_3              1.18ms    |     [ 40,  40,  16 *1] -> [ 40,  40,   2 *8]    
Clip                     activation_3_Clip                  0.61ms    |     [ 40,  40,  16 *1] -> [ 40,  40,   2 *8]    
ReLU                     activation_3                       1.19ms    |     [ 40,  40,  16 *1] -> [ 40,  40,   2 *8]    
find_blob_index_by_name activation_3_Split_blob failed
Split                    activation_3_Split                 0.00ms    |
Pooling                  global_average_pooling2d_1         0.43ms    |     [ 40,  40,  16 *1] -> [  2 *8]              
InnerProduct             dense_1                            0.27ms    |               [ 16 *1] -> [  2 *8]              
InnerProduct             dense_2                            0.19ms    |               [ 16 *1] -> [  2 *8]              
HardSigmoid              dense_2_HardSigmoid                0.46ms    |               [ 16 *1] -> [  2 *8]              
Reshape                  reshape_1                          0.05ms    |               [ 16 *1] -> [  1,   2 *8]         
BinaryOp                 multiply_1                         0.06ms    |
Convolution              conv2d_3                           1.10ms    |     [ 40,  40,  16 *1] -> [ 40,  40,   2 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_4              2.67ms    |     [ 40,  40,  16 *1] -> [ 40,  40,   2 *8]    
Convolution              conv2d_4                           2.88ms    |     [ 40,  40,  16 *1] -> [ 40,  40,   9 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_5              0.38ms    |     [ 40,  40,  72 *1] -> [ 40,  40,   9 *8]    
Clip                     activation_4_Clip                  0.85ms    |     [ 40,  40,  72 *1] -> [ 40,  40,   9 *8]    
ReLU                     activation_4                       0.13ms    |     [ 40,  40,  72 *1] -> [ 40,  40,   9 *8]    
find_blob_index_by_name activation_4_Split_blob failed
Split                    activation_4_Split                 0.00ms    |
ConvolutionDepthWise     depthwise_conv2d_2                 0.18ms    |     [ 40,  40,  72 *1] -> [ 20,  20,   9 *8]         kernel: 3 x 3     stride: 2 x 2
BatchNorm                batch_normalization_6              0.31ms    |     [ 20,  20,  72 *1] -> [ 20,  20,   9 *8]    
Clip                     activation_5_Clip                  0.05ms    |     [ 20,  20,  72 *1] -> [ 20,  20,   9 *8]    
ReLU                     activation_5                       0.18ms    |     [ 20,  20,  72 *1] -> [ 20,  20,   9 *8]    
Convolution              conv2d_5                           1.18ms    |     [ 20,  20,  72 *1] -> [ 20,  20,   3 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_7              0.04ms    |     [ 20,  20,  24 *1] -> [ 20,  20,   3 *8]    
find_blob_index_by_name batch_normalization_7_Split_blob failed
Split                    batch_normalization_7_Split        0.00ms    |
Convolution              conv2d_6                           4.20ms    |     [ 20,  20,  24 *1] -> [ 20,  20,  11 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_8              0.47ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]    
Clip                     activation_6_Clip                  0.42ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]    
ReLU                     activation_6                       0.16ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]    
ConvolutionDepthWise     depthwise_conv2d_3                 1.96ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]         kernel: 3 x 3     stride: 1 x 1
BatchNorm                batch_normalization_9              0.07ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]    
Clip                     activation_7_Clip                  2.42ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]    
ReLU                     activation_7                       0.08ms    |     [ 20,  20,  88 *1] -> [ 20,  20,  11 *8]    
Convolution              conv2d_7                           4.61ms    |     [ 20,  20,  88 *1] -> [ 20,  20,   3 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_10             0.06ms    |     [ 20,  20,  24 *1] -> [ 20,  20,   3 *8]    
BinaryOp                 add_1                              3.77ms    |
Convolution              conv2d_8                           2.31ms    |     [ 20,  20,  24 *1] -> [ 20,  20,  12 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_11             0.02ms    |     [ 20,  20,  96 *1] -> [ 20,  20,  12 *8]    
HardSwish                activation_8                       0.06ms    |     [ 20,  20,  96 *1] -> [ 20,  20,  12 *8]    
find_blob_index_by_name activation_8_Split_blob failed
Split                    activation_8_Split                 0.00ms    |
ConvolutionDepthWise     depthwise_conv2d_4                 0.72ms    |     [ 20,  20,  96 *1] -> [ 10,  10,  12 *8]         kernel: 5 x 5     stride: 2 x 2
BatchNorm                batch_normalization_12             0.06ms    |     [ 10,  10,  96 *1] -> [ 10,  10,  12 *8]    
HardSwish                activation_9                       0.12ms    |     [ 10,  10,  96 *1] -> [ 10,  10,  12 *8]    
find_blob_index_by_name activation_9_Split_blob failed
Split                    activation_9_Split                 0.00ms    |
Pooling                  global_average_pooling2d_2         0.23ms    |     [ 10,  10,  96 *1] -> [ 12 *8]              
InnerProduct             dense_3                            0.17ms    |               [ 96 *1] -> [ 12 *8]              
InnerProduct             dense_4                            0.10ms    |               [ 96 *1] -> [ 12 *8]              
HardSigmoid              dense_4_HardSigmoid                1.05ms    |               [ 96 *1] -> [ 12 *8]              
Reshape                  reshape_2                          0.27ms    |               [ 96 *1] -> [  1,  12 *8]         
BinaryOp                 multiply_2                         0.39ms    |
Convolution              conv2d_9                           0.80ms    |     [ 10,  10,  96 *1] -> [ 10,  10,   5 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_13             2.06ms    |     [ 10,  10,  40 *1] -> [ 10,  10,   5 *8]    
find_blob_index_by_name batch_normalization_13_Split_blob failed
Split                    batch_normalization_13_Split       0.00ms    |
Convolution              conv2d_10                          7.04ms    |     [ 10,  10,  40 *1] -> [ 10,  10,  30 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_14             0.09ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
HardSwish                activation_10                      0.02ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
ConvolutionDepthWise     depthwise_conv2d_5                 0.18ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]         kernel: 5 x 5     stride: 1 x 1
BatchNorm                batch_normalization_15             0.01ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
HardSwish                activation_11                      0.01ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
find_blob_index_by_name activation_11_Split_blob failed
Split                    activation_11_Split                0.00ms    |
Pooling                  global_average_pooling2d_3         0.36ms    |     [ 10,  10, 240 *1] -> [ 30 *8]              
InnerProduct             dense_5                            5.33ms    |               [240 *1] -> [ 30 *8]              
InnerProduct             dense_6                            0.35ms    |               [240 *1] -> [ 30 *8]              
HardSigmoid              dense_6_HardSigmoid                0.24ms    |               [240 *1] -> [ 30 *8]              
Reshape                  reshape_3                          0.44ms    |               [240 *1] -> [  1,  30 *8]         
BinaryOp                 multiply_3                         0.20ms    |
Convolution              conv2d_11                          0.61ms    |     [ 10,  10, 240 *1] -> [ 10,  10,   5 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_16             0.09ms    |     [ 10,  10,  40 *1] -> [ 10,  10,   5 *8]    
BinaryOp                 add_2                              0.15ms    |
find_blob_index_by_name add_2_Split_blob failed
Split                    add_2_Split                        0.00ms    |
Convolution              conv2d_12                          0.78ms    |     [ 10,  10,  40 *1] -> [ 10,  10,  30 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_17             2.89ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
HardSwish                activation_12                      2.45ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
ConvolutionDepthWise     depthwise_conv2d_6                 0.18ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]         kernel: 5 x 5     stride: 1 x 1
BatchNorm                batch_normalization_18             0.17ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
HardSwish                activation_13                      2.18ms    |     [ 10,  10, 240 *1] -> [ 10,  10,  30 *8]    
find_blob_index_by_name activation_13_Split_blob failed
Split                    activation_13_Split                0.00ms    |
Pooling                  global_average_pooling2d_4         0.65ms    |     [ 10,  10, 240 *1] -> [ 30 *8]              
InnerProduct             dense_7                            0.31ms    |               [240 *1] -> [ 30 *8]              
InnerProduct             dense_8                            0.30ms    |               [240 *1] -> [ 30 *8]              
HardSigmoid              dense_8_HardSigmoid                0.20ms    |               [240 *1] -> [ 30 *8]              
Reshape                  reshape_4                          0.35ms    |               [240 *1] -> [  1,  30 *8]         
BinaryOp                 multiply_4                         0.16ms    |
Convolution              conv2d_13                          5.30ms    |     [ 10,  10, 240 *1] -> [ 10,  10,   5 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_19             0.09ms    |     [ 10,  10,  40 *1] -> [ 10,  10,   5 *8]    
BinaryOp                 add_3                              0.36ms    |
Convolution              conv2d_14                          1.81ms    |     [ 10,  10,  40 *1] -> [ 10,  10,  15 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_20             0.29ms    |     [ 10,  10, 120 *1] -> [ 10,  10,  15 *8]    
HardSwish                activation_14                      0.07ms    |     [ 10,  10, 120 *1] -> [ 10,  10,  15 *8]    
ConvolutionDepthWise     depthwise_conv2d_7                 0.22ms    |     [ 10,  10, 120 *1] -> [ 10,  10,  15 *8]         kernel: 5 x 5     stride: 1 x 1
BatchNorm                batch_normalization_21             0.36ms    |     [ 10,  10, 120 *1] -> [ 10,  10,  15 *8]    
HardSwish                activation_15                      0.25ms    |     [ 10,  10, 120 *1] -> [ 10,  10,  15 *8]    
find_blob_index_by_name activation_15_Split_blob failed
Split                    activation_15_Split                0.00ms    |
Pooling                  global_average_pooling2d_5         0.80ms    |     [ 10,  10, 120 *1] -> [ 15 *8]              
InnerProduct             dense_9                            2.61ms    |               [120 *1] -> [ 15 *8]              
InnerProduct             dense_10                           2.11ms    |               [120 *1] -> [ 15 *8]              
HardSigmoid              dense_10_HardSigmoid               3.07ms    |               [120 *1] -> [ 15 *8]              
Reshape                  reshape_5                          3.65ms    |               [120 *1] -> [  1,  15 *8]         
BinaryOp                 multiply_5                         0.10ms    |
Convolution              conv2d_15                          0.60ms    |     [ 10,  10, 120 *1] -> [ 10,  10,   6 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_22             0.04ms    |     [ 10,  10,  48 *1] -> [ 10,  10,   6 *8]    
find_blob_index_by_name batch_normalization_22_Split_blob failed
Split                    batch_normalization_22_Split       0.00ms    |
Convolution              conv2d_16                          0.60ms    |     [ 10,  10,  48 *1] -> [ 10,  10,  18 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_23             0.15ms    |     [ 10,  10, 144 *1] -> [ 10,  10,  18 *8]    
HardSwish                activation_16                      0.21ms    |     [ 10,  10, 144 *1] -> [ 10,  10,  18 *8]    
ConvolutionDepthWise     depthwise_conv2d_8                 0.18ms    |     [ 10,  10, 144 *1] -> [ 10,  10,  18 *8]         kernel: 5 x 5     stride: 1 x 1
BatchNorm                batch_normalization_24             0.05ms    |     [ 10,  10, 144 *1] -> [ 10,  10,  18 *8]    
HardSwish                activation_17                      0.01ms    |     [ 10,  10, 144 *1] -> [ 10,  10,  18 *8]    
find_blob_index_by_name activation_17_Split_blob failed
Split                    activation_17_Split                0.00ms    |
Pooling                  global_average_pooling2d_6         0.39ms    |     [ 10,  10, 144 *1] -> [ 18 *8]              
InnerProduct             dense_11                           0.41ms    |               [144 *1] -> [ 18 *8]              
InnerProduct             dense_12                           0.37ms    |               [144 *1] -> [ 18 *8]              
HardSigmoid              dense_12_HardSigmoid               0.28ms    |               [144 *1] -> [ 18 *8]              
Reshape                  reshape_6                          0.09ms    |               [144 *1] -> [  1,  18 *8]         
BinaryOp                 multiply_6                         0.04ms    |
Convolution              conv2d_17                          0.64ms    |     [ 10,  10, 144 *1] -> [ 10,  10,   6 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_25             0.01ms    |     [ 10,  10,  48 *1] -> [ 10,  10,   6 *8]    
BinaryOp                 add_4                              0.06ms    |
find_blob_index_by_name add_4_Split_blob failed
Split                    add_4_Split                        0.00ms    |
Convolution              conv2d_18                          1.06ms    |     [ 10,  10,  48 *1] -> [ 10,  10,  36 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_26             0.05ms    |     [ 10,  10, 288 *1] -> [ 10,  10,  36 *8]    
HardSwish                activation_18                      0.15ms    |     [ 10,  10, 288 *1] -> [ 10,  10,  36 *8]    
ConvolutionDepthWise     depthwise_conv2d_9                 4.04ms    |     [ 10,  10, 288 *1] -> [  5,   5,  36 *8]         kernel: 5 x 5     stride: 2 x 2
BatchNorm                batch_normalization_27             2.30ms    |     [  5,   5, 288 *1] -> [  5,   5,  36 *8]    
HardSwish                activation_19                      0.55ms    |     [  5,   5, 288 *1] -> [  5,   5,  36 *8]    
find_blob_index_by_name activation_19_Split_blob failed
Split                    activation_19_Split                0.00ms    |
Pooling                  global_average_pooling2d_7         0.62ms    |     [  5,   5, 288 *1] -> [ 36 *8]              
InnerProduct             dense_13                           0.06ms    |               [288 *1] -> [ 36 *8]              
InnerProduct             dense_14                           0.11ms    |               [288 *1] -> [ 36 *8]              
HardSigmoid              dense_14_HardSigmoid               0.20ms    |               [288 *1] -> [ 36 *8]              
Reshape                  reshape_7                          0.21ms    |               [288 *1] -> [  1,  36 *8]         
BinaryOp                 multiply_7                         0.08ms    |
Convolution              conv2d_19                          0.45ms    |     [  5,   5, 288 *1] -> [  5,   5,  12 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_28             0.06ms    |     [  5,   5,  96 *1] -> [  5,   5,  12 *8]    
Convolution              conv_pw_mobile_bottleneck          0.76ms    |     [  5,   5,  96 *1] -> [  5,   5,   4 *8]         kernel: 1 x 1     stride: 1 x 1
Interp                   up_sampling2d_1                    0.14ms    |     [  5,   5,  32 *1] -> [ 10,  10,  32 *1]    
Convolution              conv_pw_upblock_4                  0.62ms    |     [ 10,  10,  48 *1] -> [ 10,  10,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BinaryOp                 add_5                              0.19ms    |
ConvolutionDepthWise     sep_upblock_4b_dw                 19.96ms    |     [ 10,  10,  32 *1] -> [ 10,  10,   4 *8]         kernel: 3 x 3     stride: 1 x 1
Convolution              sep_upblock_4b                     1.04ms    |     [ 10,  10,  32 *1] -> [ 10,  10,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_29             0.03ms    |     [ 10,  10,  32 *1] -> [ 10,  10,   4 *8]    
ReLU                     leaky_re_lu_1                      0.15ms    |     [ 10,  10,  32 *1] -> [ 10,  10,   4 *8]    
Interp                   up_sampling2d_2                    0.32ms    |     [ 10,  10,  32 *1] -> [ 20,  20,  32 *1]    
Convolution              conv_pw_upblock_3                  0.33ms    |     [ 20,  20,  96 *1] -> [ 20,  20,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BinaryOp                 add_6                              0.24ms    |
ConvolutionDepthWise     sep_upblock_3b_dw                  0.64ms    |     [ 20,  20,  32 *1] -> [ 20,  20,   4 *8]         kernel: 3 x 3     stride: 1 x 1
Convolution              sep_upblock_3b                     1.72ms    |     [ 20,  20,  32 *1] -> [ 20,  20,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_30             0.11ms    |     [ 20,  20,  32 *1] -> [ 20,  20,   4 *8]    
ReLU                     leaky_re_lu_2                      0.02ms    |     [ 20,  20,  32 *1] -> [ 20,  20,   4 *8]    
Interp                   up_sampling2d_3                    0.30ms    |     [ 20,  20,  32 *1] -> [ 40,  40,  32 *1]    
Convolution              conv_pw_upblock_2                  0.75ms    |     [ 40,  40,  72 *1] -> [ 40,  40,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BinaryOp                 add_7                              0.30ms    |
ConvolutionDepthWise     sep_upblock_2b_dw                  0.14ms    |     [ 40,  40,  32 *1] -> [ 40,  40,   4 *8]         kernel: 3 x 3     stride: 1 x 1
Convolution              sep_upblock_2b                     0.71ms    |     [ 40,  40,  32 *1] -> [ 40,  40,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_31             0.06ms    |     [ 40,  40,  32 *1] -> [ 40,  40,   4 *8]    
ReLU                     leaky_re_lu_3                      0.01ms    |     [ 40,  40,  32 *1] -> [ 40,  40,   4 *8]    
Interp                   up_sampling2d_4                    0.21ms    |     [ 40,  40,  32 *1] -> [ 80,  80,  32 *1]    
Convolution              conv_pw_upblock_1                  1.46ms    |     [ 80,  80,  16 *1] -> [ 80,  80,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BinaryOp                 add_8                              5.38ms    |
ConvolutionDepthWise     sep_upblock_1b_dw                  0.59ms    |     [ 80,  80,  32 *1] -> [ 80,  80,   4 *8]         kernel: 3 x 3     stride: 1 x 1
Convolution              sep_upblock_1b                     1.50ms    |     [ 80,  80,  32 *1] -> [ 80,  80,   4 *8]         kernel: 1 x 1     stride: 1 x 1
BatchNorm                batch_normalization_32             0.06ms    |     [ 80,  80,  32 *1] -> [ 80,  80,   4 *8]    
ReLU                     leaky_re_lu_4                      0.18ms    |     [ 80,  80,  32 *1] -> [ 80,  80,   4 *8]    
Interp                   up_sampling2d_5                    4.40ms    |     [ 80,  80,  32 *1] -> [160, 160,  32 *1]    
Convolution              final_layer                        7.07ms    |     [160, 160,  32 *1] -> [160, 160,   2 *1]         kernel: 1 x 1     stride: 1 x 1
Softmax                  final_layer_Softmax                4.87ms    |     [160, 160,   2 *1] -> [160, 160,   2 *1]    
Ncnn result error!
result_nomatch.zip
Can you provide the code for running the model? Since you have custom objects, I am not able to run the code through the keras2ncnn debugger :(
No custom objects; result_nomatch.zip contains inference code in Python and C++.
Do you need the model structure code?
Hmmm... I can get it running by inserting _relu6 and _hard_swish. Here is the output; it seems like we have some issue with the activation :p
==================================
Layer Name: conv2d_1, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->1.349 ncnn->1.349 	Min: keras->-1.747 ncnn->-1.747
Mean: 	keras->-0.079 ncnn->-0.079 	Var: keras->0.368 ncnn->0.368
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.273 0.527 0.297 0.531 0.457 0.368 0.561 0.567 0.475 0.746]
Ncnn Feature Map: 	[0.273 0.527 0.297 0.531 0.457 0.368 0.561 0.567 0.475 0.746]
==================================
Layer Name: batch_normalization_1, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->8.696 ncnn->8.696 	Min: keras->-7.951 ncnn->-7.951
Mean: 	keras->0.137 ncnn->0.137 	Var: keras->1.536 ncnn->1.536
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.469  0.387 -0.39   0.403  0.151 -0.149  0.505  0.525  0.212  1.129]
Ncnn Feature Map: 	[-0.469  0.387 -0.39   0.403  0.151 -0.149  0.505  0.525  0.212  1.129]
==================================
Layer Name: activation_1, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->8.696 ncnn->8.696 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.430 ncnn->0.430 	Var: keras->0.991 ncnn->0.991
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.198  0.218 -0.17   0.228  0.08  -0.071  0.295  0.308  0.113  0.777]
Ncnn Feature Map: 	[-0.198  0.218 -0.17   0.228  0.08  -0.071  0.295  0.308  0.113  0.777]
==================================
Layer Name: conv2d_2, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->12.417 ncnn->12.417 	Min: keras->-10.223 ncnn->-10.223
Mean: 	keras->-0.244 ncnn->-0.244 	Var: keras->1.508 ncnn->1.508
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.838 -0.088  0.228 -0.255  0.213 -0.169  0.52  -2.377 -4.959  1.411]
Ncnn Feature Map: 	[-0.838 -0.088  0.228 -0.255  0.213 -0.169  0.52  -2.377 -4.959  1.411]
==================================
Layer Name: batch_normalization_2, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->11.669 ncnn->11.669 	Min: keras->-11.529 ncnn->-11.529
Mean: 	keras->0.005 ncnn->0.005 	Var: keras->1.396 ncnn->1.396
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.257  0.367  0.631  0.228  0.618  0.301  0.874 -1.538 -3.688  1.616]
Ncnn Feature Map: 	[-0.257  0.367  0.631  0.228  0.618  0.301  0.874 -1.538 -3.688  1.616]
activation_2_Clip
==================================
Layer Name: activation_2, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->6.000 ncnn->6.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.507 ncnn->0.507 	Var: keras->0.796 ncnn->0.796
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.    0.367 0.631 0.228 0.618 0.301 0.874 0.    0.    1.616]
Ncnn Feature Map: 	[0.    0.367 0.631 0.228 0.618 0.301 0.874 0.    0.    1.616]
activation_2_Split
==================================
Layer Name: depthwise_conv2d_1, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->3.493 ncnn->3.493 	Min: keras->-5.866 ncnn->-5.866
Mean: 	keras->0.005 ncnn->0.005 	Var: keras->0.597 ncnn->0.597
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.043 -0.006 -0.226 -0.068  0.132  0.374 -0.162  0.171 -0.122  0.341]
Ncnn Feature Map: 	[ 0.043 -0.006 -0.226 -0.068  0.132  0.374 -0.162  0.171 -0.122  0.341]
==================================
Layer Name: batch_normalization_3, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->15.202 ncnn->15.202 	Min: keras->-14.693 ncnn->-14.693
Mean: 	keras->0.299 ncnn->0.299 	Var: keras->2.209 ncnn->2.209
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.325 -0.415 -3.737 -1.346  1.681  5.342 -2.774  2.269 -2.157  4.837]
Ncnn Feature Map: 	[ 0.325 -0.415 -3.737 -1.346  1.681  5.342 -2.774  2.269 -2.157  4.837]
activation_3_Clip
==================================
Layer Name: activation_3, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->6.000 ncnn->6.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.908 ncnn->0.908 	Var: keras->1.304 ncnn->1.304
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.325 0.    0.    0.    1.681 5.342 0.    2.269 0.    4.837]
Ncnn Feature Map: 	[0.325 0.    0.    0.    1.681 5.342 0.    2.269 0.    4.837]
activation_3_Split
==================================
Layer Name: global_average_pooling2d_1, Layer Shape: keras->(1, 16) ncnn->(1, 1, 16)
Max: 	keras->1.879 ncnn->1.879 	Min: keras->0.187 ncnn->0.187
Mean: 	keras->0.908 ncnn->0.908 	Var: keras->0.483 ncnn->0.483
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.304 1.034 1.778 0.74  0.301 0.97  0.187 1.34  0.359 0.825]
Ncnn Feature Map: 	[1.304 1.034 1.778 0.74  0.301 0.97  0.187 1.34  0.359 0.825]
Top-k:
Keras Top-k: 	13:1.879, 2:1.778, 7:1.340, 0:1.304, 10:1.040
ncnn Top-k: 	13:1.879, 2:1.778, 7:1.340, 0:1.304, 10:1.040
==================================
Layer Name: dense_1, Layer Shape: keras->(1, 16) ncnn->(1, 1, 16)
Max: 	keras->2.866 ncnn->2.866 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->1.122 ncnn->1.122 	Var: keras->1.096 ncnn->1.096
Cosine Similarity: 0.00000
Keras Feature Map: 	[2.866 2.04  1.511 0.    1.259 1.065 0.    2.178 2.643 2.755]
Ncnn Feature Map: 	[2.866 2.04  1.511 0.    1.259 1.065 0.    2.178 2.643 2.755]
Top-k:
Keras Top-k: 	0:2.866, 9:2.755, 8:2.643, 7:2.178, 1:2.040
ncnn Top-k: 	0:2.866, 9:2.755, 8:2.643, 7:2.178, 1:2.040
==================================
Layer Name: dense_2, Layer Shape: keras->(1, 16) ncnn->(1, 1, 16)
Max: 	keras->0.444 ncnn->0.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.184 ncnn->0.000 	Var: keras->0.140 ncnn->0.000
Cosine Similarity: nan
Keras Feature Map: 	[0.    0.248 0.169 0.063 0.444 0.231 0.045 0.    0.247 0.255]
Ncnn Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Top-k:
Keras Top-k: 	4:0.444, 12:0.437, 10:0.358, 9:0.255, 1:0.248
ncnn Top-k: 	15:0.000, 14:0.000, 13:0.000, 12:0.000, 11:0.000
dense_2_HardSigmoid
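(Note: this is the first layer where the two runtimes diverge — ncnn goes all-zero right at `dense_2_HardSigmoid`. That is consistent with the hard-sigmoid definition mismatch mentioned above: Keras clips `0.2*x + 0.5` (saturating at ±2.5), while ncnn/PyTorch clip `x/6 + 0.5` (saturating at ±3). A minimal NumPy sketch of the two variants — helper names are mine, not from either library:)

```python
import numpy as np

def hard_sigmoid_keras(x):
    # Keras-style hard sigmoid: clip(0.2*x + 0.5, 0, 1), saturates at +/-2.5
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

def hard_sigmoid_ncnn(x):
    # ncnn/PyTorch-style hard sigmoid: clip(x/6 + 0.5, 0, 1), saturates at +/-3
    return np.clip(x / 6.0 + 0.5, 0.0, 1.0)

x = np.array([-3.0, -2.5, 0.0, 2.5, 3.0])
print(hard_sigmoid_keras(x))  # saturates already at +/-2.5
print(hard_sigmoid_ncnn(x))   # still in the linear region at +/-2.5
```

If the converter emits the wrong variant (or the wrong clipping parameters in the param file), every squeeze-excite branch downstream is scaled incorrectly, which matches the growing mismatch in the layers that follow.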
==================================
Layer Name: reshape_1, Layer Shape: keras->(1, 1, 1, 16) ncnn->(1, 16, 1)
Max: 	keras->0.444 ncnn->0.455 	Min: keras->0.000 ncnn->0.455
Mean: 	keras->0.184 ncnn->0.455 	Var: keras->0.140 ncnn->0.000
Cosine Similarity: 0.20266
Keras Feature Map: 	[0.]
Ncnn Feature Map: 	[0.455 0.455 0.455 0.455 0.455 0.455 0.455 0.455 0.455 0.455]
==================================
Layer Name: multiply_1, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->1.491 ncnn->2.727 	Min: keras->0.000 ncnn->-0.000
Mean: 	keras->0.147 ncnn->0.015 	Var: keras->0.231 ncnn->0.143
Cosine Similarity: 0.85162
Keras Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Ncnn Feature Map: 	[ 0.148  0.     0.     0.     0.    -0.     0.     0.     0.     0.   ]
==================================
Layer Name: conv2d_3, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->1.563 ncnn->3.544 	Min: keras->-1.326 ncnn->-2.929
Mean: 	keras->-0.054 ncnn->-0.000 	Var: keras->0.331 ncnn->0.192
Cosine Similarity: 0.84233
Keras Feature Map: 	[-0.085 -0.094 -0.159 -0.157 -0.55  -0.474 -0.494 -0.339  0.021 -0.043]
Ncnn Feature Map: 	[-0.767  0.282 -0.002 -0.002 -0.002 -0.002 -0.002 -0.002 -0.002 -0.002]
==================================
Layer Name: batch_normalization_4, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->3.067 ncnn->7.158 	Min: keras->-2.500 ncnn->-7.686
Mean: 	keras->0.088 ncnn->0.239 	Var: keras->0.599 ncnn->0.756
Cosine Similarity: 0.84810
Keras Feature Map: 	[ 0.572  0.552  0.397  0.402 -0.529 -0.35  -0.396 -0.03   0.824  0.674]
Ncnn Feature Map: 	[-1.045  1.445  0.769  0.769  0.769  0.769  0.769  0.769  0.769  0.769]
==================================
Layer Name: conv2d_4, Layer Shape: keras->(1, 40, 40, 72) ncnn->(72, 40, 40)
Max: 	keras->3.235 ncnn->7.332 	Min: keras->-3.489 ncnn->-8.621
Mean: 	keras->0.042 ncnn->0.269 	Var: keras->0.595 ncnn->0.757
Cosine Similarity: 0.87357
Keras Feature Map: 	[ 0.18   0.419 -0.184  0.568 -0.607  0.934 -0.008  0.865  0.367 -0.19 ]
Ncnn Feature Map: 	[-1.083 -0.048  1.029  1.029  1.029  1.029  1.029  1.029  1.029  1.029]
==================================
Layer Name: batch_normalization_5, Layer Shape: keras->(1, 40, 40, 72) ncnn->(72, 40, 40)
Max: 	keras->2.372 ncnn->5.813 	Min: keras->-3.108 ncnn->-7.594
Mean: 	keras->0.051 ncnn->0.235 	Var: keras->0.469 ncnn->0.546
Cosine Similarity: 0.79870
Keras Feature Map: 	[-0.068  0.1   -0.324  0.205 -0.622  0.462 -0.2    0.414  0.063 -0.329]
Ncnn Feature Map: 	[-0.956 -0.229  0.529  0.529  0.529  0.529  0.529  0.529  0.529  0.529]
activation_4_Clip
==================================
Layer Name: activation_4, Layer Shape: keras->(1, 40, 40, 72) ncnn->(72, 40, 40)
Max: 	keras->2.372 ncnn->5.813 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.209 ncnn->0.355 	Var: keras->0.280 ncnn->0.354
Cosine Similarity: 0.48401
Keras Feature Map: 	[0.    0.1   0.    0.205 0.    0.462 0.    0.414 0.063 0.   ]
Ncnn Feature Map: 	[0.    0.    0.529 0.529 0.529 0.529 0.529 0.529 0.529 0.529]
activation_4_Split
==================================
Layer Name: depthwise_conv2d_2, Layer Shape: keras->(1, 20, 20, 72) ncnn->(72, 20, 20)
Max: 	keras->0.717 ncnn->1.395 	Min: keras->-0.813 ncnn->-1.729
Mean: 	keras->-0.001 ncnn->0.037 	Var: keras->0.165 ncnn->0.251
Cosine Similarity: 0.41442
Keras Feature Map: 	[ 0.037  0.094  0.093 -0.019 -0.143  0.075 -0.019  0.045  0.069 -0.034]
Ncnn Feature Map: 	[-0.105 -0.011 -0.011 -0.011 -0.011 -0.011 -0.011 -0.011 -0.011 -0.011]
==================================
Layer Name: batch_normalization_6, Layer Shape: keras->(1, 20, 20, 72) ncnn->(72, 20, 20)
Max: 	keras->4.217 ncnn->6.105 	Min: keras->-4.771 ncnn->-7.202
Mean: 	keras->0.017 ncnn->0.187 	Var: keras->0.652 ncnn->0.886
Cosine Similarity: 0.83393
Keras Feature Map: 	[ 0.341  1.237  1.209 -0.533 -2.466  0.928 -0.525  0.467  0.838 -0.763]
Ncnn Feature Map: 	[-1.873 -0.406 -0.406 -0.406 -0.406 -0.406 -0.406 -0.406 -0.406 -0.406]
activation_5_Clip
==================================
Layer Name: activation_5, Layer Shape: keras->(1, 20, 20, 72) ncnn->(72, 20, 20)
Max: 	keras->4.217 ncnn->6.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.262 ncnn->0.423 	Var: keras->0.379 ncnn->0.585
Cosine Similarity: 0.57143
Keras Feature Map: 	[0.341 1.237 1.209 0.    0.    0.928 0.    0.467 0.838 0.   ]
Ncnn Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
==================================
Layer Name: conv2d_5, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->2.272 ncnn->4.549 	Min: keras->-2.808 ncnn->-5.844
Mean: 	keras->-0.136 ncnn->-0.277 	Var: keras->0.747 ncnn->1.455
Cosine Similarity: 0.55830
Keras Feature Map: 	[-0.868 -0.985 -0.853 -2.025 -1.587 -1.621 -0.921 -0.725  0.099 -1.392]
Ncnn Feature Map: 	[-0.468 -1.649 -1.649 -1.649 -1.649 -1.649 -1.649 -1.649 -1.649 -1.649]
==================================
Layer Name: batch_normalization_7, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->1.903 ncnn->3.072 	Min: keras->-2.691 ncnn->-2.862
Mean: 	keras->-0.022 ncnn->-0.092 	Var: keras->0.538 ncnn->0.723
Cosine Similarity: 1.07889
Keras Feature Map: 	[-0.094 -0.169 -0.084 -0.83  -0.552 -0.573 -0.128 -0.003  0.521 -0.428]
Ncnn Feature Map: 	[ 0.16  -0.591 -0.591 -0.591 -0.591 -0.591 -0.591 -0.591 -0.591 -0.591]
batch_normalization_7_Split
==================================
Layer Name: conv2d_6, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.724 ncnn->4.336 	Min: keras->-2.292 ncnn->-4.941
Mean: 	keras->0.219 ncnn->0.093 	Var: keras->0.609 ncnn->0.895
Cosine Similarity: 1.01601
Keras Feature Map: 	[ 0.664  0.039  0.752  0.964  0.436 -0.067  0.377  0.277 -0.266  0.023]
Ncnn Feature Map: 	[-1.711  1.012  1.012  1.012  1.012  1.012  1.012  1.012  1.012  1.012]
==================================
Layer Name: batch_normalization_8, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.595 ncnn->3.739 	Min: keras->-2.052 ncnn->-4.448
Mean: 	keras->0.289 ncnn->0.142 	Var: keras->0.550 ncnn->0.730
Cosine Similarity: 0.96693
Keras Feature Map: 	[ 0.74   0.134  0.826  1.032  0.519  0.031  0.462  0.365 -0.162  0.118]
Ncnn Feature Map: 	[-1.564  1.079  1.079  1.079  1.079  1.079  1.079  1.079  1.079  1.079]
activation_6_Clip
==================================
Layer Name: activation_6, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.595 ncnn->3.739 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.394 ncnn->0.378 	Var: keras->0.410 ncnn->0.442
Cosine Similarity: 0.57741
Keras Feature Map: 	[0.74  0.134 0.826 1.032 0.519 0.031 0.462 0.365 0.    0.118]
Ncnn Feature Map: 	[0.    1.079 1.079 1.079 1.079 1.079 1.079 1.079 1.079 1.079]
==================================
Layer Name: depthwise_conv2d_3, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->0.873 ncnn->1.541 	Min: keras->-1.067 ncnn->-1.110
Mean: 	keras->-0.028 ncnn->-0.025 	Var: keras->0.192 ncnn->0.186
Cosine Similarity: 0.63346
Keras Feature Map: 	[ 0.036 -0.011  0.189 -0.088 -0.257 -0.096  0.082 -0.078 -0.062  0.206]
Ncnn Feature Map: 	[ 0.183  0.02  -0.096 -0.096 -0.096 -0.096 -0.096 -0.096 -0.096 -0.096]
==================================
Layer Name: batch_normalization_9, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.790 ncnn->4.614 	Min: keras->-4.029 ncnn->-3.684
Mean: 	keras->-0.024 ncnn->-0.047 	Var: keras->0.626 ncnn->0.633
Cosine Similarity: 0.78929
Keras Feature Map: 	[ 0.176 -0.06   0.953 -0.452 -1.315 -0.496  0.41  -0.404 -0.322  1.041]
Ncnn Feature Map: 	[ 0.923  0.097 -0.497 -0.497 -0.497 -0.497 -0.497 -0.497 -0.497 -0.497]
activation_7_Clip
==================================
Layer Name: activation_7, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.790 ncnn->4.614 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.232 ncnn->0.202 	Var: keras->0.348 ncnn->0.387
Cosine Similarity: 0.55698
Keras Feature Map: 	[0.176 0.    0.953 0.    0.    0.    0.41  0.    0.    1.041]
Ncnn Feature Map: 	[0.923 0.097 0.    0.    0.    0.    0.    0.    0.    0.   ]
==================================
Layer Name: conv2d_7, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->3.889 ncnn->3.807 	Min: keras->-2.990 ncnn->-3.163
Mean: 	keras->-0.107 ncnn->-0.157 	Var: keras->0.735 ncnn->0.862
Cosine Similarity: 0.65611
Keras Feature Map: 	[-0.909 -0.839 -1.203  0.136 -1.033 -0.016 -0.018  0.113  0.438  0.264]
Ncnn Feature Map: 	[ 0.427  0.314 -0.004 -0.004 -0.004 -0.004 -0.004 -0.004 -0.004 -0.004]
==================================
Layer Name: batch_normalization_10, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->2.404 ncnn->2.792 	Min: keras->-2.459 ncnn->-2.485
Mean: 	keras->-0.139 ncnn->-0.166 	Var: keras->0.601 ncnn->0.737
Cosine Similarity: 0.54959
Keras Feature Map: 	[-0.128 -0.068 -0.381  0.773 -0.235  0.641  0.64   0.753  1.033  0.883]
Ncnn Feature Map: 	[1.024 0.926 0.652 0.652 0.652 0.652 0.652 0.652 0.652 0.652]
==================================
Layer Name: add_1, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->3.039 ncnn->3.681 	Min: keras->-3.317 ncnn->-3.609
Mean: 	keras->-0.161 ncnn->-0.258 	Var: keras->0.886 ncnn->0.921
Cosine Similarity: 0.71792
Keras Feature Map: 	[-0.222 -0.236 -0.465 -0.058 -0.787  0.068  0.512  0.75   1.554  0.455]
Ncnn Feature Map: 	[1.184 0.335 0.061 0.061 0.061 0.061 0.061 0.061 0.061 0.061]
==================================
Layer Name: conv2d_8, Layer Shape: keras->(1, 20, 20, 96) ncnn->(96, 20, 20)
Max: 	keras->4.521 ncnn->4.625 	Min: keras->-3.424 ncnn->-4.257
Mean: 	keras->0.315 ncnn->0.241 	Var: keras->0.967 ncnn->1.070
Cosine Similarity: 0.61364
Keras Feature Map: 	[ 1.64   1.558  0.992 -0.494  0.449  1.736  2.111  2.111  2.707  1.929]
Ncnn Feature Map: 	[1.605 2.614 1.828 1.828 1.828 1.828 1.828 1.828 1.828 1.828]
==================================
Layer Name: batch_normalization_11, Layer Shape: keras->(1, 20, 20, 96) ncnn->(96, 20, 20)
Max: 	keras->2.817 ncnn->2.564 	Min: keras->-2.428 ncnn->-2.765
Mean: 	keras->-0.062 ncnn->-0.095 	Var: keras->0.577 ncnn->0.622
Cosine Similarity: 0.61508
Keras Feature Map: 	[ 0.646  0.602  0.297 -0.503  0.005  0.697  0.9    0.899  1.22   0.802]
Ncnn Feature Map: 	[0.627 1.17  0.747 0.747 0.747 0.747 0.747 0.747 0.747 0.747]
==================================
Layer Name: activation_8, Layer Shape: keras->(1, 20, 20, 96) ncnn->(96, 20, 20)
Max: 	keras->2.731 ncnn->2.378 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.025 ncnn->0.018 	Var: keras->0.287 ncnn->0.294
Cosine Similarity: 0.62037
Keras Feature Map: 	[ 0.393  0.361  0.163 -0.209  0.002  0.43   0.585  0.584  0.858  0.508]
Ncnn Feature Map: 	[0.379 0.813 0.467 0.467 0.467 0.467 0.467 0.467 0.467 0.467]
activation_8_Split
==================================
Layer Name: depthwise_conv2d_4, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->1.250 ncnn->1.002 	Min: keras->-1.033 ncnn->-1.064
Mean: 	keras->0.014 ncnn->0.017 	Var: keras->0.253 ncnn->0.288
Cosine Similarity: 0.53885
Keras Feature Map: 	[ 0.583 -0.209 -0.053  0.1    0.353  0.096  0.253  0.113 -0.061  0.236]
Ncnn Feature Map: 	[ 0.451  0.109  0.06   0.06   0.06   0.06   0.06   0.06   0.127 -0.109]
==================================
Layer Name: batch_normalization_12, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->3.476 ncnn->2.854 	Min: keras->-3.249 ncnn->-3.294
Mean: 	keras->0.039 ncnn->0.036 	Var: keras->0.605 ncnn->0.681
Cosine Similarity: 0.58202
Keras Feature Map: 	[ 1.84  -0.872 -0.337  0.184  1.053  0.171  0.708  0.23  -0.365  0.652]
Ncnn Feature Map: 	[ 1.387  0.216  0.047  0.047  0.047  0.047  0.047  0.047  0.278 -0.53 ]
==================================
Layer Name: activation_9, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->3.476 ncnn->2.784 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.081 ncnn->0.095 	Var: keras->0.358 ncnn->0.414
Cosine Similarity: 0.50234
Keras Feature Map: 	[ 1.485 -0.309 -0.149  0.097  0.711  0.09   0.437  0.124 -0.16   0.397]
Ncnn Feature Map: 	[ 1.014  0.116  0.024  0.024  0.024  0.024  0.024  0.024  0.152 -0.218]
activation_9_Split
==================================
Layer Name: global_average_pooling2d_2, Layer Shape: keras->(1, 96) ncnn->(1, 1, 96)
Max: 	keras->1.469 ncnn->1.668 	Min: keras->-0.300 ncnn->-0.277
Mean: 	keras->0.081 ncnn->0.095 	Var: keras->0.283 ncnn->0.337
Cosine Similarity: 0.36094
Keras Feature Map: 	[-0.    -0.051 -0.075  0.079 -0.171 -0.054 -0.042 -0.222  0.719 -0.257]
Ncnn Feature Map: 	[-0.048 -0.01  -0.101  0.083 -0.088 -0.139 -0.086 -0.09   0.159 -0.156]
Top-k:
Keras Top-k: 	51:1.469, 80:0.972, 88:0.859, 8:0.719, 36:0.640
ncnn Top-k: 	51:1.668, 53:1.538, 80:0.920, 14:0.865, 82:0.800
==================================
Layer Name: dense_3, Layer Shape: keras->(1, 96) ncnn->(1, 1, 96)
Max: 	keras->1.656 ncnn->1.847 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.170 ncnn->0.280 	Var: keras->0.349 ncnn->0.500
Cosine Similarity: 0.12646
Keras Feature Map: 	[0.    0.    0.    0.645 0.    0.    0.212 0.85  1.151 0.   ]
Ncnn Feature Map: 	[0.    0.    0.    1.086 0.    0.    1.206 1.352 1.847 0.   ]
Top-k:
Keras Top-k: 	49:1.656, 27:1.375, 8:1.151, 67:1.130, 72:0.977
ncnn Top-k: 	8:1.847, 67:1.826, 88:1.524, 72:1.515, 27:1.475
==================================
Layer Name: dense_4, Layer Shape: keras->(1, 96) ncnn->(1, 1, 96)
Max: 	keras->1.000 ncnn->4.878 	Min: keras->0.017 ncnn->0.000
Mean: 	keras->0.643 ncnn->1.499 	Var: keras->0.230 ncnn->1.369
Cosine Similarity: 0.11675
Keras Feature Map: 	[0.484 0.804 0.896 0.017 0.127 0.605 0.735 0.521 0.778 0.656]
Ncnn Feature Map: 	[0.05  1.56  2.878 0.    0.    0.73  2.092 0.374 3.771 0.571]
Top-k:
Keras Top-k: 	50:1.000, 90:1.000, 20:1.000, 46:1.000, 91:0.990
ncnn Top-k: 	46:4.878, 90:4.454, 50:4.105, 91:4.069, 20:3.842
dense_4_HardSigmoid
==================================
Layer Name: reshape_2, Layer Shape: keras->(1, 1, 1, 96) ncnn->(1, 96, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.017 ncnn->0.455
Mean: 	keras->0.643 ncnn->0.705 	Var: keras->0.230 ncnn->0.213
Cosine Similarity: 0.01708
Keras Feature Map: 	[0.484]
Ncnn Feature Map: 	[0.464 0.738 0.978 0.455 0.455 0.587 0.835 0.523 1.    0.558]
==================================
Layer Name: multiply_2, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->2.636 ncnn->2.461 	Min: keras->-0.371 ncnn->-0.480
Mean: 	keras->0.050 ncnn->0.051 	Var: keras->0.244 ncnn->0.258
Cosine Similarity: 0.71633
Keras Feature Map: 	[ 0.719 -0.15  -0.072  0.047  0.344  0.044  0.212  0.06  -0.078  0.192]
Ncnn Feature Map: 	[ 0.47   0.116  0.018  0.016  0.024  0.016  0.02   0.02   0.137 -0.099]
==================================
Layer Name: conv2d_9, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.295 ncnn->2.115 	Min: keras->-2.253 ncnn->-2.456
Mean: 	keras->-0.033 ncnn->0.077 	Var: keras->0.576 ncnn->0.607
Cosine Similarity: 0.64516
Keras Feature Map: 	[1.123 0.761 0.532 0.473 0.426 0.504 0.595 0.665 0.812 1.215]
Ncnn Feature Map: 	[-0.471  0.7    0.534  0.231  0.322  0.238  0.141  0.474  0.212  0.096]
==================================
Layer Name: batch_normalization_13, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.738 ncnn->2.823 	Min: keras->-2.600 ncnn->-2.537
Mean: 	keras->-0.002 ncnn->0.147 	Var: keras->0.708 ncnn->0.757
Cosine Similarity: 0.76708
Keras Feature Map: 	[1.484 1.016 0.721 0.645 0.584 0.685 0.802 0.892 1.082 1.603]
Ncnn Feature Map: 	[-0.573  0.937  0.724  0.332  0.451  0.341  0.217  0.646  0.309  0.159]
batch_normalization_13_Split
==================================
Layer Name: conv2d_10, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->4.744 ncnn->4.844 	Min: keras->-3.609 ncnn->-5.023
Mean: 	keras->0.088 ncnn->-0.050 	Var: keras->1.002 ncnn->1.059
Cosine Similarity: 0.77914
Keras Feature Map: 	[1.362 1.119 1.398 1.128 1.381 0.941 2.454 1.242 1.175 1.389]
Ncnn Feature Map: 	[0.957 2.635 1.397 1.554 1.715 1.276 0.185 0.795 0.119 1.12 ]
==================================
Layer Name: batch_normalization_14, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.461 ncnn->3.488 	Min: keras->-3.410 ncnn->-4.408
Mean: 	keras->-0.019 ncnn->-0.131 	Var: keras->0.823 ncnn->0.839
Cosine Similarity: 0.69403
Keras Feature Map: 	[0.896 0.734 0.92  0.74  0.909 0.616 1.623 0.816 0.772 0.914]
Ncnn Feature Map: 	[0.626 1.744 0.919 1.024 1.131 0.839 0.112 0.518 0.068 0.735]
==================================
Layer Name: activation_10, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.461 ncnn->3.488 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.103 ncnn->0.054 	Var: keras->0.436 ncnn->0.416
Cosine Similarity: 0.71712
Keras Feature Map: 	[0.582 0.457 0.601 0.461 0.592 0.371 1.251 0.519 0.485 0.596]
Ncnn Feature Map: 	[0.379 1.379 0.6   0.686 0.779 0.537 0.058 0.304 0.035 0.458]
==================================
Layer Name: depthwise_conv2d_5, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->2.406 ncnn->2.062 	Min: keras->-2.650 ncnn->-2.191
Mean: 	keras->-0.031 ncnn->-0.020 	Var: keras->0.353 ncnn->0.307
Cosine Similarity: 0.63960
Keras Feature Map: 	[-0.925 -0.671 -0.34  -0.279 -0.423 -0.151 -0.409 -0.56  -0.118  0.067]
Ncnn Feature Map: 	[-0.839 -0.976 -0.39  -0.096 -0.094 -0.016  0.231 -0.077 -0.055 -0.068]
==================================
Layer Name: batch_normalization_15, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.631 ncnn->3.870 	Min: keras->-4.769 ncnn->-5.017
Mean: 	keras->-0.125 ncnn->-0.104 	Var: keras->0.692 ncnn->0.696
Cosine Similarity: 0.65915
Keras Feature Map: 	[-1.926 -1.327 -0.549 -0.405 -0.743 -0.102 -0.71  -1.067 -0.024  0.41 ]
Ncnn Feature Map: 	[-1.724 -2.046 -0.666  0.026  0.031  0.216  0.797  0.072  0.124  0.093]
==================================
Layer Name: activation_11, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.631 ncnn->3.870 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.020 ncnn->0.029 	Var: keras->0.323 ncnn->0.327
Cosine Similarity: 0.64940
Keras Feature Map: 	[-0.345 -0.37  -0.224 -0.175 -0.279 -0.049 -0.271 -0.344 -0.012  0.233]
Ncnn Feature Map: 	[-0.367 -0.325 -0.259  0.013  0.016  0.116  0.504  0.037  0.065  0.048]
activation_11_Split
==================================
Layer Name: global_average_pooling2d_3, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.059 ncnn->0.910 	Min: keras->-0.365 ncnn->-0.366
Mean: 	keras->0.020 ncnn->0.029 	Var: keras->0.233 ncnn->0.206
Cosine Similarity: 0.45053
Keras Feature Map: 	[-0.237 -0.214 -0.044 -0.226  0.362  0.044  0.119  0.052  0.215 -0.088]
Ncnn Feature Map: 	[-0.119 -0.004 -0.042 -0.072  0.172 -0.011  0.215  0.034 -0.242 -0.032]
Top-k:
Keras Top-k: 	40:1.059, 17:1.006, 188:0.954, 119:0.905, 58:0.748
ncnn Top-k: 	17:0.910, 182:0.732, 238:0.683, 119:0.573, 58:0.563
==================================
Layer Name: dense_5, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->2.975 ncnn->2.143 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.156 ncnn->0.173 	Var: keras->0.480 ncnn->0.447
Cosine Similarity: 0.24803
Keras Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Ncnn Feature Map: 	[0.046 0.    0.    0.    0.    0.    0.    0.    0.    0.   ]
Top-k:
Keras Top-k: 	135:2.975, 143:2.787, 89:2.622, 139:2.540, 66:2.306
ncnn Top-k: 	136:2.143, 190:2.112, 135:2.031, 92:1.992, 167:1.841
==================================
Layer Name: dense_6, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.000 ncnn->5.566 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.595 ncnn->1.227 	Var: keras->0.319 ncnn->1.490
Cosine Similarity: 0.17060
Keras Feature Map: 	[0.548 0.348 0.502 0.674 0.201 1.    0.79  0.    0.698 0.578]
Ncnn Feature Map: 	[0.    0.407 1.471 0.668 0.    3.57  3.807 0.    0.227 0.065]
Top-k:
Keras Top-k: 	119:1.000, 204:1.000, 213:1.000, 61:1.000, 211:1.000
ncnn Top-k: 	225:5.566, 81:5.214, 218:5.190, 53:5.037, 20:4.962
dense_6_HardSigmoid
==================================
Layer Name: reshape_3, Layer Shape: keras->(1, 1, 1, 240) ncnn->(1, 240, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.455
Mean: 	keras->0.595 ncnn->0.646 	Var: keras->0.319 ncnn->0.210
Cosine Similarity: 0.04976
Keras Feature Map: 	[0.548]
Ncnn Feature Map: 	[0.455 0.528 0.722 0.576 0.455 1.    1.    0.455 0.496 0.466]
==================================
Layer Name: multiply_3, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.631 ncnn->2.651 	Min: keras->-0.375 ncnn->-0.471
Mean: 	keras->0.023 ncnn->0.015 	Var: keras->0.242 ncnn->0.209
Cosine Similarity: 0.73389
Keras Feature Map: 	[-0.189 -0.203 -0.123 -0.096 -0.153 -0.027 -0.148 -0.188 -0.007  0.128]
Ncnn Feature Map: 	[-0.167 -0.161 -0.223  0.007  0.007  0.116  0.321  0.025  0.029  0.041]
==================================
Layer Name: conv2d_11, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.978 ncnn->2.403 	Min: keras->-2.601 ncnn->-2.769
Mean: 	keras->0.063 ncnn->0.007 	Var: keras->0.872 ncnn->0.616
Cosine Similarity: 0.67813
Keras Feature Map: 	[1.139 0.478 0.331 0.807 0.603 0.808 0.914 0.989 0.978 0.956]
Ncnn Feature Map: 	[2.403 0.516 0.    0.14  0.268 0.177 0.39  0.991 0.501 0.617]
==================================
Layer Name: batch_normalization_16, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.349 ncnn->2.171 	Min: keras->-2.250 ncnn->-2.244
Mean: 	keras->0.059 ncnn->0.007 	Var: keras->0.691 ncnn->0.515
Cosine Similarity: 0.75922
Keras Feature Map: 	[1.073 0.498 0.371 0.784 0.607 0.785 0.877 0.942 0.933 0.914]
Ncnn Feature Map: 	[2.171 0.531 0.083 0.204 0.315 0.237 0.422 0.944 0.518 0.619]
==================================
Layer Name: add_2, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->3.798 ncnn->3.669 	Min: keras->-3.213 ncnn->-4.008
Mean: 	keras->0.057 ncnn->0.154 	Var: keras->1.105 ncnn->0.918
Cosine Similarity: 0.73792
Keras Feature Map: 	[2.557 1.514 1.092 1.43  1.191 1.47  1.679 1.834 2.015 2.516]
Ncnn Feature Map: 	[1.597 1.468 0.807 0.537 0.766 0.578 0.64  1.59  0.827 0.777]
add_2_Split
==================================
Layer Name: conv2d_12, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->5.700 ncnn->5.649 	Min: keras->-5.146 ncnn->-5.266
Mean: 	keras->0.111 ncnn->0.022 	Var: keras->1.410 ncnn->1.157
Cosine Similarity: 0.72483
Keras Feature Map: 	[3.894 1.588 0.957 1.504 1.118 1.124 0.215 0.823 0.036 0.035]
Ncnn Feature Map: 	[-0.756  1.12   1.815  0.524  0.325  0.134 -0.612  0.177 -0.113 -1.147]
==================================
Layer Name: batch_normalization_17, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.964 ncnn->3.490 	Min: keras->-3.365 ncnn->-3.418
Mean: 	keras->-0.061 ncnn->-0.118 	Var: keras->0.872 ncnn->0.714
Cosine Similarity: 0.67744
Keras Feature Map: 	[1.91  0.801 0.498 0.76  0.575 0.578 0.141 0.433 0.055 0.054]
Ncnn Feature Map: 	[-0.326  0.576  0.91   0.289  0.194  0.102 -0.257  0.122 -0.017 -0.514]
==================================
Layer Name: activation_12, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.964 ncnn->3.490 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.097 ncnn->0.028 	Var: keras->0.470 ncnn->0.368
Cosine Similarity: 0.71403
Keras Feature Map: 	[1.563 0.507 0.29  0.477 0.343 0.345 0.074 0.248 0.028 0.028]
Ncnn Feature Map: 	[-0.145  0.343  0.593  0.159  0.103  0.053 -0.117  0.064 -0.009 -0.213]
==================================
Layer Name: depthwise_conv2d_6, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->2.642 ncnn->2.222 	Min: keras->-3.390 ncnn->-4.451
Mean: 	keras->-0.014 ncnn->-0.023 	Var: keras->0.380 ncnn->0.301
Cosine Similarity: 0.66426
Keras Feature Map: 	[-0.132 -0.291 -0.139 -0.167 -0.239 -0.187 -0.056 -0.018 -0.054 -0.044]
Ncnn Feature Map: 	[-0.03   0.034 -0.039 -0.136 -0.07  -0.025  0.022  0.051 -0.016  0.003]
==================================
Layer Name: batch_normalization_18, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->5.341 ncnn->6.026 	Min: keras->-5.191 ncnn->-7.821
Mean: 	keras->-0.129 ncnn->-0.153 	Var: keras->0.785 ncnn->0.644
Cosine Similarity: 0.63899
Keras Feature Map: 	[-0.227 -0.571 -0.242 -0.302 -0.457 -0.345 -0.062  0.021 -0.058 -0.036]
Ncnn Feature Map: 	[-0.006  0.133 -0.025 -0.236 -0.091  0.005  0.108  0.171  0.025  0.066]
==================================
Layer Name: activation_13, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->5.341 ncnn->6.026 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.038 ncnn->-0.006 	Var: keras->0.412 ncnn->0.307
Cosine Similarity: 0.67805
Keras Feature Map: 	[-0.105 -0.231 -0.111 -0.136 -0.194 -0.153 -0.03   0.011 -0.029 -0.018]
Ncnn Feature Map: 	[-0.003  0.07  -0.013 -0.109 -0.044  0.003  0.056  0.09   0.013  0.034]
activation_13_Split
==================================
Layer Name: global_average_pooling2d_4, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.583 ncnn->0.873 	Min: keras->-0.362 ncnn->-0.346
Mean: 	keras->0.038 ncnn->-0.006 	Var: keras->0.295 ncnn->0.195
Cosine Similarity: 0.55199
Keras Feature Map: 	[-0.201  0.241  0.006 -0.306 -0.046 -0.099 -0.101  0.141  1.04   0.025]
Ncnn Feature Map: 	[-0.022  0.486 -0.038 -0.272 -0.168 -0.101 -0.088  0.217 -0.163 -0.095]
Top-k:
Keras Top-k: 	213:1.583, 228:1.574, 191:1.236, 100:1.145, 8:1.040
ncnn Top-k: 	194:0.873, 191:0.695, 186:0.686, 126:0.660, 88:0.628
==================================
Layer Name: dense_7, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->3.623 ncnn->2.651 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.333 ncnn->0.316 	Var: keras->0.704 ncnn->0.558
Cosine Similarity: 0.27670
Keras Feature Map: 	[0.    0.391 0.    0.    0.    0.    2.551 0.    0.258 1.287]
Ncnn Feature Map: 	[0.    1.058 0.    0.    0.179 0.119 2.651 0.    1.422 0.   ]
Top-k:
Keras Top-k: 	42:3.623, 50:3.035, 25:2.848, 127:2.838, 177:2.741
ncnn Top-k: 	6:2.651, 50:2.445, 92:2.266, 207:2.172, 127:1.929
==================================
Layer Name: dense_8, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.000 ncnn->14.569 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.621 ncnn->3.205 	Var: keras->0.419 ncnn->3.472
Cosine Similarity: 0.13461
Keras Feature Map: 	[0.275 1.    1.    1.    0.    0.    1.    0.925 1.    0.   ]
Ncnn Feature Map: 	[2.622 6.717 6.614 3.069 0.    0.    5.97  6.509 4.726 0.   ]
Top-k:
Keras Top-k: 	239:1.000, 113:1.000, 107:1.000, 104:1.000, 101:1.000
ncnn Top-k: 	18:14.569, 154:14.136, 70:12.382, 180:12.123, 169:11.698
dense_8_HardSigmoid
==================================
Layer Name: reshape_4, Layer Shape: keras->(1, 1, 1, 240) ncnn->(1, 240, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.455
Mean: 	keras->0.621 ncnn->0.756 	Var: keras->0.419 ncnn->0.254
Cosine Similarity: 0.05990
Keras Feature Map: 	[0.275]
Ncnn Feature Map: 	[0.931 1.    1.    1.    0.455 0.455 1.    1.    1.    0.455]
==================================
Layer Name: multiply_4, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->4.369 ncnn->6.026 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.049 ncnn->-0.002 	Var: keras->0.337 ncnn->0.233
Cosine Similarity: 0.80338
Keras Feature Map: 	[-0.029 -0.063 -0.03  -0.037 -0.053 -0.042 -0.008  0.003 -0.008 -0.005]
Ncnn Feature Map: 	[-0.003  0.07  -0.013 -0.101 -0.02   0.002  0.056  0.041  0.006  0.034]
==================================
Layer Name: conv2d_13, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->4.176 ncnn->3.487 	Min: keras->-4.651 ncnn->-2.963
Mean: 	keras->0.083 ncnn->-0.175 	Var: keras->1.220 ncnn->0.618
Cosine Similarity: 0.90713
Keras Feature Map: 	[-0.774 -0.85  -0.52  -0.661 -0.649 -0.72  -1.355 -0.982 -0.45  -0.273]
Ncnn Feature Map: 	[-0.679 -0.581 -0.328 -0.294 -0.42  -0.568 -0.454  0.173  0.084  0.212]
==================================
Layer Name: batch_normalization_19, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->4.505 ncnn->3.383 	Min: keras->-4.563 ncnn->-3.824
Mean: 	keras->0.061 ncnn->-0.202 	Var: keras->1.175 ncnn->0.674
Cosine Similarity: 1.01763
Keras Feature Map: 	[-1.003 -1.077 -0.76  -0.895 -0.884 -0.952 -1.562 -1.204 -0.693 -0.522]
Ncnn Feature Map: 	[-0.912 -0.818 -0.575 -0.543 -0.664 -0.805 -0.696 -0.094 -0.179 -0.057]
==================================
Layer Name: add_3, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->7.882 ncnn->3.635 	Min: keras->-5.970 ncnn->-5.870
Mean: 	keras->0.118 ncnn->-0.048 	Var: keras->1.745 ncnn->1.100
Cosine Similarity: 0.82925
Keras Feature Map: 	[1.553 0.438 0.332 0.535 0.307 0.518 0.117 0.631 1.322 1.994]
Ncnn Feature Map: 	[ 0.685  0.65   0.232 -0.006  0.102 -0.227 -0.056  1.496  0.648  0.721]
==================================
Layer Name: conv2d_14, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->9.958 ncnn->6.795 	Min: keras->-7.671 ncnn->-7.571
Mean: 	keras->-0.272 ncnn->0.039 	Var: keras->2.322 ncnn->1.481
Cosine Similarity: 0.77032
Keras Feature Map: 	[-4.425 -0.283  0.195  0.222  0.584  0.037  0.234  0.584  1.577  3.714]
Ncnn Feature Map: 	[-4.143 -3.368  0.399  2.726  2.642  2.718  3.536  2.621  1.371  1.905]
==================================
Layer Name: batch_normalization_20, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->4.091 ncnn->2.551 	Min: keras->-4.205 ncnn->-3.387
Mean: 	keras->-0.268 ncnn->-0.139 	Var: keras->1.041 ncnn->0.656
Cosine Similarity: 0.69716
Keras Feature Map: 	[-1.385 -0.276 -0.148 -0.141 -0.044 -0.19  -0.137 -0.044  0.222  0.794]
Ncnn Feature Map: 	[-1.309 -1.102 -0.093  0.53   0.508  0.528  0.747  0.502  0.167  0.31 ]
==================================
Layer Name: activation_14, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->4.091 ncnn->2.360 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.058 ncnn->0.005 	Var: keras->0.504 ncnn->0.311
Cosine Similarity: 0.77132
Keras Feature Map: 	[-0.373 -0.125 -0.07  -0.067 -0.021 -0.089 -0.065 -0.021  0.119  0.502]
Ncnn Feature Map: 	[-0.369 -0.349 -0.045  0.312  0.297  0.31   0.466  0.293  0.088  0.171]
==================================
Layer Name: depthwise_conv2d_7, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->3.342 ncnn->1.897 	Min: keras->-4.673 ncnn->-1.738
Mean: 	keras->-0.039 ncnn->-0.016 	Var: keras->0.440 ncnn->0.262
Cosine Similarity: 0.88389
Keras Feature Map: 	[-0.106 -0.019 -0.066 -0.067 -0.05  -0.038  0.004  0.026  0.088  0.179]
Ncnn Feature Map: 	[-0.114 -0.094 -0.075  0.006  0.043  0.092  0.127  0.034  0.045  0.06 ]
==================================
Layer Name: batch_normalization_21, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->5.117 ncnn->2.594 	Min: keras->-5.029 ncnn->-3.356
Mean: 	keras->-0.125 ncnn->-0.100 	Var: keras->0.819 ncnn->0.558
Cosine Similarity: 0.73467
Keras Feature Map: 	[-0.455 -0.165 -0.322 -0.323 -0.266 -0.228 -0.087 -0.013  0.193  0.498]
Ncnn Feature Map: 	[-0.483 -0.416 -0.351 -0.079  0.043  0.207  0.326  0.014  0.051  0.101]
==================================
Layer Name: activation_15, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->5.117 ncnn->2.418 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.049 ncnn->0.004 	Var: keras->0.452 ncnn->0.276
Cosine Similarity: 0.73139
Keras Feature Map: 	[-0.193 -0.078 -0.144 -0.144 -0.121 -0.105 -0.042 -0.006  0.103  0.29 ]
Ncnn Feature Map: 	[-0.202 -0.179 -0.155 -0.038  0.022  0.111  0.18   0.007  0.026  0.052]
activation_15_Split
==================================
Layer Name: global_average_pooling2d_5, Layer Shape: keras->(1, 120) ncnn->(1, 1, 120)
Max: 	keras->1.466 ncnn->0.839 	Min: keras->-0.333 ncnn->-0.297
Mean: 	keras->0.049 ncnn->0.004 	Var: keras->0.320 ncnn->0.180
Cosine Similarity: 0.64519
Keras Feature Map: 	[-0.002  0.26  -0.19   0.438 -0.068 -0.064 -0.163 -0.129 -0.124 -0.002]
Ncnn Feature Map: 	[ 0.199 -0.085  0.251  0.272 -0.085  0.017  0.129 -0.139 -0.128  0.118]
Top-k:
Keras Top-k: 	104:1.466, 39:1.301, 36:1.151, 46:1.072, 61:0.882
ncnn Top-k: 	46:0.839, 115:0.588, 83:0.557, 19:0.483, 77:0.429
==================================
Layer Name: dense_9, Layer Shape: keras->(1, 120) ncnn->(1, 1, 120)
Max: 	keras->2.904 ncnn->1.961 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.320 ncnn->0.287 	Var: keras->0.598 ncnn->0.481
Cosine Similarity: 0.24184
Keras Feature Map: 	[1.13  1.226 0.    0.19  0.    0.167 0.639 0.905 0.    0.   ]
Ncnn Feature Map: 	[0.038 1.51  0.    0.346 0.071 0.    0.    0.471 0.    0.088]
Top-k:
Keras Top-k: 	46:2.904, 62:2.601, 114:2.218, 61:1.815, 13:1.787
ncnn Top-k: 	46:1.961, 37:1.700, 82:1.678, 25:1.574, 1:1.510
==================================
Layer Name: dense_10, Layer Shape: keras->(1, 120) ncnn->(1, 1, 120)
Max: 	keras->1.000 ncnn->6.578 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.717 ncnn->2.052 	Var: keras->0.321 ncnn->1.743
Cosine Similarity: 0.10127
Keras Feature Map: 	[0.799 0.902 0.785 1.    0.87  1.    0.246 0.883 0.248 0.041]
Ncnn Feature Map: 	[2.005 1.734 1.943 3.985 2.125 4.672 0.538 1.71  0.136 0.   ]
Top-k:
Keras Top-k: 	119:1.000, 74:1.000, 87:1.000, 26:1.000, 27:1.000
ncnn Top-k: 	70:6.578, 33:6.525, 62:5.940, 81:5.653, 69:5.213
dense_10_HardSigmoid
==================================
Layer Name: reshape_5, Layer Shape: keras->(1, 1, 1, 120) ncnn->(1, 120, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.455
Mean: 	keras->0.717 ncnn->0.759 	Var: keras->0.321 ncnn->0.219
Cosine Similarity: 0.02838
Keras Feature Map: 	[0.799]
Ncnn Feature Map: 	[0.819 0.77  0.808 1.    0.841 1.    0.552 0.765 0.479 0.455]
==================================
Layer Name: multiply_5, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->4.772 ncnn->1.407 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.051 ncnn->0.009 	Var: keras->0.383 ncnn->0.188
Cosine Similarity: 0.70164
Keras Feature Map: 	[-0.154 -0.062 -0.115 -0.115 -0.097 -0.084 -0.034 -0.005  0.082  0.232]
Ncnn Feature Map: 	[-0.166 -0.086 -0.155 -0.038  0.017  0.09   0.18   0.007  0.012  0.024]
==================================
Layer Name: conv2d_15, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->3.744 ncnn->1.577 	Min: keras->-4.028 ncnn->-1.658
Mean: 	keras->-0.240 ncnn->-0.050 	Var: keras->0.997 ncnn->0.411
Cosine Similarity: 0.63543
Keras Feature Map: 	[-0.653  0.599  0.707  0.53   0.459  0.336  0.434  0.388  0.45   0.542]
Ncnn Feature Map: 	[-0.767 -0.338 -0.047  0.164 -0.084 -0.083  0.224 -0.088  0.127  0.169]
==================================
Layer Name: batch_normalization_22, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->4.949 ncnn->2.177 	Min: keras->-4.603 ncnn->-2.502
Mean: 	keras->-0.269 ncnn->-0.001 	Var: keras->1.286 ncnn->0.565
Cosine Similarity: 0.73296
Keras Feature Map: 	[-0.844  0.691  0.823  0.606  0.519  0.369  0.488  0.433  0.508  0.621]
Ncnn Feature Map: 	[-0.983 -0.458 -0.1    0.158 -0.147 -0.146  0.232 -0.151  0.112  0.164]
batch_normalization_22_Split
==================================
Layer Name: conv2d_16, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->7.216 ncnn->2.799 	Min: keras->-7.883 ncnn->-3.050
Mean: 	keras->-0.464 ncnn->0.055 	Var: keras->1.972 ncnn->0.844
Cosine Similarity: 0.72081
Keras Feature Map: 	[0.041 0.706 0.101 1.03  0.953 0.232 0.588 0.332 0.144 0.45 ]
Ncnn Feature Map: 	[ 0.025 -0.581  0.952  1.346  1.579  1.486  2.121  1.154  1.379  1.082]
==================================
Layer Name: batch_normalization_23, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.700 ncnn->2.439 	Min: keras->-7.264 ncnn->-2.575
Mean: 	keras->-0.520 ncnn->-0.153 	Var: keras->1.427 ncnn->0.639
Cosine Similarity: 0.62868
Keras Feature Map: 	[0.068 0.378 0.096 0.529 0.493 0.157 0.323 0.203 0.116 0.259]
Ncnn Feature Map: 	[ 0.06  -0.223  0.493  0.677  0.786  0.742  1.038  0.587  0.692  0.554]
==================================
Layer Name: activation_16, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.700 ncnn->2.211 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.096 ncnn->-0.005 	Var: keras->0.688 ncnn->0.295
Cosine Similarity: 0.72011
Keras Feature Map: 	[0.035 0.213 0.049 0.311 0.287 0.083 0.179 0.109 0.06  0.141]
Ncnn Feature Map: 	[ 0.031 -0.103  0.287  0.415  0.496  0.463  0.699  0.351  0.426  0.328]
==================================
Layer Name: depthwise_conv2d_8, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.010 ncnn->1.443 	Min: keras->-4.993 ncnn->-1.765
Mean: 	keras->-0.087 ncnn->-0.013 	Var: keras->0.608 ncnn->0.243
Cosine Similarity: 0.72001
Keras Feature Map: 	[0.024 0.044 0.05  0.098 0.105 0.065 0.044 0.026 0.011 0.022]
Ncnn Feature Map: 	[0.021 0.015 0.02  0.01  0.013 0.04  0.074 0.057 0.06  0.051]
==================================
Layer Name: batch_normalization_24, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->10.067 ncnn->1.958 	Min: keras->-11.531 ncnn->-2.862
Mean: 	keras->-0.265 ncnn->-0.151 	Var: keras->1.153 ncnn->0.506
Cosine Similarity: 0.68785
Keras Feature Map: 	[0.388 0.443 0.458 0.59  0.61  0.501 0.441 0.392 0.353 0.382]
Ncnn Feature Map: 	[0.378 0.363 0.376 0.348 0.358 0.432 0.525 0.479 0.487 0.462]
==================================
Layer Name: activation_17, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->10.067 ncnn->1.618 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.047 ncnn->-0.029 	Var: keras->0.564 ncnn->0.237
Cosine Similarity: 0.72575
Keras Feature Map: 	[0.219 0.254 0.264 0.353 0.367 0.292 0.253 0.222 0.197 0.215]
Ncnn Feature Map: 	[0.213 0.203 0.212 0.194 0.2   0.247 0.308 0.278 0.283 0.266]
activation_17_Split
==================================
Layer Name: global_average_pooling2d_6, Layer Shape: keras->(1, 144) ncnn->(1, 1, 144)
Max: 	keras->2.205 ncnn->0.611 	Min: keras->-0.340 ncnn->-0.323
Mean: 	keras->0.047 ncnn->-0.029 	Var: keras->0.323 ncnn->0.160
Cosine Similarity: 0.67975
Keras Feature Map: 	[-0.095 -0.136  0.127 -0.124 -0.034 -0.042  0.629  0.487  1.073 -0.075]
Ncnn Feature Map: 	[-0.174 -0.145 -0.013 -0.111 -0.111  0.033  0.472  0.598 -0.214 -0.159]
Top-k:
Keras Top-k: 	141:2.205, 70:1.097, 8:1.073, 111:0.933, 33:0.806
ncnn Top-k: 	109:0.611, 98:0.602, 7:0.598, 6:0.472, 18:0.338
==================================
Layer Name: dense_11, Layer Shape: keras->(1, 144) ncnn->(1, 1, 144)
Max: 	keras->2.647 ncnn->1.571 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.328 ncnn->0.227 	Var: keras->0.645 ncnn->0.363
Cosine Similarity: 0.30777
Keras Feature Map: 	[0.621 0.254 0.    0.    0.155 0.    0.    0.    0.    0.   ]
Ncnn Feature Map: 	[0.    0.    0.    0.124 0.    0.    0.169 0.    0.523 0.   ]
Top-k:
Keras Top-k: 	119:2.647, 142:2.574, 16:2.534, 132:2.391, 106:2.380
ncnn Top-k: 	80:1.571, 79:1.364, 65:1.298, 142:1.253, 15:1.218
==================================
Layer Name: dense_12, Layer Shape: keras->(1, 144) ncnn->(1, 1, 144)
Max: 	keras->1.000 ncnn->5.720 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.530 ncnn->1.190 	Var: keras->0.415 ncnn->1.431
Cosine Similarity: 0.13383
Keras Feature Map: 	[0.33  0.423 1.    1.    0.    1.    1.    0.    0.796 0.271]
Ncnn Feature Map: 	[0.    1.637 2.028 3.296 0.    0.926 3.339 0.    1.068 0.   ]
Top-k:
Keras Top-k: 	82:1.000, 120:1.000, 34:1.000, 117:1.000, 116:1.000
ncnn Top-k: 	21:5.720, 37:4.998, 117:4.953, 130:4.806, 52:4.545
dense_12_HardSigmoid
==================================
Layer Name: reshape_6, Layer Shape: keras->(1, 1, 1, 144) ncnn->(1, 144, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.455
Mean: 	keras->0.530 ncnn->0.645 	Var: keras->0.415 ncnn->0.209
Cosine Similarity: 0.09599
Keras Feature Map: 	[0.33]
Ncnn Feature Map: 	[0.455 0.752 0.823 1.    0.455 0.623 1.    0.455 0.649 0.455]
==================================
Layer Name: multiply_6, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.688 ncnn->1.315 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.032 ncnn->-0.013 	Var: keras->0.350 ncnn->0.143
Cosine Similarity: 0.79093
Keras Feature Map: 	[0.072 0.084 0.087 0.117 0.121 0.096 0.084 0.073 0.065 0.071]
Ncnn Feature Map: 	[0.097 0.132 0.096 0.1   0.091 0.247 0.14  0.126 0.283 0.263]
==================================
Layer Name: conv2d_17, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->4.038 ncnn->1.051 	Min: keras->-4.769 ncnn->-1.133
Mean: 	keras->-0.176 ncnn->0.019 	Var: keras->0.900 ncnn->0.288
Cosine Similarity: 0.85288
Keras Feature Map: 	[ 0.305  0.176 -0.256 -0.16  -0.497 -0.581 -0.501 -0.359 -0.058  0.113]
Ncnn Feature Map: 	[-0.118  0.01  -0.105  0.202 -0.237 -0.033  0.052  0.107  0.059  0.071]
==================================
Layer Name: batch_normalization_25, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->3.686 ncnn->1.442 	Min: keras->-4.459 ncnn->-1.364
Mean: 	keras->-0.207 ncnn->0.012 	Var: keras->1.059 ncnn->0.352
Cosine Similarity: 0.73962
Keras Feature Map: 	[-0.097 -0.21  -0.588 -0.504 -0.798 -0.872 -0.802 -0.677 -0.415 -0.265]
Ncnn Feature Map: 	[-0.467 -0.355 -0.456 -0.187 -0.571 -0.393 -0.318 -0.27  -0.312 -0.302]
==================================
Layer Name: add_4, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->7.120 ncnn->2.629 	Min: keras->-6.300 ncnn->-3.442
Mean: 	keras->-0.476 ncnn->0.011 	Var: keras->1.859 ncnn->0.706
Cosine Similarity: 0.77737
Keras Feature Map: 	[-0.942  0.481  0.235  0.102 -0.279 -0.503 -0.314 -0.245  0.093  0.356]
Ncnn Feature Map: 	[-1.45  -0.813 -0.556 -0.029 -0.718 -0.538 -0.086 -0.421 -0.2   -0.138]
add_4_Split
==================================
Layer Name: conv2d_18, Layer Shape: keras->(1, 10, 10, 288) ncnn->(288, 10, 10)
Max: 	keras->12.497 ncnn->4.404 	Min: keras->-12.407 ncnn->-3.363
Mean: 	keras->0.207 ncnn->0.107 	Var: keras->3.007 ncnn->0.926
Cosine Similarity: 0.83611
Keras Feature Map: 	[ 0.505  1.052  0.094  0.138  0.67  -0.059  0.083  0.104  0.547  0.749]
Ncnn Feature Map: 	[ 0.403 -0.728 -0.01   1.116  1.705  1.088  1.499  0.898  0.991  0.732]
==================================
Layer Name: batch_normalization_26, Layer Shape: keras->(1, 10, 10, 288) ncnn->(288, 10, 10)
Max: 	keras->7.410 ncnn->2.284 	Min: keras->-6.151 ncnn->-2.306
Mean: 	keras->0.014 ncnn->-0.056 	Var: keras->1.549 ncnn->0.482
Cosine Similarity: 0.74293
Keras Feature Map: 	[ 0.204  0.422  0.039  0.057  0.27  -0.022  0.035  0.043  0.221  0.301]
Ncnn Feature Map: 	[ 0.163 -0.289 -0.002  0.448  0.684  0.437  0.601  0.361  0.398  0.294]
==================================
Layer Name: activation_18, Layer Shape: keras->(1, 10, 10, 288) ncnn->(288, 10, 10)
Max: 	keras->7.410 ncnn->2.012 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.378 ncnn->0.011 	Var: keras->0.953 ncnn->0.247
Cosine Similarity: 0.76708
Keras Feature Map: 	[ 0.109  0.241  0.02   0.029  0.147 -0.011  0.018  0.022  0.118  0.166]
Ncnn Feature Map: 	[ 0.086 -0.131 -0.001  0.257  0.42   0.25   0.361  0.202  0.225  0.162]
==================================
Layer Name: depthwise_conv2d_9, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->6.978 ncnn->1.340 	Min: keras->-7.315 ncnn->-1.791
Mean: 	keras->-0.217 ncnn->0.001 	Var: keras->1.075 ncnn->0.233
Cosine Similarity: 0.80214
Keras Feature Map: 	[ 0.192 -0.179 -0.087 -0.08   0.14 ]
Ncnn Feature Map: 	[0.019 0.116 0.174 0.135 0.044]
==================================
Layer Name: batch_normalization_27, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->7.376 ncnn->1.759 	Min: keras->-12.403 ncnn->-3.588
Mean: 	keras->-0.373 ncnn->-0.057 	Var: keras->1.500 ncnn->0.508
Cosine Similarity: 0.91866
Keras Feature Map: 	[ 0.206 -0.723 -0.491 -0.475  0.076]
Ncnn Feature Map: 	[-0.226  0.016  0.161  0.063 -0.163]
==================================
Layer Name: activation_19, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->7.376 ncnn->1.395 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.097 ncnn->0.015 	Var: keras->0.704 ncnn->0.260
Cosine Similarity: 0.83538
Keras Feature Map: 	[ 0.11  -0.274 -0.205 -0.2    0.039]
Ncnn Feature Map: 	[-0.104  0.008  0.085  0.032 -0.077]
activation_19_Split
==================================
Layer Name: global_average_pooling2d_7, Layer Shape: keras->(1, 288) ncnn->(1, 1, 288)
Max: 	keras->3.451 ncnn->0.851 	Min: keras->-0.361 ncnn->-0.344
Mean: 	keras->0.097 ncnn->0.015 	Var: keras->0.491 ncnn->0.208
Cosine Similarity: 0.84747
Keras Feature Map: 	[ 0.059  0.506 -0.078  0.531 -0.091 -0.09   0.016 -0.289 -0.083 -0.189]
Ncnn Feature Map: 	[-0.054  0.206 -0.07   0.159  0.126 -0.033 -0.043  0.078 -0.074 -0.078]
Top-k:
Keras Top-k: 	87:3.451, 224:3.258, 81:2.399, 188:2.102, 91:2.073
ncnn Top-k: 	48:0.851, 162:0.847, 86:0.759, 227:0.671, 186:0.649
==================================
Layer Name: dense_13, Layer Shape: keras->(1, 288) ncnn->(1, 1, 288)
Max: 	keras->5.375 ncnn->3.895 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.633 ncnn->0.378 	Var: keras->1.280 ncnn->0.799
Cosine Similarity: 0.81293
Keras Feature Map: 	[5.129 0.    0.    0.    3.012 0.624 0.434 4.075 0.    0.033]
Ncnn Feature Map: 	[0.    0.    0.083 1.253 0.    0.    0.    0.021 0.    0.   ]
Top-k:
Keras Top-k: 	265:5.375, 17:5.367, 107:5.279, 35:5.271, 0:5.129
ncnn Top-k: 	225:3.895, 164:3.340, 102:3.273, 237:3.183, 11:3.071
==================================
Layer Name: dense_14, Layer Shape: keras->(1, 288) ncnn->(1, 1, 288)
Max: 	keras->1.000 ncnn->18.737 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.448 ncnn->3.445 	Var: keras->0.483 ncnn->4.162
Cosine Similarity: 0.40886
Keras Feature Map: 	[1. 0. 0. 1. 1. 0. 1. 0. 1. 1.]
Ncnn Feature Map: 	[ 0.     3.919  0.     0.    12.605  1.277  3.857  0.     0.     6.821]
Top-k:
Keras Top-k: 	0:1.000, 95:1.000, 110:1.000, 108:1.000, 107:1.000
ncnn Top-k: 	217:18.737, 143:16.101, 103:12.946, 267:12.940, 102:12.651
dense_14_HardSigmoid
==================================
Layer Name: reshape_7, Layer Shape: keras->(1, 1, 1, 288) ncnn->(1, 288, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.455
Mean: 	keras->0.448 ncnn->0.720 	Var: keras->0.483 ncnn->0.264
Cosine Similarity: 0.29666
Keras Feature Map: 	[1.]
Ncnn Feature Map: 	[0.455 1.    0.455 0.455 1.    0.687 1.    0.455 0.455 1.   ]
==================================
Layer Name: multiply_7, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->7.376 ncnn->1.395 	Min: keras->-0.375 ncnn->-0.374
Mean: 	keras->0.055 ncnn->0.014 	Var: keras->0.475 ncnn->0.196
Cosine Similarity: 0.83302
Keras Feature Map: 	[ 0.11  -0.274 -0.205 -0.2    0.039]
Ncnn Feature Map: 	[-0.047  0.004  0.039  0.032 -0.035]
==================================
Layer Name: conv2d_19, Layer Shape: keras->(1, 5, 5, 96) ncnn->(96, 5, 5)
Max: 	keras->6.916 ncnn->3.308 	Min: keras->-7.068 ncnn->-3.578
Mean: 	keras->-0.109 ncnn->0.072 	Var: keras->1.848 ncnn->1.167
Cosine Similarity: 1.03502
Keras Feature Map: 	[-0.74  -0.774 -1.223 -1.44  -1.283]
Ncnn Feature Map: 	[-0.023 -0.228 -1.567 -1.579 -0.822]
==================================
Layer Name: batch_normalization_28, Layer Shape: keras->(1, 5, 5, 96) ncnn->(96, 5, 5)
Max: 	keras->4.018 ncnn->1.549 	Min: keras->-4.341 ncnn->-1.406
Mean: 	keras->-0.049 ncnn->0.023 	Var: keras->0.970 ncnn->0.461
Cosine Similarity: 1.02942
Keras Feature Map: 	[-0.08  -0.094 -0.283 -0.374 -0.308]
Ncnn Feature Map: 	[ 0.222  0.135 -0.427 -0.433 -0.115]
==================================
Layer Name: conv_pw_mobile_bottleneck, Layer Shape: keras->(1, 5, 5, 32) ncnn->(32, 5, 5)
Max: 	keras->11.915 ncnn->3.663 	Min: keras->-8.413 ncnn->-5.933
Mean: 	keras->0.622 ncnn->-0.242 	Var: keras->3.560 ncnn->1.671
Cosine Similarity: 1.05588
Keras Feature Map: 	[-1.86  -4.268 -3.909 -3.923 -3.8  ]
Ncnn Feature Map: 	[-0.676 -1.214 -3.248 -3.464 -1.38 ]
==================================
Layer Name: up_sampling2d_1, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->11.915 ncnn->3.663 	Min: keras->-8.413 ncnn->-5.933
Mean: 	keras->0.622 ncnn->-0.242 	Var: keras->3.560 ncnn->1.671
Cosine Similarity: 1.05588
Keras Feature Map: 	[-1.86  -1.86  -4.268 -4.268 -3.909 -3.909 -3.923 -3.923 -3.8   -3.8  ]
Ncnn Feature Map: 	[-0.676 -0.676 -1.214 -1.214 -3.248 -3.248 -3.464 -3.464 -1.38  -1.38 ]
==================================
Layer Name: conv_pw_upblock_4, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->21.250 ncnn->5.602 	Min: keras->-21.531 ncnn->-4.950
Mean: 	keras->0.468 ncnn->0.195 	Var: keras->5.509 ncnn->1.530
Cosine Similarity: 0.68499
Keras Feature Map: 	[-1.381 -6.175 -7.394 -7.318 -6.163 -4.364 -5.653 -5.272 -4.099 -3.886]
Ncnn Feature Map: 	[ 0.291  1.499 -0.667 -2.247 -2.08  -1.584 -1.294 -1.259 -1.313 -0.58 ]
==================================
Layer Name: add_5, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->30.777 ncnn->8.176 	Min: keras->-29.944 ncnn->-8.920
Mean: 	keras->1.090 ncnn->-0.047 	Var: keras->8.069 ncnn->2.526
Cosine Similarity: 0.83700
Keras Feature Map: 	[ -3.241  -8.035 -11.661 -11.586 -10.072  -8.273  -9.576  -9.195  -7.899
  -7.686]
Ncnn Feature Map: 	[-0.385  0.822 -1.881 -3.461 -5.328 -4.832 -4.757 -4.722 -2.693 -1.96 ]
sep_upblock_4b_dw
==================================
Layer Name: sep_upblock_4b, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->29.722 ncnn->9.039 	Min: keras->-32.109 ncnn->-7.338
Mean: 	keras->1.375 ncnn->0.400 	Var: keras->10.522 ncnn->2.494
Cosine Similarity: 1.08785
Keras Feature Map: 	[ 2.262  0.189 -2.102 -0.867 -1.894 -1.515 -1.272 -0.599 -1.433  1.563]
Ncnn Feature Map: 	[-0.46  -0.547 -0.445 -0.456 -1.088 -0.366 -0.905 -0.691 -1.06  -0.303]
==================================
Layer Name: batch_normalization_29, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->5.389 ncnn->nan 	Min: keras->-3.505 ncnn->nan
Mean: 	keras->0.564 ncnn->nan 	Var: keras->1.728 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.52   0.117 -0.328 -0.088 -0.287 -0.214 -0.166 -0.036 -0.198  0.384]
Ncnn Feature Map: 	[-0.481 -0.54  -0.471 -0.478 -0.906 -0.418 -0.782 -0.637 -0.888 -0.375]
==================================
Layer Name: leaky_re_lu_1, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->5.389 ncnn->nan 	Min: keras->-1.051 ncnn->nan
Mean: 	keras->0.888 ncnn->nan 	Var: keras->1.363 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.52   0.117 -0.098 -0.026 -0.086 -0.064 -0.05  -0.011 -0.059  0.384]
Ncnn Feature Map: 	[-0.144 -0.162 -0.141 -0.143 -0.272 -0.125 -0.235 -0.191 -0.266 -0.112]
==================================
Layer Name: up_sampling2d_2, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->5.389 ncnn->nan 	Min: keras->-1.051 ncnn->nan
Mean: 	keras->0.888 ncnn->nan 	Var: keras->1.363 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.52   0.52   0.117  0.117 -0.098 -0.098 -0.026 -0.026 -0.086 -0.086]
Ncnn Feature Map: 	[-0.144 -0.144 -0.162 -0.162 -0.141 -0.141 -0.143 -0.143 -0.272 -0.272]
==================================
Layer Name: conv_pw_upblock_3, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->5.566 ncnn->3.624 	Min: keras->-3.908 ncnn->-3.389
Mean: 	keras->0.097 ncnn->0.078 	Var: keras->0.955 ncnn->0.744
Cosine Similarity: 1.07056
Keras Feature Map: 	[-0.962 -1.737 -1.694  0.301 -0.486 -0.831 -1.39  -1.486 -1.964 -1.47 ]
Ncnn Feature Map: 	[-0.466  0.869  0.165  0.165  0.165  0.165  0.165  0.165  0.165  0.165]
==================================
Layer Name: add_6, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->6.228 ncnn->nan 	Min: keras->-3.372 ncnn->nan
Mean: 	keras->0.985 ncnn->nan 	Var: keras->1.653 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-0.442 -1.217 -1.576  0.418 -0.584 -0.93  -1.416 -1.512 -2.05  -1.556]
Ncnn Feature Map: 	[-0.61   0.724  0.003  0.003  0.024  0.024  0.022  0.022 -0.107 -0.107]
sep_upblock_3b_dw
==================================
Layer Name: sep_upblock_3b, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->9.169 ncnn->nan 	Min: keras->-8.391 ncnn->nan
Mean: 	keras->-0.042 ncnn->nan 	Var: keras->2.755 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-0.64  -1.078 -0.728 -0.442 -0.555 -0.791 -1.027 -0.981 -1.432 -1.23 ]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: batch_normalization_30, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->4.921 ncnn->nan 	Min: keras->-3.521 ncnn->nan
Mean: 	keras->0.309 ncnn->nan 	Var: keras->1.435 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-0.427 -0.697 -0.481 -0.305 -0.374 -0.52  -0.666 -0.638 -0.916 -0.792]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: leaky_re_lu_2, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->4.921 ncnn->nan 	Min: keras->-1.056 ncnn->nan
Mean: 	keras->0.619 ncnn->nan 	Var: keras->1.085 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-0.128 -0.209 -0.144 -0.091 -0.112 -0.156 -0.2   -0.191 -0.275 -0.237]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: up_sampling2d_3, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->4.921 ncnn->nan 	Min: keras->-1.056 ncnn->nan
Mean: 	keras->0.619 ncnn->nan 	Var: keras->1.085 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-0.128 -0.128 -0.209 -0.209 -0.144 -0.144 -0.091 -0.091 -0.112 -0.112]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: conv_pw_upblock_2, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->3.356 ncnn->5.342 	Min: keras->-3.982 ncnn->-6.215
Mean: 	keras->-0.063 ncnn->0.075 	Var: keras->0.891 ncnn->0.692
Cosine Similarity: 1.01811
Keras Feature Map: 	[-1.227 -1.509 -1.472 -0.858 -2.254 -2.312 -0.813 -2.581 -1.463 -0.964]
Ncnn Feature Map: 	[ 0.923 -0.251 -0.075 -0.075 -0.075 -0.075 -0.075 -0.075 -0.075 -0.075]
==================================
Layer Name: add_7, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->5.888 ncnn->nan 	Min: keras->-3.403 ncnn->nan
Mean: 	keras->0.557 ncnn->nan 	Var: keras->1.405 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-1.355 -1.637 -1.682 -1.067 -2.398 -2.457 -0.905 -2.672 -1.575 -1.076]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
sep_upblock_2b_dw
==================================
Layer Name: sep_upblock_2b, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->9.756 ncnn->nan 	Min: keras->-10.530 ncnn->nan
Mean: 	keras->-0.585 ncnn->nan 	Var: keras->2.981 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[0.937 0.676 0.4   0.494 0.027 0.212 0.549 0.259 0.569 0.098]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: batch_normalization_31, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->4.175 ncnn->nan 	Min: keras->-3.787 ncnn->nan
Mean: 	keras->-0.035 ncnn->nan 	Var: keras->1.239 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[1.164 1.046 0.921 0.964 0.753 0.836 0.989 0.857 0.998 0.784]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: leaky_re_lu_3, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->4.175 ncnn->nan 	Min: keras->-1.136 ncnn->nan
Mean: 	keras->0.337 ncnn->nan 	Var: keras->0.877 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[1.164 1.046 0.921 0.964 0.753 0.836 0.989 0.857 0.998 0.784]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: up_sampling2d_4, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->4.175 ncnn->nan 	Min: keras->-1.136 ncnn->nan
Mean: 	keras->0.337 ncnn->nan 	Var: keras->0.877 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[1.164 1.164 1.046 1.046 0.921 0.921 0.964 0.964 0.753 0.753]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: conv_pw_upblock_1, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->7.299 ncnn->6.958 	Min: keras->-9.117 ncnn->-6.699
Mean: 	keras->-0.319 ncnn->0.108 	Var: keras->1.322 ncnn->1.077
Cosine Similarity: 0.95647
Keras Feature Map: 	[-1.013 -3.41  -1.978 -1.793 -1.022 -1.154 -1.294 -2.431 -1.356 -3.239]
Ncnn Feature Map: 	[ 0.322 -1.723 -0.847  0.065  0.495  0.835  0.461 -0.735  0.593 -0.358]
==================================
Layer Name: add_8, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->9.493 ncnn->nan 	Min: keras->-8.234 ncnn->nan
Mean: 	keras->0.017 ncnn->nan 	Var: keras->1.594 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.151 -2.246 -0.932 -0.747 -0.101 -0.233 -0.331 -1.467 -0.604 -2.486]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
sep_upblock_1b_dw
==================================
Layer Name: sep_upblock_1b, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->6.637 ncnn->nan 	Min: keras->-10.818 ncnn->nan
Mean: 	keras->-2.007 ncnn->nan 	Var: keras->2.186 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[-0.277 -2.182 -2.387 -1.413 -0.582 -0.511 -0.805 -1.044 -0.955 -1.492]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: batch_normalization_32, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->3.452 ncnn->nan 	Min: keras->-5.299 ncnn->nan
Mean: 	keras->-0.513 ncnn->nan 	Var: keras->1.019 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.19  -0.831 -0.941 -0.419  0.027  0.064 -0.093 -0.221 -0.173 -0.461]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: leaky_re_lu_4, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->3.452 ncnn->nan 	Min: keras->-1.590 ncnn->nan
Mean: 	keras->-0.022 ncnn->nan 	Var: keras->0.543 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.19  -0.249 -0.282 -0.126  0.027  0.064 -0.028 -0.066 -0.052 -0.138]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: up_sampling2d_5, Layer Shape: keras->(1, 160, 160, 32) ncnn->(32, 160, 160)
Max: 	keras->3.452 ncnn->nan 	Min: keras->-1.590 ncnn->nan
Mean: 	keras->-0.022 ncnn->nan 	Var: keras->0.543 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[ 0.19   0.19  -0.249 -0.249 -0.282 -0.282 -0.126 -0.126  0.027  0.027]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
==================================
Layer Name: final_layer, Layer Shape: keras->(1, 160, 160, 2) ncnn->(2, 160, 160)
Max: 	keras->0.998 ncnn->nan 	Min: keras->0.002 ncnn->nan
Mean: 	keras->0.500 ncnn->nan 	Var: keras->0.369 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[0.029 0.029 0.103 0.103 0.113 0.113 0.181 0.181 0.209 0.209]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
final_layer_Softmax
==================================
Layer Name: final_layer_Softmax, Layer Shape: keras->(1, 160, 160, 2) ncnn->(2, 160, 160)
Max: 	keras->0.998 ncnn->nan 	Min: keras->0.002 ncnn->nan
Mean: 	keras->0.500 ncnn->nan 	Var: keras->0.369 ncnn->nan
Cosine Similarity: nan
Keras Feature Map: 	[0.029 0.029 0.103 0.103 0.113 0.113 0.181 0.181 0.209 0.209]
Ncnn Feature Map: 	[nan nan nan nan nan nan nan nan nan nan]
I found a bug in emitting the param file. After fixing that, the outputs still don't match; I think the reshape is causing the issue. Still working on it :p
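The reshape suspicion is plausible: Keras tensors are NHWC while ncnn stores feature maps as CHW, so a Reshape emitted without the matching transpose keeps the element count but scrambles the values (compare the `reshape_5`/`reshape_7` rows above, where keras is `(1, 1, 1, 120)` but ncnn is `(1, 120, 1)`). A minimal NumPy sketch of the failure mode, with made-up shapes for illustration:

```python
import numpy as np

# NHWC feature map as Keras produces it: batch=1, h=2, w=2, c=3
x_nhwc = np.arange(12, dtype=np.float32).reshape(1, 2, 2, 3)

# Correct conversion to ncnn's CHW layout transposes first.
x_chw = np.transpose(x_nhwc[0], (2, 0, 1))  # shape (3, 2, 2)

# A naive reshape skips the transpose: same shape, wrong element order.
x_bad = x_nhwc[0].reshape(3, 2, 2)

assert x_chw.shape == x_bad.shape
assert not np.array_equal(x_chw, x_bad)  # per-element values differ
```

This is only a sketch of the layout mismatch, not keras2ncnn's actual emitter code.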
Any updates on this?
Er, it should be fixed by tomorrow; I've been a bit busy these past couple of days.
Bumping for an update.
Got too busy these past few days. It turned out to be an easy fix: 038e61c. I was able to get the correct result for a random input; can you check with your code?
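For checking, a per-layer comparison like the dumps in this thread can be approximated with plain NumPy: feed the same random input to both backends, transpose the ncnn output from CHW back to HWC, and compare statistics. A sketch under the assumption that the reported "Cosine Similarity" is actually a cosine distance (the matching layers below report 0.00000 for identical outputs); `compare_feature_maps` is a hypothetical helper, not part of keras2ncnn:

```python
import numpy as np

def compare_feature_maps(keras_out, ncnn_out):
    """Compare a Keras HWC feature map against an ncnn CHW one.

    Returns per-layer stats mirroring the dump format in this thread.
    Assumes the dump's "Cosine Similarity" is a cosine distance
    (0.0 means identical), inferred from the matching-layer rows.
    """
    k = np.asarray(keras_out, dtype=np.float32).ravel()
    # CHW -> HWC so both tensors flatten in the same element order
    n = np.transpose(np.asarray(ncnn_out, dtype=np.float32), (1, 2, 0)).ravel()
    cos = np.dot(k, n) / (np.linalg.norm(k) * np.linalg.norm(n))
    return {
        "max": (float(k.max()), float(n.max())),
        "min": (float(k.min()), float(n.min())),
        "mean": (float(k.mean()), float(n.mean())),
        "var": (float(k.var()), float(n.var())),
        "cosine_distance": float(1.0 - cos),
    }

# Identical feature maps should give a distance of ~0.
fm = np.random.rand(4, 4, 8).astype(np.float32)            # HWC
stats = compare_feature_maps(fm, np.transpose(fm, (2, 0, 1)))
assert stats["cosine_distance"] < 1e-5
```

When the distance stays near 0 for every layer (as in the dump that follows), the conversion matches the Keras reference within float rounding.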
==================================
Layer Name: conv2d_1, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->1.349 ncnn->1.349 	Min: keras->-1.747 ncnn->-1.747
Mean: 	keras->-0.079 ncnn->-0.079 	Var: keras->0.368 ncnn->0.368
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.273 0.527 0.297 0.531 0.457 0.368 0.561 0.567 0.475 0.746]
Ncnn Feature Map: 	[0.273 0.527 0.297 0.531 0.457 0.368 0.561 0.567 0.475 0.746]
==================================
Layer Name: batch_normalization_1, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->8.696 ncnn->8.696 	Min: keras->-7.951 ncnn->-7.951
Mean: 	keras->0.137 ncnn->0.137 	Var: keras->1.536 ncnn->1.536
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.469  0.387 -0.39   0.403  0.151 -0.149  0.505  0.525  0.212  1.129]
Ncnn Feature Map: 	[-0.469  0.387 -0.39   0.403  0.151 -0.149  0.505  0.525  0.212  1.129]
==================================
Layer Name: activation_1, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->8.696 ncnn->8.696 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.430 ncnn->0.430 	Var: keras->0.991 ncnn->0.991
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.198  0.218 -0.17   0.228  0.08  -0.071  0.295  0.308  0.113  0.777]
Ncnn Feature Map: 	[-0.198  0.218 -0.17   0.228  0.08  -0.071  0.295  0.308  0.113  0.777]
==================================
Layer Name: conv2d_2, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->12.417 ncnn->12.417 	Min: keras->-10.223 ncnn->-10.223
Mean: 	keras->-0.244 ncnn->-0.244 	Var: keras->1.508 ncnn->1.508
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.838 -0.088  0.228 -0.255  0.213 -0.169  0.52  -2.377 -4.959  1.411]
Ncnn Feature Map: 	[-0.838 -0.088  0.228 -0.255  0.213 -0.169  0.52  -2.377 -4.959  1.411]
==================================
Layer Name: batch_normalization_2, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->11.669 ncnn->11.669 	Min: keras->-11.529 ncnn->-11.529
Mean: 	keras->0.005 ncnn->0.005 	Var: keras->1.396 ncnn->1.396
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.257  0.367  0.631  0.228  0.618  0.301  0.874 -1.538 -3.688  1.616]
Ncnn Feature Map: 	[-0.257  0.367  0.631  0.228  0.618  0.301  0.874 -1.538 -3.688  1.616]
activation_2_Clip
==================================
Layer Name: activation_2, Layer Shape: keras->(1, 80, 80, 16) ncnn->(16, 80, 80)
Max: 	keras->6.000 ncnn->6.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.507 ncnn->0.507 	Var: keras->0.796 ncnn->0.796
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.    0.367 0.631 0.228 0.618 0.301 0.874 0.    0.    1.616]
Ncnn Feature Map: 	[0.    0.367 0.631 0.228 0.618 0.301 0.874 0.    0.    1.616]
activation_2_Split
==================================
Layer Name: depthwise_conv2d_1, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->3.493 ncnn->3.493 	Min: keras->-5.866 ncnn->-5.866
Mean: 	keras->0.005 ncnn->0.005 	Var: keras->0.597 ncnn->0.597
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.043 -0.006 -0.226 -0.068  0.132  0.374 -0.162  0.171 -0.122  0.341]
Ncnn Feature Map: 	[ 0.043 -0.006 -0.226 -0.068  0.132  0.374 -0.162  0.171 -0.122  0.341]
==================================
Layer Name: batch_normalization_3, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->15.202 ncnn->15.202 	Min: keras->-14.693 ncnn->-14.693
Mean: 	keras->0.299 ncnn->0.299 	Var: keras->2.209 ncnn->2.209
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.325 -0.415 -3.737 -1.346  1.681  5.342 -2.774  2.269 -2.157  4.837]
Ncnn Feature Map: 	[ 0.325 -0.415 -3.737 -1.346  1.681  5.342 -2.774  2.269 -2.157  4.837]
activation_3_Clip
==================================
Layer Name: activation_3, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->6.000 ncnn->6.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.908 ncnn->0.908 	Var: keras->1.304 ncnn->1.304
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.325 0.    0.    0.    1.681 5.342 0.    2.269 0.    4.837]
Ncnn Feature Map: 	[0.325 0.    0.    0.    1.681 5.342 0.    2.269 0.    4.837]
activation_3_Split
==================================
Layer Name: global_average_pooling2d_1, Layer Shape: keras->(1, 16) ncnn->(1, 1, 16)
Max: 	keras->1.879 ncnn->1.879 	Min: keras->0.187 ncnn->0.187
Mean: 	keras->0.908 ncnn->0.908 	Var: keras->0.483 ncnn->0.483
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.304 1.034 1.778 0.74  0.301 0.97  0.187 1.34  0.359 0.825]
Ncnn Feature Map: 	[1.304 1.034 1.778 0.74  0.301 0.97  0.187 1.34  0.359 0.825]
Top-k:
Keras Top-k: 	13:1.879, 2:1.778, 7:1.340, 0:1.304, 10:1.040
ncnn Top-k: 	13:1.879, 2:1.778, 7:1.340, 0:1.304, 10:1.040
==================================
Layer Name: dense_1, Layer Shape: keras->(1, 16) ncnn->(1, 1, 16)
Max: 	keras->2.866 ncnn->2.866 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->1.122 ncnn->1.122 	Var: keras->1.096 ncnn->1.096
Cosine Similarity: 0.00000
Keras Feature Map: 	[2.866 2.04  1.511 0.    1.259 1.065 0.    2.178 2.643 2.755]
Ncnn Feature Map: 	[2.866 2.04  1.511 0.    1.259 1.065 0.    2.178 2.643 2.755]
Top-k:
Keras Top-k: 	0:2.866, 9:2.755, 8:2.643, 7:2.178, 1:2.040
ncnn Top-k: 	0:2.866, 9:2.755, 8:2.643, 7:2.178, 1:2.040
==================================
Layer Name: dense_2, Layer Shape: keras->(1, 16) ncnn->(1, 1, 16)
Max: 	keras->0.444 ncnn->-0.278 	Min: keras->0.000 ncnn->-5.748
Mean: 	keras->0.184 ncnn->-1.846 	Var: keras->0.140 ncnn->1.280
Cosine Similarity: 1.37258
Keras Feature Map: 	[0.    0.248 0.169 0.063 0.444 0.231 0.045 0.    0.247 0.255]
Ncnn Feature Map: 	[-3.545 -1.258 -1.654 -2.183 -0.278 -1.346 -2.274 -5.748 -1.263 -1.226]
Top-k:
Keras Top-k: 	4:0.444, 12:0.437, 10:0.358, 9:0.255, 1:0.248
ncnn Top-k: 	4:-0.278, 12:-0.316, 10:-0.709, 9:-1.226, 1:-1.258
dense_2_HardSigmoid
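A note on the `dense_2` mismatch above: it appears to be a comparison-point artifact rather than a conversion bug. Keras fuses the hard-sigmoid activation into the Dense layer, so the keras dump is post-activation, while ncnn's `dense_2` output is pre-activation; the following `dense_2_HardSigmoid` op then applies it, and `reshape_1` below matches again. A minimal sketch (assuming Keras's ±2.5 hard-sigmoid convention, i.e. `clip(0.2*x + 0.5, 0, 1)`, which keras2ncnn defaults to per the thread above) reconciles the two feature maps:

```python
def hard_sigmoid_keras(x):
    # Keras convention: slope 0.2, clipped at +/- 2.5
    # (pytorch/ncnn's default clips at +/- 3 instead)
    return min(1.0, max(0.0, 0.2 * x + 0.5))

# Values copied from the dense_2 dump above: applying the keras
# hard_sigmoid to the ncnn (pre-activation) values reproduces the
# keras (post-activation) values.
ncnn = [-3.545, -1.258, -1.654, -2.183, -0.278, -1.346, -2.274, -5.748, -1.263, -1.226]
keras = [0.0, 0.248, 0.169, 0.063, 0.444, 0.231, 0.045, 0.0, 0.247, 0.255]
for n, k in zip(ncnn, keras):
    assert abs(hard_sigmoid_keras(n) - k) < 1e-3
```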
==================================
Layer Name: reshape_1, Layer Shape: keras->(1, 1, 1, 16) ncnn->(16, 1, 1)
Max: 	keras->0.444 ncnn->0.444 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.184 ncnn->0.184 	Var: keras->0.140 ncnn->0.140
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.]
Ncnn Feature Map: 	[0.]
==================================
Layer Name: multiply_1, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->1.491 ncnn->1.491 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.147 ncnn->0.147 	Var: keras->0.231 ncnn->0.231
Cosine Similarity: 0.00000
Keras Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Ncnn Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
==================================
Layer Name: conv2d_3, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->1.563 ncnn->1.563 	Min: keras->-1.326 ncnn->-1.326
Mean: 	keras->-0.054 ncnn->-0.054 	Var: keras->0.331 ncnn->0.331
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.085 -0.094 -0.159 -0.157 -0.55  -0.474 -0.494 -0.339  0.021 -0.043]
Ncnn Feature Map: 	[-0.085 -0.094 -0.159 -0.157 -0.55  -0.474 -0.494 -0.339  0.021 -0.043]
==================================
Layer Name: batch_normalization_4, Layer Shape: keras->(1, 40, 40, 16) ncnn->(16, 40, 40)
Max: 	keras->3.067 ncnn->3.067 	Min: keras->-2.500 ncnn->-2.500
Mean: 	keras->0.088 ncnn->0.088 	Var: keras->0.599 ncnn->0.599
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.572  0.552  0.397  0.402 -0.529 -0.35  -0.396 -0.03   0.824  0.674]
Ncnn Feature Map: 	[ 0.572  0.552  0.397  0.402 -0.529 -0.35  -0.396 -0.03   0.824  0.674]
==================================
Layer Name: conv2d_4, Layer Shape: keras->(1, 40, 40, 72) ncnn->(72, 40, 40)
Max: 	keras->3.235 ncnn->3.235 	Min: keras->-3.489 ncnn->-3.489
Mean: 	keras->0.042 ncnn->0.042 	Var: keras->0.595 ncnn->0.595
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.18   0.419 -0.184  0.568 -0.607  0.934 -0.008  0.865  0.367 -0.19 ]
Ncnn Feature Map: 	[ 0.18   0.419 -0.184  0.568 -0.607  0.934 -0.008  0.865  0.367 -0.19 ]
==================================
Layer Name: batch_normalization_5, Layer Shape: keras->(1, 40, 40, 72) ncnn->(72, 40, 40)
Max: 	keras->2.372 ncnn->2.372 	Min: keras->-3.108 ncnn->-3.108
Mean: 	keras->0.051 ncnn->0.051 	Var: keras->0.469 ncnn->0.469
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.068  0.1   -0.324  0.205 -0.622  0.462 -0.2    0.414  0.063 -0.329]
Ncnn Feature Map: 	[-0.068  0.1   -0.324  0.205 -0.622  0.462 -0.2    0.414  0.063 -0.329]
activation_4_Clip
==================================
Layer Name: activation_4, Layer Shape: keras->(1, 40, 40, 72) ncnn->(72, 40, 40)
Max: 	keras->2.372 ncnn->2.372 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.209 ncnn->0.209 	Var: keras->0.280 ncnn->0.280
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.    0.1   0.    0.205 0.    0.462 0.    0.414 0.063 0.   ]
Ncnn Feature Map: 	[0.    0.1   0.    0.205 0.    0.462 0.    0.414 0.063 0.   ]
activation_4_Split
==================================
Layer Name: depthwise_conv2d_2, Layer Shape: keras->(1, 20, 20, 72) ncnn->(72, 20, 20)
Max: 	keras->0.717 ncnn->0.717 	Min: keras->-0.813 ncnn->-0.813
Mean: 	keras->-0.001 ncnn->-0.001 	Var: keras->0.165 ncnn->0.165
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.037  0.094  0.093 -0.019 -0.143  0.075 -0.019  0.045  0.069 -0.034]
Ncnn Feature Map: 	[ 0.037  0.094  0.093 -0.019 -0.143  0.075 -0.019  0.045  0.069 -0.034]
==================================
Layer Name: batch_normalization_6, Layer Shape: keras->(1, 20, 20, 72) ncnn->(72, 20, 20)
Max: 	keras->4.217 ncnn->4.217 	Min: keras->-4.771 ncnn->-4.771
Mean: 	keras->0.017 ncnn->0.017 	Var: keras->0.652 ncnn->0.652
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.341  1.237  1.209 -0.533 -2.466  0.928 -0.525  0.467  0.838 -0.763]
Ncnn Feature Map: 	[ 0.341  1.237  1.209 -0.533 -2.466  0.928 -0.525  0.467  0.838 -0.763]
activation_5_Clip
==================================
Layer Name: activation_5, Layer Shape: keras->(1, 20, 20, 72) ncnn->(72, 20, 20)
Max: 	keras->4.217 ncnn->4.217 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.262 ncnn->0.262 	Var: keras->0.379 ncnn->0.379
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.341 1.237 1.209 0.    0.    0.928 0.    0.467 0.838 0.   ]
Ncnn Feature Map: 	[0.341 1.237 1.209 0.    0.    0.928 0.    0.467 0.838 0.   ]
==================================
Layer Name: conv2d_5, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->2.272 ncnn->2.272 	Min: keras->-2.808 ncnn->-2.808
Mean: 	keras->-0.136 ncnn->-0.136 	Var: keras->0.747 ncnn->0.747
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.868 -0.985 -0.853 -2.025 -1.587 -1.621 -0.921 -0.725  0.099 -1.392]
Ncnn Feature Map: 	[-0.868 -0.985 -0.853 -2.025 -1.587 -1.621 -0.921 -0.725  0.099 -1.392]
==================================
Layer Name: batch_normalization_7, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->1.903 ncnn->1.903 	Min: keras->-2.691 ncnn->-2.691
Mean: 	keras->-0.022 ncnn->-0.022 	Var: keras->0.538 ncnn->0.538
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.094 -0.169 -0.084 -0.83  -0.552 -0.573 -0.128 -0.003  0.521 -0.428]
Ncnn Feature Map: 	[-0.094 -0.169 -0.084 -0.83  -0.552 -0.573 -0.128 -0.003  0.521 -0.428]
batch_normalization_7_Split
==================================
Layer Name: conv2d_6, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.724 ncnn->2.724 	Min: keras->-2.292 ncnn->-2.292
Mean: 	keras->0.219 ncnn->0.219 	Var: keras->0.609 ncnn->0.609
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.664  0.039  0.752  0.964  0.436 -0.067  0.377  0.277 -0.266  0.023]
Ncnn Feature Map: 	[ 0.664  0.039  0.752  0.964  0.436 -0.067  0.377  0.277 -0.266  0.023]
==================================
Layer Name: batch_normalization_8, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.595 ncnn->2.595 	Min: keras->-2.052 ncnn->-2.052
Mean: 	keras->0.289 ncnn->0.289 	Var: keras->0.550 ncnn->0.550
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.74   0.134  0.826  1.032  0.519  0.031  0.462  0.365 -0.162  0.118]
Ncnn Feature Map: 	[ 0.74   0.134  0.826  1.032  0.519  0.031  0.462  0.365 -0.162  0.118]
activation_6_Clip
==================================
Layer Name: activation_6, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.595 ncnn->2.595 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.394 ncnn->0.394 	Var: keras->0.410 ncnn->0.410
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.74  0.134 0.826 1.032 0.519 0.031 0.462 0.365 0.    0.118]
Ncnn Feature Map: 	[0.74  0.134 0.826 1.032 0.519 0.031 0.462 0.365 0.    0.118]
==================================
Layer Name: depthwise_conv2d_3, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->0.873 ncnn->0.873 	Min: keras->-1.067 ncnn->-1.067
Mean: 	keras->-0.028 ncnn->-0.028 	Var: keras->0.192 ncnn->0.192
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.036 -0.011  0.189 -0.088 -0.257 -0.096  0.082 -0.078 -0.062  0.206]
Ncnn Feature Map: 	[ 0.036 -0.011  0.189 -0.088 -0.257 -0.096  0.082 -0.078 -0.062  0.206]
==================================
Layer Name: batch_normalization_9, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.790 ncnn->2.790 	Min: keras->-4.029 ncnn->-4.029
Mean: 	keras->-0.024 ncnn->-0.024 	Var: keras->0.626 ncnn->0.626
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.176 -0.06   0.953 -0.452 -1.315 -0.496  0.41  -0.404 -0.322  1.041]
Ncnn Feature Map: 	[ 0.176 -0.06   0.953 -0.452 -1.315 -0.496  0.41  -0.404 -0.322  1.041]
activation_7_Clip
==================================
Layer Name: activation_7, Layer Shape: keras->(1, 20, 20, 88) ncnn->(88, 20, 20)
Max: 	keras->2.790 ncnn->2.790 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.232 ncnn->0.232 	Var: keras->0.348 ncnn->0.348
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.176 0.    0.953 0.    0.    0.    0.41  0.    0.    1.041]
Ncnn Feature Map: 	[0.176 0.    0.953 0.    0.    0.    0.41  0.    0.    1.041]
==================================
Layer Name: conv2d_7, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->3.889 ncnn->3.889 	Min: keras->-2.990 ncnn->-2.990
Mean: 	keras->-0.107 ncnn->-0.107 	Var: keras->0.735 ncnn->0.735
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.909 -0.839 -1.203  0.136 -1.033 -0.016 -0.018  0.113  0.438  0.264]
Ncnn Feature Map: 	[-0.909 -0.839 -1.203  0.136 -1.033 -0.016 -0.018  0.113  0.438  0.264]
==================================
Layer Name: batch_normalization_10, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->2.404 ncnn->2.404 	Min: keras->-2.459 ncnn->-2.459
Mean: 	keras->-0.139 ncnn->-0.139 	Var: keras->0.601 ncnn->0.601
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.128 -0.068 -0.381  0.773 -0.235  0.641  0.64   0.753  1.033  0.883]
Ncnn Feature Map: 	[-0.128 -0.068 -0.381  0.773 -0.235  0.641  0.64   0.753  1.033  0.883]
==================================
Layer Name: add_1, Layer Shape: keras->(1, 20, 20, 24) ncnn->(24, 20, 20)
Max: 	keras->3.039 ncnn->3.039 	Min: keras->-3.317 ncnn->-3.317
Mean: 	keras->-0.161 ncnn->-0.161 	Var: keras->0.886 ncnn->0.886
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.222 -0.236 -0.465 -0.058 -0.787  0.068  0.512  0.75   1.554  0.455]
Ncnn Feature Map: 	[-0.222 -0.236 -0.465 -0.058 -0.787  0.068  0.512  0.75   1.554  0.455]
==================================
Layer Name: conv2d_8, Layer Shape: keras->(1, 20, 20, 96) ncnn->(96, 20, 20)
Max: 	keras->4.521 ncnn->4.521 	Min: keras->-3.424 ncnn->-3.424
Mean: 	keras->0.315 ncnn->0.315 	Var: keras->0.967 ncnn->0.967
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 1.64   1.558  0.992 -0.494  0.449  1.736  2.111  2.111  2.707  1.929]
Ncnn Feature Map: 	[ 1.64   1.558  0.992 -0.494  0.449  1.736  2.111  2.111  2.707  1.929]
==================================
Layer Name: batch_normalization_11, Layer Shape: keras->(1, 20, 20, 96) ncnn->(96, 20, 20)
Max: 	keras->2.817 ncnn->2.817 	Min: keras->-2.428 ncnn->-2.428
Mean: 	keras->-0.062 ncnn->-0.062 	Var: keras->0.577 ncnn->0.577
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.646  0.602  0.297 -0.503  0.005  0.697  0.9    0.899  1.22   0.802]
Ncnn Feature Map: 	[ 0.646  0.602  0.297 -0.503  0.005  0.697  0.9    0.899  1.22   0.802]
==================================
Layer Name: activation_8, Layer Shape: keras->(1, 20, 20, 96) ncnn->(96, 20, 20)
Max: 	keras->2.731 ncnn->2.731 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.025 ncnn->0.025 	Var: keras->0.287 ncnn->0.287
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.393  0.361  0.163 -0.209  0.002  0.43   0.585  0.584  0.858  0.508]
Ncnn Feature Map: 	[ 0.393  0.361  0.163 -0.209  0.002  0.43   0.585  0.584  0.858  0.508]
activation_8_Split
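The `Min: -0.375` reported for `activation_8` (and the later hard-swish activations) is itself a sanity check: it is exactly the global minimum of the `_hard_swish` function from the thread, `x * relu6(x + 3) / 6`, reached at `x = -1.5`. A small sketch verifying this against values from the dump (inputs taken from the `batch_normalization_11` feature map, outputs from `activation_8`):

```python
def hard_swish(x):
    # matches the model's _hard_swish: x * relu6(x + 3) / 6
    return x * min(6.0, max(0.0, x + 3.0)) / 6.0

# Global minimum at x = -1.5 is -0.375, the Min seen above.
assert abs(hard_swish(-1.5) - (-0.375)) < 1e-9

# Spot-check against the dump: bn_11 -> activation_8
assert abs(hard_swish(-0.503) - (-0.209)) < 1e-3
assert abs(hard_swish(0.646) - 0.393) < 1e-3
```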
==================================
Layer Name: depthwise_conv2d_4, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->1.250 ncnn->1.250 	Min: keras->-1.033 ncnn->-1.033
Mean: 	keras->0.014 ncnn->0.014 	Var: keras->0.253 ncnn->0.253
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.583 -0.209 -0.053  0.1    0.353  0.096  0.253  0.113 -0.061  0.236]
Ncnn Feature Map: 	[ 0.583 -0.209 -0.053  0.1    0.353  0.096  0.253  0.113 -0.061  0.236]
==================================
Layer Name: batch_normalization_12, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->3.476 ncnn->3.476 	Min: keras->-3.249 ncnn->-3.249
Mean: 	keras->0.039 ncnn->0.039 	Var: keras->0.605 ncnn->0.605
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 1.84  -0.872 -0.337  0.184  1.053  0.171  0.708  0.23  -0.365  0.652]
Ncnn Feature Map: 	[ 1.84  -0.872 -0.337  0.184  1.053  0.171  0.708  0.23  -0.365  0.652]
==================================
Layer Name: activation_9, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->3.476 ncnn->3.476 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.081 ncnn->0.081 	Var: keras->0.358 ncnn->0.358
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 1.485 -0.309 -0.149  0.097  0.711  0.09   0.437  0.124 -0.16   0.397]
Ncnn Feature Map: 	[ 1.485 -0.309 -0.149  0.097  0.711  0.09   0.437  0.124 -0.16   0.397]
activation_9_Split
==================================
Layer Name: global_average_pooling2d_2, Layer Shape: keras->(1, 96) ncnn->(1, 1, 96)
Max: 	keras->1.469 ncnn->1.469 	Min: keras->-0.300 ncnn->-0.300
Mean: 	keras->0.081 ncnn->0.081 	Var: keras->0.283 ncnn->0.283
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.    -0.051 -0.075  0.079 -0.171 -0.054 -0.042 -0.222  0.719 -0.257]
Ncnn Feature Map: 	[-0.    -0.051 -0.075  0.079 -0.171 -0.054 -0.042 -0.222  0.719 -0.257]
Top-k:
Keras Top-k: 	51:1.469, 80:0.972, 88:0.859, 8:0.719, 36:0.640
ncnn Top-k: 	51:1.469, 80:0.972, 88:0.859, 8:0.719, 36:0.640
==================================
Layer Name: dense_3, Layer Shape: keras->(1, 96) ncnn->(1, 1, 96)
Max: 	keras->1.656 ncnn->1.656 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.170 ncnn->0.170 	Var: keras->0.349 ncnn->0.349
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.    0.    0.    0.645 0.    0.    0.212 0.85  1.151 0.   ]
Ncnn Feature Map: 	[0.    0.    0.    0.645 0.    0.    0.212 0.85  1.151 0.   ]
Top-k:
Keras Top-k: 	49:1.656, 27:1.375, 8:1.151, 67:1.130, 72:0.977
ncnn Top-k: 	49:1.656, 27:1.375, 8:1.151, 67:1.130, 72:0.977
==================================
Layer Name: dense_4, Layer Shape: keras->(1, 96) ncnn->(1, 1, 96)
Max: 	keras->1.000 ncnn->3.115 	Min: keras->0.017 ncnn->-2.415
Mean: 	keras->0.643 ncnn->0.728 	Var: keras->0.230 ncnn->1.170
Cosine Similarity: 0.21749
Keras Feature Map: 	[0.484 0.804 0.896 0.017 0.127 0.605 0.735 0.521 0.778 0.656]
Ncnn Feature Map: 	[-0.079  1.519  1.979 -2.415 -1.864  0.525  1.175  0.104  1.389  0.779]
Top-k:
Keras Top-k: 	50:1.000, 90:1.000, 20:1.000, 46:1.000, 91:0.990
ncnn Top-k: 	46:3.115, 90:2.904, 20:2.611, 50:2.607, 91:2.449
dense_4_HardSigmoid
==================================
Layer Name: reshape_2, Layer Shape: keras->(1, 1, 1, 96) ncnn->(96, 1, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.017 ncnn->0.017
Mean: 	keras->0.643 ncnn->0.643 	Var: keras->0.230 ncnn->0.230
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.484]
Ncnn Feature Map: 	[0.484]
==================================
Layer Name: multiply_2, Layer Shape: keras->(1, 10, 10, 96) ncnn->(96, 10, 10)
Max: 	keras->2.636 ncnn->2.636 	Min: keras->-0.371 ncnn->-0.371
Mean: 	keras->0.050 ncnn->0.050 	Var: keras->0.244 ncnn->0.244
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.719 -0.15  -0.072  0.047  0.344  0.044  0.212  0.06  -0.078  0.192]
Ncnn Feature Map: 	[ 0.719 -0.15  -0.072  0.047  0.344  0.044  0.212  0.06  -0.078  0.192]
==================================
Layer Name: conv2d_9, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.295 ncnn->2.295 	Min: keras->-2.253 ncnn->-2.253
Mean: 	keras->-0.033 ncnn->-0.033 	Var: keras->0.576 ncnn->0.576
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.123 0.761 0.532 0.473 0.426 0.504 0.595 0.665 0.812 1.215]
Ncnn Feature Map: 	[1.123 0.761 0.532 0.473 0.426 0.504 0.595 0.665 0.812 1.215]
==================================
Layer Name: batch_normalization_13, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.738 ncnn->2.738 	Min: keras->-2.600 ncnn->-2.600
Mean: 	keras->-0.002 ncnn->-0.002 	Var: keras->0.708 ncnn->0.708
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.484 1.016 0.721 0.645 0.584 0.685 0.802 0.892 1.082 1.603]
Ncnn Feature Map: 	[1.484 1.016 0.721 0.645 0.584 0.685 0.802 0.892 1.082 1.603]
batch_normalization_13_Split
==================================
Layer Name: conv2d_10, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->4.744 ncnn->4.744 	Min: keras->-3.609 ncnn->-3.609
Mean: 	keras->0.088 ncnn->0.088 	Var: keras->1.002 ncnn->1.002
Cosine Similarity: -0.00000
Keras Feature Map: 	[1.362 1.119 1.398 1.128 1.381 0.941 2.454 1.242 1.175 1.389]
Ncnn Feature Map: 	[1.362 1.119 1.398 1.128 1.381 0.941 2.454 1.242 1.175 1.389]
==================================
Layer Name: batch_normalization_14, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.461 ncnn->3.461 	Min: keras->-3.410 ncnn->-3.410
Mean: 	keras->-0.019 ncnn->-0.019 	Var: keras->0.823 ncnn->0.823
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.896 0.734 0.92  0.74  0.909 0.616 1.623 0.816 0.772 0.914]
Ncnn Feature Map: 	[0.896 0.734 0.92  0.74  0.909 0.616 1.623 0.816 0.772 0.914]
==================================
Layer Name: activation_10, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.461 ncnn->3.461 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.103 ncnn->0.103 	Var: keras->0.436 ncnn->0.436
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.582 0.457 0.601 0.461 0.592 0.371 1.251 0.519 0.485 0.596]
Ncnn Feature Map: 	[0.582 0.457 0.601 0.461 0.592 0.371 1.251 0.519 0.485 0.596]
==================================
Layer Name: depthwise_conv2d_5, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->2.406 ncnn->2.406 	Min: keras->-2.650 ncnn->-2.650
Mean: 	keras->-0.031 ncnn->-0.031 	Var: keras->0.353 ncnn->0.353
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.925 -0.671 -0.34  -0.279 -0.423 -0.151 -0.409 -0.56  -0.118  0.067]
Ncnn Feature Map: 	[-0.925 -0.671 -0.34  -0.279 -0.423 -0.151 -0.409 -0.56  -0.118  0.067]
==================================
Layer Name: batch_normalization_15, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.631 ncnn->3.631 	Min: keras->-4.769 ncnn->-4.769
Mean: 	keras->-0.125 ncnn->-0.125 	Var: keras->0.692 ncnn->0.692
Cosine Similarity: -0.00000
Keras Feature Map: 	[-1.926 -1.327 -0.549 -0.405 -0.743 -0.102 -0.71  -1.067 -0.024  0.41 ]
Ncnn Feature Map: 	[-1.926 -1.327 -0.549 -0.405 -0.743 -0.102 -0.71  -1.067 -0.024  0.41 ]
==================================
Layer Name: activation_11, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.631 ncnn->3.631 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.020 ncnn->0.020 	Var: keras->0.323 ncnn->0.323
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.345 -0.37  -0.224 -0.175 -0.279 -0.049 -0.271 -0.344 -0.012  0.233]
Ncnn Feature Map: 	[-0.345 -0.37  -0.224 -0.175 -0.279 -0.049 -0.271 -0.344 -0.012  0.233]
activation_11_Split
==================================
Layer Name: global_average_pooling2d_3, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.059 ncnn->1.059 	Min: keras->-0.365 ncnn->-0.365
Mean: 	keras->0.020 ncnn->0.020 	Var: keras->0.233 ncnn->0.233
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.237 -0.214 -0.044 -0.226  0.362  0.044  0.119  0.052  0.215 -0.088]
Ncnn Feature Map: 	[-0.237 -0.214 -0.044 -0.226  0.362  0.044  0.119  0.052  0.215 -0.088]
Top-k:
Keras Top-k: 	40:1.059, 17:1.006, 188:0.954, 119:0.905, 58:0.748
ncnn Top-k: 	40:1.059, 17:1.006, 188:0.954, 119:0.905, 58:0.748
==================================
Layer Name: dense_5, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->2.975 ncnn->2.975 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.156 ncnn->0.156 	Var: keras->0.480 ncnn->0.480
Cosine Similarity: 0.00000
Keras Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Ncnn Feature Map: 	[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
Top-k:
Keras Top-k: 	135:2.975, 143:2.787, 89:2.622, 139:2.540, 66:2.306
ncnn Top-k: 	135:2.975, 143:2.787, 89:2.622, 139:2.540, 66:2.306
==================================
Layer Name: dense_6, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.000 ncnn->6.180 	Min: keras->0.000 ncnn->-4.345
Mean: 	keras->0.595 ncnn->0.662 	Var: keras->0.319 ncnn->2.133
Cosine Similarity: 0.30413
Keras Feature Map: 	[0.548 0.348 0.502 0.674 0.201 1.    0.79  0.    0.698 0.578]
Ncnn Feature Map: 	[ 0.24  -0.761  0.009  0.868 -1.495  3.343  1.448 -4.345  0.99   0.39 ]
Top-k:
Keras Top-k: 	119:1.000, 204:1.000, 213:1.000, 61:1.000, 211:1.000
ncnn Top-k: 	218:6.180, 68:6.154, 73:5.663, 209:5.245, 180:5.187
dense_6_HardSigmoid
==================================
Layer Name: reshape_3, Layer Shape: keras->(1, 1, 1, 240) ncnn->(240, 1, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.595 ncnn->0.595 	Var: keras->0.319 ncnn->0.319
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.548]
Ncnn Feature Map: 	[0.548]
==================================
Layer Name: multiply_3, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.631 ncnn->3.631 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.023 ncnn->0.023 	Var: keras->0.242 ncnn->0.242
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.189 -0.203 -0.123 -0.096 -0.153 -0.027 -0.148 -0.188 -0.007  0.128]
Ncnn Feature Map: 	[-0.189 -0.203 -0.123 -0.096 -0.153 -0.027 -0.148 -0.188 -0.007  0.128]
==================================
Layer Name: conv2d_11, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.978 ncnn->2.978 	Min: keras->-2.601 ncnn->-2.601
Mean: 	keras->0.063 ncnn->0.063 	Var: keras->0.872 ncnn->0.872
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.139 0.478 0.331 0.807 0.603 0.808 0.914 0.989 0.978 0.956]
Ncnn Feature Map: 	[1.139 0.478 0.331 0.807 0.603 0.808 0.914 0.989 0.978 0.956]
==================================
Layer Name: batch_normalization_16, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->2.349 ncnn->2.349 	Min: keras->-2.250 ncnn->-2.250
Mean: 	keras->0.059 ncnn->0.059 	Var: keras->0.691 ncnn->0.691
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.073 0.498 0.371 0.784 0.607 0.785 0.877 0.942 0.933 0.914]
Ncnn Feature Map: 	[1.073 0.498 0.371 0.784 0.607 0.785 0.877 0.942 0.933 0.914]
==================================
Layer Name: add_2, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->3.798 ncnn->3.798 	Min: keras->-3.213 ncnn->-3.213
Mean: 	keras->0.057 ncnn->0.057 	Var: keras->1.105 ncnn->1.105
Cosine Similarity: -0.00000
Keras Feature Map: 	[2.557 1.514 1.092 1.43  1.191 1.47  1.679 1.834 2.015 2.516]
Ncnn Feature Map: 	[2.557 1.514 1.092 1.43  1.191 1.47  1.679 1.834 2.015 2.516]
add_2_Split
==================================
Layer Name: conv2d_12, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->5.700 ncnn->5.700 	Min: keras->-5.146 ncnn->-5.146
Mean: 	keras->0.111 ncnn->0.111 	Var: keras->1.410 ncnn->1.410
Cosine Similarity: 0.00000
Keras Feature Map: 	[3.894 1.588 0.957 1.504 1.118 1.124 0.215 0.823 0.036 0.035]
Ncnn Feature Map: 	[3.894 1.588 0.957 1.504 1.118 1.124 0.215 0.823 0.036 0.035]
==================================
Layer Name: batch_normalization_17, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.964 ncnn->3.964 	Min: keras->-3.365 ncnn->-3.365
Mean: 	keras->-0.061 ncnn->-0.061 	Var: keras->0.872 ncnn->0.872
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.91  0.801 0.498 0.76  0.575 0.578 0.141 0.433 0.055 0.054]
Ncnn Feature Map: 	[1.91  0.801 0.498 0.76  0.575 0.578 0.141 0.433 0.055 0.054]
==================================
Layer Name: activation_12, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->3.964 ncnn->3.964 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.097 ncnn->0.097 	Var: keras->0.470 ncnn->0.470
Cosine Similarity: -0.00000
Keras Feature Map: 	[1.563 0.507 0.29  0.477 0.343 0.345 0.074 0.248 0.028 0.028]
Ncnn Feature Map: 	[1.563 0.507 0.29  0.477 0.343 0.345 0.074 0.248 0.028 0.028]
==================================
Layer Name: depthwise_conv2d_6, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->2.642 ncnn->2.642 	Min: keras->-3.390 ncnn->-3.390
Mean: 	keras->-0.014 ncnn->-0.014 	Var: keras->0.380 ncnn->0.380
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.132 -0.291 -0.139 -0.167 -0.239 -0.187 -0.056 -0.018 -0.054 -0.044]
Ncnn Feature Map: 	[-0.132 -0.291 -0.139 -0.167 -0.239 -0.187 -0.056 -0.018 -0.054 -0.044]
==================================
Layer Name: batch_normalization_18, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->5.341 ncnn->5.341 	Min: keras->-5.191 ncnn->-5.191
Mean: 	keras->-0.129 ncnn->-0.129 	Var: keras->0.785 ncnn->0.785
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.227 -0.571 -0.242 -0.302 -0.457 -0.345 -0.062  0.021 -0.058 -0.036]
Ncnn Feature Map: 	[-0.227 -0.571 -0.242 -0.302 -0.457 -0.345 -0.062  0.021 -0.058 -0.036]
==================================
Layer Name: activation_13, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->5.341 ncnn->5.341 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.038 ncnn->0.038 	Var: keras->0.412 ncnn->0.412
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.105 -0.231 -0.111 -0.136 -0.194 -0.153 -0.03   0.011 -0.029 -0.018]
Ncnn Feature Map: 	[-0.105 -0.231 -0.111 -0.136 -0.194 -0.153 -0.03   0.011 -0.029 -0.018]
activation_13_Split
==================================
Layer Name: global_average_pooling2d_4, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.583 ncnn->1.583 	Min: keras->-0.362 ncnn->-0.362
Mean: 	keras->0.038 ncnn->0.038 	Var: keras->0.295 ncnn->0.295
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.201  0.241  0.006 -0.306 -0.046 -0.099 -0.101  0.141  1.04   0.025]
Ncnn Feature Map: 	[-0.201  0.241  0.006 -0.306 -0.046 -0.099 -0.101  0.141  1.04   0.025]
Top-k:
Keras Top-k: 	213:1.583, 228:1.574, 191:1.236, 100:1.145, 8:1.040
ncnn Top-k: 	213:1.583, 228:1.574, 191:1.236, 100:1.145, 8:1.040
==================================
Layer Name: dense_7, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->3.623 ncnn->3.623 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.333 ncnn->0.333 	Var: keras->0.704 ncnn->0.704
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.    0.391 0.    0.    0.    0.    2.551 0.    0.258 1.287]
Ncnn Feature Map: 	[0.    0.391 0.    0.    0.    0.    2.551 0.    0.258 1.287]
Top-k:
Keras Top-k: 	42:3.623, 50:3.035, 25:2.848, 127:2.838, 177:2.741
ncnn Top-k: 	42:3.623, 50:3.035, 25:2.848, 127:2.838, 177:2.741
==================================
Layer Name: dense_8, Layer Shape: keras->(1, 240) ncnn->(1, 1, 240)
Max: 	keras->1.000 ncnn->15.710 	Min: keras->0.000 ncnn->-11.371
Mean: 	keras->0.621 ncnn->1.740 	Var: keras->0.419 ncnn->5.253
Cosine Similarity: 0.27101
Keras Feature Map: 	[0.275 1.    1.    1.    0.    0.    1.    0.925 1.    0.   ]
Ncnn Feature Map: 	[-1.127  4.679  4.563  5.134 -2.892 -4.67   4.582  2.127  7.942 -4.625]
Top-k:
Keras Top-k: 	239:1.000, 113:1.000, 107:1.000, 104:1.000, 101:1.000
ncnn Top-k: 	18:15.710, 154:14.086, 180:13.551, 48:13.237, 210:12.894
dense_8_HardSigmoid
==================================
Layer Name: reshape_4, Layer Shape: keras->(1, 1, 1, 240) ncnn->(240, 1, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.621 ncnn->0.621 	Var: keras->0.419 ncnn->0.419
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.275]
Ncnn Feature Map: 	[0.275]
==================================
Layer Name: multiply_4, Layer Shape: keras->(1, 10, 10, 240) ncnn->(240, 10, 10)
Max: 	keras->4.369 ncnn->4.369 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.049 ncnn->0.049 	Var: keras->0.337 ncnn->0.337
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.029 -0.063 -0.03  -0.037 -0.053 -0.042 -0.008  0.003 -0.008 -0.005]
Ncnn Feature Map: 	[-0.029 -0.063 -0.03  -0.037 -0.053 -0.042 -0.008  0.003 -0.008 -0.005]
==================================
Layer Name: conv2d_13, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->4.176 ncnn->4.176 	Min: keras->-4.651 ncnn->-4.651
Mean: 	keras->0.083 ncnn->0.083 	Var: keras->1.220 ncnn->1.220
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.774 -0.85  -0.52  -0.661 -0.649 -0.72  -1.355 -0.982 -0.45  -0.273]
Ncnn Feature Map: 	[-0.774 -0.85  -0.52  -0.661 -0.649 -0.72  -1.355 -0.982 -0.45  -0.273]
==================================
Layer Name: batch_normalization_19, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->4.505 ncnn->4.505 	Min: keras->-4.563 ncnn->-4.563
Mean: 	keras->0.061 ncnn->0.061 	Var: keras->1.175 ncnn->1.175
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.003 -1.077 -0.76  -0.895 -0.884 -0.952 -1.562 -1.204 -0.693 -0.522]
Ncnn Feature Map: 	[-1.003 -1.077 -0.76  -0.895 -0.884 -0.952 -1.562 -1.204 -0.693 -0.522]
==================================
Layer Name: add_3, Layer Shape: keras->(1, 10, 10, 40) ncnn->(40, 10, 10)
Max: 	keras->7.882 ncnn->7.882 	Min: keras->-5.970 ncnn->-5.970
Mean: 	keras->0.118 ncnn->0.118 	Var: keras->1.745 ncnn->1.745
Cosine Similarity: -0.00000
Keras Feature Map: 	[1.553 0.438 0.332 0.535 0.307 0.518 0.117 0.631 1.322 1.994]
Ncnn Feature Map: 	[1.553 0.438 0.332 0.535 0.307 0.518 0.117 0.631 1.322 1.994]
==================================
Layer Name: conv2d_14, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->9.958 ncnn->9.958 	Min: keras->-7.671 ncnn->-7.671
Mean: 	keras->-0.272 ncnn->-0.272 	Var: keras->2.322 ncnn->2.322
Cosine Similarity: -0.00000
Keras Feature Map: 	[-4.425 -0.283  0.195  0.222  0.584  0.037  0.234  0.584  1.577  3.714]
Ncnn Feature Map: 	[-4.425 -0.283  0.195  0.222  0.584  0.037  0.234  0.584  1.577  3.714]
==================================
Layer Name: batch_normalization_20, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->4.091 ncnn->4.091 	Min: keras->-4.205 ncnn->-4.205
Mean: 	keras->-0.268 ncnn->-0.268 	Var: keras->1.041 ncnn->1.041
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.385 -0.276 -0.148 -0.141 -0.044 -0.19  -0.137 -0.044  0.222  0.794]
Ncnn Feature Map: 	[-1.385 -0.276 -0.148 -0.141 -0.044 -0.19  -0.137 -0.044  0.222  0.794]
==================================
Layer Name: activation_14, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->4.091 ncnn->4.091 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.058 ncnn->0.058 	Var: keras->0.504 ncnn->0.504
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.373 -0.125 -0.07  -0.067 -0.021 -0.089 -0.065 -0.021  0.119  0.502]
Ncnn Feature Map: 	[-0.373 -0.125 -0.07  -0.067 -0.021 -0.089 -0.065 -0.021  0.119  0.502]
==================================
Layer Name: depthwise_conv2d_7, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->3.342 ncnn->3.342 	Min: keras->-4.673 ncnn->-4.673
Mean: 	keras->-0.039 ncnn->-0.039 	Var: keras->0.440 ncnn->0.440
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.106 -0.019 -0.066 -0.067 -0.05  -0.038  0.004  0.026  0.088  0.179]
Ncnn Feature Map: 	[-0.106 -0.019 -0.066 -0.067 -0.05  -0.038  0.004  0.026  0.088  0.179]
==================================
Layer Name: batch_normalization_21, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->5.117 ncnn->5.117 	Min: keras->-5.029 ncnn->-5.029
Mean: 	keras->-0.125 ncnn->-0.125 	Var: keras->0.819 ncnn->0.819
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.455 -0.165 -0.322 -0.323 -0.266 -0.228 -0.087 -0.013  0.193  0.498]
Ncnn Feature Map: 	[-0.455 -0.165 -0.322 -0.323 -0.266 -0.228 -0.087 -0.013  0.193  0.498]
==================================
Layer Name: activation_15, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->5.117 ncnn->5.117 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.049 ncnn->0.049 	Var: keras->0.452 ncnn->0.452
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.193 -0.078 -0.144 -0.144 -0.121 -0.105 -0.042 -0.006  0.103  0.29 ]
Ncnn Feature Map: 	[-0.193 -0.078 -0.144 -0.144 -0.121 -0.105 -0.042 -0.006  0.103  0.29 ]
activation_15_Split
==================================
Layer Name: global_average_pooling2d_5, Layer Shape: keras->(1, 120) ncnn->(1, 1, 120)
Max: 	keras->1.466 ncnn->1.466 	Min: keras->-0.333 ncnn->-0.333
Mean: 	keras->0.049 ncnn->0.049 	Var: keras->0.320 ncnn->0.320
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.002  0.26  -0.19   0.438 -0.068 -0.064 -0.163 -0.129 -0.124 -0.002]
Ncnn Feature Map: 	[-0.002  0.26  -0.19   0.438 -0.068 -0.064 -0.163 -0.129 -0.124 -0.002]
Top-k:
Keras Top-k: 	104:1.466, 39:1.301, 36:1.151, 46:1.072, 61:0.882
ncnn Top-k: 	104:1.466, 39:1.301, 36:1.151, 46:1.072, 61:0.882
==================================
Layer Name: dense_9, Layer Shape: keras->(1, 120) ncnn->(1, 1, 120)
Max: 	keras->2.904 ncnn->2.904 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.320 ncnn->0.320 	Var: keras->0.598 ncnn->0.598
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.13  1.226 0.    0.19  0.    0.167 0.639 0.905 0.    0.   ]
Ncnn Feature Map: 	[1.13  1.226 0.    0.19  0.    0.167 0.639 0.905 0.    0.   ]
Top-k:
Keras Top-k: 	46:2.904, 62:2.601, 114:2.218, 61:1.815, 13:1.787
ncnn Top-k: 	46:2.904, 62:2.601, 114:2.218, 61:1.815, 13:1.787
==================================
Layer Name: dense_10, Layer Shape: keras->(1, 120) ncnn->(1, 1, 120)
Max: 	keras->1.000 ncnn->7.559 	Min: keras->0.000 ncnn->-4.197
Mean: 	keras->0.717 ncnn->1.689 	Var: keras->0.321 ncnn->2.405
Cosine Similarity: 0.16729
Keras Feature Map: 	[0.799 0.902 0.785 1.    0.87  1.    0.246 0.883 0.248 0.041]
Ncnn Feature Map: 	[ 1.495  2.01   1.425  4.528  1.848  4.502 -1.271  1.913 -1.261 -2.296]
Top-k:
Keras Top-k: 	119:1.000, 74:1.000, 87:1.000, 26:1.000, 27:1.000
ncnn Top-k: 	69:7.559, 33:6.625, 46:6.592, 108:5.887, 77:5.853
dense_10_HardSigmoid
==================================
Layer Name: reshape_5, Layer Shape: keras->(1, 1, 1, 120) ncnn->(120, 1, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.717 ncnn->0.717 	Var: keras->0.321 ncnn->0.321
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.799]
Ncnn Feature Map: 	[0.799]
==================================
Layer Name: multiply_5, Layer Shape: keras->(1, 10, 10, 120) ncnn->(120, 10, 10)
Max: 	keras->4.772 ncnn->4.772 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.051 ncnn->0.051 	Var: keras->0.383 ncnn->0.383
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.154 -0.062 -0.115 -0.115 -0.097 -0.084 -0.034 -0.005  0.082  0.232]
Ncnn Feature Map: 	[-0.154 -0.062 -0.115 -0.115 -0.097 -0.084 -0.034 -0.005  0.082  0.232]
==================================
Layer Name: conv2d_15, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->3.744 ncnn->3.744 	Min: keras->-4.028 ncnn->-4.028
Mean: 	keras->-0.240 ncnn->-0.240 	Var: keras->0.997 ncnn->0.997
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.653  0.599  0.707  0.53   0.459  0.336  0.434  0.388  0.45   0.542]
Ncnn Feature Map: 	[-0.653  0.599  0.707  0.53   0.459  0.336  0.434  0.388  0.45   0.542]
==================================
Layer Name: batch_normalization_22, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->4.949 ncnn->4.949 	Min: keras->-4.603 ncnn->-4.603
Mean: 	keras->-0.269 ncnn->-0.269 	Var: keras->1.286 ncnn->1.286
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.844  0.691  0.823  0.606  0.519  0.369  0.488  0.433  0.508  0.621]
Ncnn Feature Map: 	[-0.844  0.691  0.823  0.606  0.519  0.369  0.488  0.433  0.508  0.621]
batch_normalization_22_Split
==================================
Layer Name: conv2d_16, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->7.216 ncnn->7.216 	Min: keras->-7.883 ncnn->-7.883
Mean: 	keras->-0.464 ncnn->-0.464 	Var: keras->1.972 ncnn->1.972
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.041 0.706 0.101 1.03  0.953 0.232 0.588 0.332 0.144 0.45 ]
Ncnn Feature Map: 	[0.041 0.706 0.101 1.03  0.953 0.232 0.588 0.332 0.144 0.45 ]
==================================
Layer Name: batch_normalization_23, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.700 ncnn->6.700 	Min: keras->-7.264 ncnn->-7.264
Mean: 	keras->-0.520 ncnn->-0.520 	Var: keras->1.427 ncnn->1.427
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.068 0.378 0.096 0.529 0.493 0.157 0.323 0.203 0.116 0.259]
Ncnn Feature Map: 	[0.068 0.378 0.096 0.529 0.493 0.157 0.323 0.203 0.116 0.259]
==================================
Layer Name: activation_16, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.700 ncnn->6.700 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.096 ncnn->0.096 	Var: keras->0.688 ncnn->0.688
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.035 0.213 0.049 0.311 0.287 0.083 0.179 0.109 0.06  0.141]
Ncnn Feature Map: 	[0.035 0.213 0.049 0.311 0.287 0.083 0.179 0.109 0.06  0.141]
==================================
Layer Name: depthwise_conv2d_8, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.010 ncnn->6.010 	Min: keras->-4.993 ncnn->-4.993
Mean: 	keras->-0.087 ncnn->-0.087 	Var: keras->0.608 ncnn->0.608
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.024 0.044 0.05  0.098 0.105 0.065 0.044 0.026 0.011 0.022]
Ncnn Feature Map: 	[0.024 0.044 0.05  0.098 0.105 0.065 0.044 0.026 0.011 0.022]
==================================
Layer Name: batch_normalization_24, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->10.067 ncnn->10.067 	Min: keras->-11.531 ncnn->-11.531
Mean: 	keras->-0.265 ncnn->-0.265 	Var: keras->1.153 ncnn->1.153
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.388 0.443 0.458 0.59  0.61  0.501 0.441 0.392 0.353 0.382]
Ncnn Feature Map: 	[0.388 0.443 0.458 0.59  0.61  0.501 0.441 0.392 0.353 0.382]
==================================
Layer Name: activation_17, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->10.067 ncnn->10.067 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.047 ncnn->0.047 	Var: keras->0.564 ncnn->0.564
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.219 0.254 0.264 0.353 0.367 0.292 0.253 0.222 0.197 0.215]
Ncnn Feature Map: 	[0.219 0.254 0.264 0.353 0.367 0.292 0.253 0.222 0.197 0.215]
activation_17_Split
==================================
Layer Name: global_average_pooling2d_6, Layer Shape: keras->(1, 144) ncnn->(1, 1, 144)
Max: 	keras->2.205 ncnn->2.205 	Min: keras->-0.340 ncnn->-0.340
Mean: 	keras->0.047 ncnn->0.047 	Var: keras->0.323 ncnn->0.323
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.095 -0.136  0.127 -0.124 -0.034 -0.042  0.629  0.487  1.073 -0.075]
Ncnn Feature Map: 	[-0.095 -0.136  0.127 -0.124 -0.034 -0.042  0.629  0.487  1.073 -0.075]
Top-k:
Keras Top-k: 	141:2.205, 70:1.097, 8:1.073, 111:0.933, 33:0.806
ncnn Top-k: 	141:2.205, 70:1.097, 8:1.073, 111:0.933, 33:0.806
==================================
Layer Name: dense_11, Layer Shape: keras->(1, 144) ncnn->(1, 1, 144)
Max: 	keras->2.647 ncnn->2.647 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.328 ncnn->0.328 	Var: keras->0.645 ncnn->0.645
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.621 0.254 0.    0.    0.155 0.    0.    0.    0.    0.   ]
Ncnn Feature Map: 	[0.621 0.254 0.    0.    0.155 0.    0.    0.    0.    0.   ]
Top-k:
Keras Top-k: 	119:2.647, 142:2.574, 16:2.534, 132:2.391, 106:2.380
ncnn Top-k: 	119:2.647, 142:2.574, 16:2.534, 132:2.391, 106:2.380
==================================
Layer Name: dense_12, Layer Shape: keras->(1, 144) ncnn->(1, 1, 144)
Max: 	keras->1.000 ncnn->8.605 	Min: keras->0.000 ncnn->-10.235
Mean: 	keras->0.530 ncnn->0.077 	Var: keras->0.415 ncnn->3.913
Cosine Similarity: 0.41166
Keras Feature Map: 	[0.33  0.423 1.    1.    0.    1.    1.    0.    0.796 0.271]
Ncnn Feature Map: 	[-0.85  -0.384  3.783  3.239 -2.865  2.806  6.584 -2.617  1.481 -1.147]
Top-k:
Keras Top-k: 	82:1.000, 120:1.000, 34:1.000, 117:1.000, 116:1.000
ncnn Top-k: 	21:8.605, 105:7.700, 117:7.567, 26:6.958, 80:6.822
dense_12_HardSigmoid
==================================
Layer Name: reshape_6, Layer Shape: keras->(1, 1, 1, 144) ncnn->(144, 1, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.530 ncnn->0.530 	Var: keras->0.415 ncnn->0.415
Cosine Similarity: -0.00000
Keras Feature Map: 	[0.33]
Ncnn Feature Map: 	[0.33]
==================================
Layer Name: multiply_6, Layer Shape: keras->(1, 10, 10, 144) ncnn->(144, 10, 10)
Max: 	keras->6.688 ncnn->6.688 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.032 ncnn->0.032 	Var: keras->0.350 ncnn->0.350
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.072 0.084 0.087 0.117 0.121 0.096 0.084 0.073 0.065 0.071]
Ncnn Feature Map: 	[0.072 0.084 0.087 0.117 0.121 0.096 0.084 0.073 0.065 0.071]
==================================
Layer Name: conv2d_17, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->4.038 ncnn->4.038 	Min: keras->-4.769 ncnn->-4.769
Mean: 	keras->-0.176 ncnn->-0.176 	Var: keras->0.900 ncnn->0.900
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.305  0.176 -0.256 -0.16  -0.497 -0.581 -0.501 -0.359 -0.058  0.113]
Ncnn Feature Map: 	[ 0.305  0.176 -0.256 -0.16  -0.497 -0.581 -0.501 -0.359 -0.058  0.113]
==================================
Layer Name: batch_normalization_25, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->3.686 ncnn->3.686 	Min: keras->-4.459 ncnn->-4.459
Mean: 	keras->-0.207 ncnn->-0.207 	Var: keras->1.059 ncnn->1.059
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.097 -0.21  -0.588 -0.504 -0.798 -0.872 -0.802 -0.677 -0.415 -0.265]
Ncnn Feature Map: 	[-0.097 -0.21  -0.588 -0.504 -0.798 -0.872 -0.802 -0.677 -0.415 -0.265]
==================================
Layer Name: add_4, Layer Shape: keras->(1, 10, 10, 48) ncnn->(48, 10, 10)
Max: 	keras->7.120 ncnn->7.120 	Min: keras->-6.300 ncnn->-6.300
Mean: 	keras->-0.476 ncnn->-0.476 	Var: keras->1.859 ncnn->1.859
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.942  0.481  0.235  0.102 -0.279 -0.503 -0.314 -0.245  0.093  0.356]
Ncnn Feature Map: 	[-0.942  0.481  0.235  0.102 -0.279 -0.503 -0.314 -0.245  0.093  0.356]
add_4_Split
==================================
Layer Name: conv2d_18, Layer Shape: keras->(1, 10, 10, 288) ncnn->(288, 10, 10)
Max: 	keras->12.497 ncnn->12.497 	Min: keras->-12.407 ncnn->-12.407
Mean: 	keras->0.207 ncnn->0.207 	Var: keras->3.007 ncnn->3.007
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.505  1.052  0.094  0.138  0.67  -0.059  0.083  0.104  0.547  0.749]
Ncnn Feature Map: 	[ 0.505  1.052  0.094  0.138  0.67  -0.059  0.083  0.104  0.547  0.749]
==================================
Layer Name: batch_normalization_26, Layer Shape: keras->(1, 10, 10, 288) ncnn->(288, 10, 10)
Max: 	keras->7.410 ncnn->7.410 	Min: keras->-6.151 ncnn->-6.151
Mean: 	keras->0.014 ncnn->0.014 	Var: keras->1.549 ncnn->1.549
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.204  0.422  0.039  0.057  0.27  -0.022  0.035  0.043  0.221  0.301]
Ncnn Feature Map: 	[ 0.204  0.422  0.039  0.057  0.27  -0.022  0.035  0.043  0.221  0.301]
==================================
Layer Name: activation_18, Layer Shape: keras->(1, 10, 10, 288) ncnn->(288, 10, 10)
Max: 	keras->7.410 ncnn->7.410 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.378 ncnn->0.378 	Var: keras->0.953 ncnn->0.953
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.109  0.241  0.02   0.029  0.147 -0.011  0.018  0.022  0.118  0.166]
Ncnn Feature Map: 	[ 0.109  0.241  0.02   0.029  0.147 -0.011  0.018  0.022  0.118  0.166]
==================================
Layer Name: depthwise_conv2d_9, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->6.978 ncnn->6.978 	Min: keras->-7.315 ncnn->-7.315
Mean: 	keras->-0.217 ncnn->-0.217 	Var: keras->1.075 ncnn->1.075
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.192 -0.179 -0.087 -0.08   0.14 ]
Ncnn Feature Map: 	[ 0.192 -0.179 -0.087 -0.08   0.14 ]
==================================
Layer Name: batch_normalization_27, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->7.376 ncnn->7.376 	Min: keras->-12.403 ncnn->-12.403
Mean: 	keras->-0.373 ncnn->-0.373 	Var: keras->1.500 ncnn->1.500
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.206 -0.723 -0.491 -0.475  0.076]
Ncnn Feature Map: 	[ 0.206 -0.723 -0.491 -0.475  0.076]
==================================
Layer Name: activation_19, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->7.376 ncnn->7.376 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.097 ncnn->0.097 	Var: keras->0.704 ncnn->0.704
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.11  -0.274 -0.205 -0.2    0.039]
Ncnn Feature Map: 	[ 0.11  -0.274 -0.205 -0.2    0.039]
activation_19_Split
==================================
Layer Name: global_average_pooling2d_7, Layer Shape: keras->(1, 288) ncnn->(1, 1, 288)
Max: 	keras->3.451 ncnn->3.451 	Min: keras->-0.361 ncnn->-0.361
Mean: 	keras->0.097 ncnn->0.097 	Var: keras->0.491 ncnn->0.491
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.059  0.506 -0.078  0.531 -0.091 -0.09   0.016 -0.289 -0.083 -0.189]
Ncnn Feature Map: 	[ 0.059  0.506 -0.078  0.531 -0.091 -0.09   0.016 -0.289 -0.083 -0.189]
Top-k:
Keras Top-k: 	87:3.451, 224:3.258, 81:2.399, 188:2.102, 91:2.073
ncnn Top-k: 	87:3.451, 224:3.258, 81:2.399, 188:2.102, 91:2.073
==================================
Layer Name: dense_13, Layer Shape: keras->(1, 288) ncnn->(1, 1, 288)
Max: 	keras->5.375 ncnn->5.375 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.633 ncnn->0.633 	Var: keras->1.280 ncnn->1.280
Cosine Similarity: 0.00000
Keras Feature Map: 	[5.129 0.    0.    0.    3.012 0.624 0.434 4.075 0.    0.033]
Ncnn Feature Map: 	[5.129 0.    0.    0.    3.012 0.624 0.434 4.075 0.    0.033]
Top-k:
Keras Top-k: 	265:5.375, 17:5.367, 107:5.279, 35:5.271, 0:5.129
ncnn Top-k: 	265:5.375, 17:5.367, 107:5.279, 35:5.271, 0:5.129
==================================
Layer Name: dense_14, Layer Shape: keras->(1, 288) ncnn->(1, 1, 288)
Max: 	keras->1.000 ncnn->27.330 	Min: keras->0.000 ncnn->-26.914
Mean: 	keras->0.448 ncnn->-1.352 	Var: keras->0.483 ncnn->13.048
Cosine Similarity: 0.41955
Keras Feature Map: 	[1. 0. 0. 1. 1. 0. 1. 0. 1. 1.]
Ncnn Feature Map: 	[  8.731  -5.449  -3.711  10.984  13.982 -11.052   5.529 -12.625  15.525
  19.873]
Top-k:
Keras Top-k: 	0:1.000, 95:1.000, 110:1.000, 108:1.000, 107:1.000
ncnn Top-k: 	28:27.330, 55:25.747, 189:24.945, 162:24.892, 149:24.634
dense_14_HardSigmoid
==================================
Layer Name: reshape_7, Layer Shape: keras->(1, 1, 1, 288) ncnn->(288, 1, 1)
Max: 	keras->1.000 ncnn->1.000 	Min: keras->0.000 ncnn->0.000
Mean: 	keras->0.448 ncnn->0.448 	Var: keras->0.483 ncnn->0.483
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.]
Ncnn Feature Map: 	[1.]
==================================
Layer Name: multiply_7, Layer Shape: keras->(1, 5, 5, 288) ncnn->(288, 5, 5)
Max: 	keras->7.376 ncnn->7.376 	Min: keras->-0.375 ncnn->-0.375
Mean: 	keras->0.055 ncnn->0.055 	Var: keras->0.475 ncnn->0.475
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.11  -0.274 -0.205 -0.2    0.039]
Ncnn Feature Map: 	[ 0.11  -0.274 -0.205 -0.2    0.039]
==================================
Layer Name: conv2d_19, Layer Shape: keras->(1, 5, 5, 96) ncnn->(96, 5, 5)
Max: 	keras->6.916 ncnn->6.916 	Min: keras->-7.068 ncnn->-7.068
Mean: 	keras->-0.109 ncnn->-0.109 	Var: keras->1.848 ncnn->1.848
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.74  -0.774 -1.223 -1.44  -1.283]
Ncnn Feature Map: 	[-0.74  -0.774 -1.223 -1.44  -1.283]
==================================
Layer Name: batch_normalization_28, Layer Shape: keras->(1, 5, 5, 96) ncnn->(96, 5, 5)
Max: 	keras->4.018 ncnn->4.018 	Min: keras->-4.341 ncnn->-4.341
Mean: 	keras->-0.049 ncnn->-0.049 	Var: keras->0.970 ncnn->0.970
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.08  -0.094 -0.283 -0.374 -0.308]
Ncnn Feature Map: 	[-0.08  -0.094 -0.283 -0.374 -0.308]
==================================
Layer Name: conv_pw_mobile_bottleneck, Layer Shape: keras->(1, 5, 5, 32) ncnn->(32, 5, 5)
Max: 	keras->11.915 ncnn->11.915 	Min: keras->-8.413 ncnn->-8.413
Mean: 	keras->0.622 ncnn->0.622 	Var: keras->3.560 ncnn->3.560
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.86  -4.268 -3.909 -3.923 -3.8  ]
Ncnn Feature Map: 	[-1.86  -4.268 -3.909 -3.923 -3.8  ]
==================================
Layer Name: up_sampling2d_1, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->11.915 ncnn->11.915 	Min: keras->-8.413 ncnn->-8.413
Mean: 	keras->0.622 ncnn->0.622 	Var: keras->3.560 ncnn->3.560
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.86  -1.86  -4.268 -4.268 -3.909 -3.909 -3.923 -3.923 -3.8   -3.8  ]
Ncnn Feature Map: 	[-1.86  -1.86  -4.268 -4.268 -3.909 -3.909 -3.923 -3.923 -3.8   -3.8  ]
==================================
Layer Name: conv_pw_upblock_4, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->21.250 ncnn->21.250 	Min: keras->-21.531 ncnn->-21.531
Mean: 	keras->0.468 ncnn->0.468 	Var: keras->5.509 ncnn->5.509
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.381 -6.175 -7.394 -7.318 -6.163 -4.364 -5.653 -5.272 -4.099 -3.886]
Ncnn Feature Map: 	[-1.381 -6.175 -7.394 -7.318 -6.163 -4.364 -5.653 -5.272 -4.099 -3.886]
==================================
Layer Name: add_5, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->30.777 ncnn->30.777 	Min: keras->-29.944 ncnn->-29.944
Mean: 	keras->1.090 ncnn->1.090 	Var: keras->8.069 ncnn->8.069
Cosine Similarity: 0.00000
Keras Feature Map: 	[ -3.241  -8.035 -11.661 -11.586 -10.072  -8.273  -9.576  -9.195  -7.899
  -7.686]
Ncnn Feature Map: 	[ -3.241  -8.035 -11.661 -11.586 -10.072  -8.273  -9.576  -9.195  -7.899
  -7.686]
sep_upblock_4b_dw
==================================
Layer Name: sep_upblock_4b, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->29.722 ncnn->29.722 	Min: keras->-32.109 ncnn->-32.109
Mean: 	keras->1.375 ncnn->1.375 	Var: keras->10.522 ncnn->10.522
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 2.262  0.189 -2.102 -0.867 -1.894 -1.515 -1.272 -0.599 -1.433  1.563]
Ncnn Feature Map: 	[ 2.262  0.189 -2.102 -0.867 -1.894 -1.515 -1.272 -0.599 -1.433  1.563]
==================================
Layer Name: batch_normalization_29, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->5.389 ncnn->5.389 	Min: keras->-3.505 ncnn->-3.505
Mean: 	keras->0.564 ncnn->0.564 	Var: keras->1.728 ncnn->1.728
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.52   0.117 -0.328 -0.088 -0.287 -0.214 -0.166 -0.036 -0.198  0.384]
Ncnn Feature Map: 	[ 0.52   0.117 -0.328 -0.088 -0.287 -0.214 -0.166 -0.036 -0.198  0.384]
==================================
Layer Name: leaky_re_lu_1, Layer Shape: keras->(1, 10, 10, 32) ncnn->(32, 10, 10)
Max: 	keras->5.389 ncnn->5.389 	Min: keras->-1.051 ncnn->-1.051
Mean: 	keras->0.888 ncnn->0.888 	Var: keras->1.363 ncnn->1.363
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.52   0.117 -0.098 -0.026 -0.086 -0.064 -0.05  -0.011 -0.059  0.384]
Ncnn Feature Map: 	[ 0.52   0.117 -0.098 -0.026 -0.086 -0.064 -0.05  -0.011 -0.059  0.384]
==================================
Layer Name: up_sampling2d_2, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->5.389 ncnn->5.389 	Min: keras->-1.051 ncnn->-1.051
Mean: 	keras->0.888 ncnn->0.888 	Var: keras->1.363 ncnn->1.363
Cosine Similarity: -0.00000
Keras Feature Map: 	[ 0.52   0.52   0.117  0.117 -0.098 -0.098 -0.026 -0.026 -0.086 -0.086]
Ncnn Feature Map: 	[ 0.52   0.52   0.117  0.117 -0.098 -0.098 -0.026 -0.026 -0.086 -0.086]
==================================
Layer Name: conv_pw_upblock_3, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->5.566 ncnn->5.566 	Min: keras->-3.908 ncnn->-3.908
Mean: 	keras->0.097 ncnn->0.097 	Var: keras->0.955 ncnn->0.955
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.962 -1.737 -1.694  0.301 -0.486 -0.831 -1.39  -1.486 -1.964 -1.47 ]
Ncnn Feature Map: 	[-0.962 -1.737 -1.694  0.301 -0.486 -0.831 -1.39  -1.486 -1.964 -1.47 ]
==================================
Layer Name: add_6, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->6.228 ncnn->6.228 	Min: keras->-3.372 ncnn->-3.372
Mean: 	keras->0.985 ncnn->0.985 	Var: keras->1.653 ncnn->1.653
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.442 -1.217 -1.576  0.418 -0.584 -0.93  -1.416 -1.512 -2.05  -1.556]
Ncnn Feature Map: 	[-0.442 -1.217 -1.576  0.418 -0.584 -0.93  -1.416 -1.512 -2.05  -1.556]
sep_upblock_3b_dw
==================================
Layer Name: sep_upblock_3b, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->9.169 ncnn->9.169 	Min: keras->-8.391 ncnn->-8.391
Mean: 	keras->-0.042 ncnn->-0.042 	Var: keras->2.755 ncnn->2.755
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.64  -1.078 -0.728 -0.442 -0.555 -0.791 -1.027 -0.981 -1.432 -1.23 ]
Ncnn Feature Map: 	[-0.64  -1.078 -0.728 -0.442 -0.555 -0.791 -1.027 -0.981 -1.432 -1.23 ]
==================================
Layer Name: batch_normalization_30, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->4.921 ncnn->4.921 	Min: keras->-3.521 ncnn->-3.521
Mean: 	keras->0.309 ncnn->0.309 	Var: keras->1.435 ncnn->1.435
Cosine Similarity: -0.00000
Keras Feature Map: 	[-0.427 -0.697 -0.481 -0.305 -0.374 -0.52  -0.666 -0.638 -0.916 -0.792]
Ncnn Feature Map: 	[-0.427 -0.697 -0.481 -0.305 -0.374 -0.52  -0.666 -0.638 -0.916 -0.792]
==================================
Layer Name: leaky_re_lu_2, Layer Shape: keras->(1, 20, 20, 32) ncnn->(32, 20, 20)
Max: 	keras->4.921 ncnn->4.921 	Min: keras->-1.056 ncnn->-1.056
Mean: 	keras->0.619 ncnn->0.619 	Var: keras->1.085 ncnn->1.085
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.128 -0.209 -0.144 -0.091 -0.112 -0.156 -0.2   -0.191 -0.275 -0.237]
Ncnn Feature Map: 	[-0.128 -0.209 -0.144 -0.091 -0.112 -0.156 -0.2   -0.191 -0.275 -0.237]
==================================
Layer Name: up_sampling2d_3, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->4.921 ncnn->4.921 	Min: keras->-1.056 ncnn->-1.056
Mean: 	keras->0.619 ncnn->0.619 	Var: keras->1.085 ncnn->1.085
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.128 -0.128 -0.209 -0.209 -0.144 -0.144 -0.091 -0.091 -0.112 -0.112]
Ncnn Feature Map: 	[-0.128 -0.128 -0.209 -0.209 -0.144 -0.144 -0.091 -0.091 -0.112 -0.112]
==================================
Layer Name: conv_pw_upblock_2, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->3.356 ncnn->3.356 	Min: keras->-3.982 ncnn->-3.982
Mean: 	keras->-0.063 ncnn->-0.063 	Var: keras->0.891 ncnn->0.891
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.227 -1.509 -1.472 -0.858 -2.254 -2.312 -0.813 -2.581 -1.463 -0.964]
Ncnn Feature Map: 	[-1.227 -1.509 -1.472 -0.858 -2.254 -2.312 -0.813 -2.581 -1.463 -0.964]
==================================
Layer Name: add_7, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->5.888 ncnn->5.888 	Min: keras->-3.403 ncnn->-3.403
Mean: 	keras->0.557 ncnn->0.557 	Var: keras->1.405 ncnn->1.405
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.355 -1.637 -1.682 -1.067 -2.398 -2.457 -0.905 -2.672 -1.575 -1.076]
Ncnn Feature Map: 	[-1.355 -1.637 -1.682 -1.067 -2.398 -2.457 -0.905 -2.672 -1.575 -1.076]
sep_upblock_2b_dw
==================================
Layer Name: sep_upblock_2b, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->9.756 ncnn->9.756 	Min: keras->-10.530 ncnn->-10.530
Mean: 	keras->-0.585 ncnn->-0.585 	Var: keras->2.981 ncnn->2.981
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.937 0.676 0.4   0.494 0.027 0.212 0.549 0.259 0.569 0.098]
Ncnn Feature Map: 	[0.937 0.676 0.4   0.494 0.027 0.212 0.549 0.259 0.569 0.098]
==================================
Layer Name: batch_normalization_31, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->4.175 ncnn->4.175 	Min: keras->-3.787 ncnn->-3.787
Mean: 	keras->-0.035 ncnn->-0.035 	Var: keras->1.239 ncnn->1.239
Cosine Similarity: -0.00000
Keras Feature Map: 	[1.164 1.046 0.921 0.964 0.753 0.836 0.989 0.857 0.998 0.784]
Ncnn Feature Map: 	[1.164 1.046 0.921 0.964 0.753 0.836 0.989 0.857 0.998 0.784]
==================================
Layer Name: leaky_re_lu_3, Layer Shape: keras->(1, 40, 40, 32) ncnn->(32, 40, 40)
Max: 	keras->4.175 ncnn->4.175 	Min: keras->-1.136 ncnn->-1.136
Mean: 	keras->0.337 ncnn->0.337 	Var: keras->0.877 ncnn->0.877
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.164 1.046 0.921 0.964 0.753 0.836 0.989 0.857 0.998 0.784]
Ncnn Feature Map: 	[1.164 1.046 0.921 0.964 0.753 0.836 0.989 0.857 0.998 0.784]
==================================
Layer Name: up_sampling2d_4, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->4.175 ncnn->4.175 	Min: keras->-1.136 ncnn->-1.136
Mean: 	keras->0.337 ncnn->0.337 	Var: keras->0.877 ncnn->0.877
Cosine Similarity: 0.00000
Keras Feature Map: 	[1.164 1.164 1.046 1.046 0.921 0.921 0.964 0.964 0.753 0.753]
Ncnn Feature Map: 	[1.164 1.164 1.046 1.046 0.921 0.921 0.964 0.964 0.753 0.753]
==================================
Layer Name: conv_pw_upblock_1, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->7.299 ncnn->7.299 	Min: keras->-9.117 ncnn->-9.117
Mean: 	keras->-0.319 ncnn->-0.319 	Var: keras->1.322 ncnn->1.322
Cosine Similarity: 0.00000
Keras Feature Map: 	[-1.013 -3.41  -1.978 -1.793 -1.022 -1.154 -1.294 -2.431 -1.356 -3.239]
Ncnn Feature Map: 	[-1.013 -3.41  -1.978 -1.793 -1.022 -1.154 -1.294 -2.431 -1.356 -3.239]
==================================
Layer Name: add_8, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->9.493 ncnn->9.493 	Min: keras->-8.234 ncnn->-8.234
Mean: 	keras->0.017 ncnn->0.017 	Var: keras->1.594 ncnn->1.594
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.151 -2.246 -0.932 -0.747 -0.101 -0.233 -0.331 -1.467 -0.604 -2.486]
Ncnn Feature Map: 	[ 0.151 -2.246 -0.932 -0.747 -0.101 -0.233 -0.331 -1.467 -0.604 -2.486]
sep_upblock_1b_dw
==================================
Layer Name: sep_upblock_1b, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->6.637 ncnn->6.637 	Min: keras->-10.818 ncnn->-10.818
Mean: 	keras->-2.007 ncnn->-2.007 	Var: keras->2.186 ncnn->2.186
Cosine Similarity: 0.00000
Keras Feature Map: 	[-0.277 -2.182 -2.387 -1.413 -0.582 -0.511 -0.805 -1.044 -0.955 -1.492]
Ncnn Feature Map: 	[-0.277 -2.182 -2.387 -1.413 -0.582 -0.511 -0.805 -1.044 -0.955 -1.492]
==================================
Layer Name: batch_normalization_32, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->3.452 ncnn->3.452 	Min: keras->-5.299 ncnn->-5.299
Mean: 	keras->-0.513 ncnn->-0.513 	Var: keras->1.019 ncnn->1.019
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.19  -0.831 -0.941 -0.419  0.027  0.064 -0.093 -0.221 -0.173 -0.461]
Ncnn Feature Map: 	[ 0.19  -0.831 -0.941 -0.419  0.027  0.064 -0.093 -0.221 -0.173 -0.461]
==================================
Layer Name: leaky_re_lu_4, Layer Shape: keras->(1, 80, 80, 32) ncnn->(32, 80, 80)
Max: 	keras->3.452 ncnn->3.452 	Min: keras->-1.590 ncnn->-1.590
Mean: 	keras->-0.022 ncnn->-0.022 	Var: keras->0.543 ncnn->0.543
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.19  -0.249 -0.282 -0.126  0.027  0.064 -0.028 -0.066 -0.052 -0.138]
Ncnn Feature Map: 	[ 0.19  -0.249 -0.282 -0.126  0.027  0.064 -0.028 -0.066 -0.052 -0.138]
==================================
Layer Name: up_sampling2d_5, Layer Shape: keras->(1, 160, 160, 32) ncnn->(32, 160, 160)
Max: 	keras->3.452 ncnn->3.452 	Min: keras->-1.590 ncnn->-1.590
Mean: 	keras->-0.022 ncnn->-0.022 	Var: keras->0.543 ncnn->0.543
Cosine Similarity: 0.00000
Keras Feature Map: 	[ 0.19   0.19  -0.249 -0.249 -0.282 -0.282 -0.126 -0.126  0.027  0.027]
Ncnn Feature Map: 	[ 0.19   0.19  -0.249 -0.249 -0.282 -0.282 -0.126 -0.126  0.027  0.027]
==================================
Layer Name: final_layer, Layer Shape: keras->(1, 160, 160, 2) ncnn->(2, 160, 160)
Max: 	keras->0.998 ncnn->3.824 	Min: keras->0.002 ncnn->-2.967
Mean: 	keras->0.500 ncnn->0.272 	Var: keras->0.369 ncnn->1.321
Cosine Similarity: 0.28580
Keras Feature Map: 	[0.029 0.029 0.103 0.103 0.113 0.113 0.181 0.181 0.209 0.209]
Ncnn Feature Map: 	[-0.974 -0.974 -0.558 -0.558 -0.359 -0.359 -0.101 -0.101 -0.062 -0.062]
final_layer_Softmax
==================================
Layer Name: final_layer_Softmax, Layer Shape: keras->(1, 160, 160, 2) ncnn->(2, 160, 160)
Max: 	keras->0.998 ncnn->0.998 	Min: keras->0.002 ncnn->0.002
Mean: 	keras->0.500 ncnn->0.500 	Var: keras->0.369 ncnn->0.369
Cosine Similarity: 0.00000
Keras Feature Map: 	[0.029 0.029 0.103 0.103 0.113 0.113 0.181 0.181 0.209 0.209]
Ncnn Feature Map: 	[0.029 0.029 0.103 0.103 0.113 0.113 0.181 0.181 0.209 0.209]
Done!
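The large mismatches at dense_10, dense_12, dense_14, and final_layer appear to be comparison artifacts rather than real errors: the ncnn blob is captured before the split-out HardSigmoid / Softmax layer, while the Keras feature map is post-activation (the following reshape / final_layer_Softmax layers match exactly). A quick sanity check on the dense_10 numbers, assuming Keras's default hard sigmoid, `clip(0.2*x + 0.5, 0, 1)`:

```python
import numpy as np

# ncnn blob for dense_10, captured BEFORE the split-out HardSigmoid layer
ncnn_pre = np.array([1.495, 2.01, 1.425, 4.528, 1.848,
                     4.502, -1.271, 1.913, -1.261, -2.296])
# Keras feature map for the same layer (already post-activation)
keras_post = np.array([0.799, 0.902, 0.785, 1.0, 0.87,
                       1.0, 0.246, 0.883, 0.248, 0.041])

def hard_sigmoid_keras(x):
    # Keras's hard_sigmoid: linear between -2.5 and +2.5, i.e. clip(0.2*x + 0.5, 0, 1)
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

# Applying the Keras hard sigmoid to the pre-activation ncnn values
# recovers the Keras feature map (to log precision)
print(np.allclose(hard_sigmoid_keras(ncnn_pre), keras_post, atol=1e-3))
```

So the hard-sigmoid math itself checks out at these layers; the tool is simply comparing at different points in the fused graph.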
After running inference with the ncnn model, the output mask comes out completely black, same problem as before.
Mine is a hair segmentation model. Could you try a portrait image as input and check the result?
Uh, OK, let me take a look.
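Since final_layer_Softmax matches between Keras and ncnn in the log above, an all-black mask may point at post-processing rather than the conversion. One hypothetical sketch of mask extraction for the (2, 160, 160) softmax output, assuming channel 1 is the foreground (hair) probability (the channel order and threshold here are assumptions, not taken from the model):

```python
import numpy as np

def softmax_to_mask(feat, fg_channel=1, threshold=0.5):
    """Turn a (C, H, W) softmax output into a uint8 mask.

    If the mask is all black, the wrong channel may have been picked:
    with 2-class softmax, channels 0 and 1 are complementary, so
    thresholding the background channel inverts the result.
    """
    return (feat[fg_channel] > threshold).astype(np.uint8) * 255

# Synthetic example: uniform 0.7 foreground probability everywhere
feat = np.zeros((2, 160, 160), dtype=np.float32)
feat[0], feat[1] = 0.3, 0.7
mask = softmax_to_mask(feat)               # all-white mask
wrong = softmax_to_mask(feat, fg_channel=0)  # all-black mask, same data
```

Checking which channel is thresholded (and whether ncnn's CHW layout is being indexed as HWC) would be the first thing to rule out.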