
TensorFlow OpenPose model fails to convert to Caffe

iChiaGuo opened this issue 6 years ago • 5 comments

Platform (like ubuntu 16.04/win10): Ubuntu 16.04

Python version: 2.7

Source framework with version (like Tensorflow 1.4.1 with GPU): TensorFlow 1.8

Destination framework with version (like CNTK 2.3 with GPU): Caffe

Pre-trained model path (webpath or webdisk path): http://www.mediafire.com/file/1pyjsjl0p93x27c/graph_freeze.pb

### this is the net ###

Running script: mmconvert -sf tensorflow -iw graph_freeze.pb --inNodeName image --inputShape 368,432,3 --dstNodeName Openpose/concat_stage7 -df caffe -om tf_openpose

The conversion fails with: Check failed: top_shape[j] == bottom[i]->shape(j) (46 vs. 45) All inputs must have the same shape, except at concat_axis. However, running inference with this same graph_freeze.pb directly in TensorFlow produces correct results.
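A likely cause (not confirmed in this thread, just a sketch of the arithmetic): TensorFlow's SAME padding always rounds the output size up, while the converted Caffe net uses explicit symmetric pads, and Caffe's Pooling layer rounds up where its Convolution layer rounds down. With the odd intermediate sizes produced by the 368x432 input, the Conv2d_3_pool branch and the Conv2d_4 depthwise branch feeding feat_concat end up one pixel apart, matching the "46 vs. 45" in the error. The helper names below are my own, using Caffe's standard output-size formulas:

```python
import math

def tf_same(size, stride):
    # TensorFlow 'SAME' padding: output = ceil(size / stride), regardless of kernel
    return math.ceil(size / stride)

def caffe_conv(size, k, s, p):
    # Caffe Convolution: output = floor((size + 2p - k) / s) + 1
    return (size + 2 * p - k) // s + 1

def caffe_pool(size, k, s, p):
    # Caffe Pooling rounds UP, unlike Convolution
    return math.ceil((size + 2 * p - k) / s) + 1

h = 368
h = caffe_conv(h, 3, 2, 0)     # MobilenetV1_Conv2d_0, stride 2 -> 183 (TF SAME gives 184)
h = caffe_conv(h, 3, 2, 0)     # Conv2d_2 depthwise, stride 2   -> 91  (TF SAME gives 92)
pool = caffe_pool(h, 2, 2, 0)  # Conv2d_3_pool branch           -> 46
conv = caffe_conv(h, 3, 2, 0)  # Conv2d_4 depthwise branch      -> 45
print(pool, conv)              # 46 45 -> Concat shape check fails in Caffe
```

In TensorFlow both branches start from 92 and round up to 46, so the graph runs fine there; the mismatch only appears after conversion. A common workaround is to pick an input shape whose spatial dimensions stay divisible by the total stride (e.g. multiples of 16), so both rounding conventions agree.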

log as follow: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA IR network structure is saved as [8c581b98527a41608f660bc97248d906.json]. IR network structure is saved as [8c581b98527a41608f660bc97248d906.pb]. IR weights are saved as [8c581b98527a41608f660bc97248d906.npy]. Parse file [8c581b98527a41608f660bc97248d906.pb] with binary format successfully. Target network code snippet is saved as [8c581b98527a41608f660bc97248d906.py]. Target weights are saved as [8c581b98527a41608f660bc97248d906.npy]. WARNING: Logging before InitGoogleLogging() is written to STDERR I0726 16:15:38.266441 10279 net.cpp:58] Initializing net from parameters: state { phase: TRAIN level: 0 } layer { name: "Placeholder" type: "Input" top: "Placeholder" input_param { shape { dim: 1 dim: 3 dim: 368 dim: 432 } } } layer { name: "MobilenetV1_Conv2d_0_Conv2D" type: "Convolution" bottom: "Placeholder" top: "MobilenetV1_Conv2d_0_Conv2D" convolution_param { num_output: 24 bias_term: false group: 1 stride: 2 pad_h: 0 pad_w: 0 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_0_Conv2D" top: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_0_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_1_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_1_depthwise_depthwise" convolution_param { num_output: 24 bias_term: false 
group: 24 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_1_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_1_depthwise_depthwise" top: "MobilenetV1_Conv2d_1_pointwise_Conv2D" convolution_param { num_output: 48 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_1_pointwise_Conv2D" top: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_1_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_2_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_2_depthwise_depthwise" convolution_param { num_output: 48 bias_term: false group: 48 stride: 2 pad_h: 0 pad_w: 0 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_2_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_2_depthwise_depthwise" top: "MobilenetV1_Conv2d_2_pointwise_Conv2D" convolution_param { num_output: 96 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_2_pointwise_Conv2D" top: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale" type: 
"Scale" bottom: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_2_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_3_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_3_depthwise_depthwise" convolution_param { num_output: 96 bias_term: false group: 96 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_3_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_3_depthwise_depthwise" top: "MobilenetV1_Conv2d_3_pointwise_Conv2D" convolution_param { num_output: 96 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_3_pointwise_Conv2D" top: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_3_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Conv2d_3_pool" type: "Pooling" bottom: "MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" top: "Conv2d_3_pool" pooling_param { pool: MAX kernel_size: 2 stride: 2 pad_h: 0 pad_w: 0 } } layer { name: "MobilenetV1_Conv2d_4_depthwise_depthwise" type: "Convolution" bottom: 
"MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_4_depthwise_depthwise" convolution_param { num_output: 96 bias_term: false group: 96 stride: 2 pad_h: 0 pad_w: 0 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_4_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_4_depthwise_depthwise" top: "MobilenetV1_Conv2d_4_pointwise_Conv2D" convolution_param { num_output: 192 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_4_pointwise_Conv2D" top: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_4_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_5_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_5_depthwise_depthwise" convolution_param { num_output: 192 bias_term: false group: 192 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_5_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_5_depthwise_depthwise" top: "MobilenetV1_Conv2d_5_pointwise_Conv2D" convolution_param { num_output: 192 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_5_pointwise_Conv2D" top: 
"MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_5_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_6_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_6_depthwise_depthwise" convolution_param { num_output: 192 bias_term: false group: 192 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_6_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_6_depthwise_depthwise" top: "MobilenetV1_Conv2d_6_pointwise_Conv2D" convolution_param { num_output: 384 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_6_pointwise_Conv2D" top: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_6_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_7_depthwise_depthwise" type: "Convolution" bottom: 
"MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_7_depthwise_depthwise" convolution_param { num_output: 384 bias_term: false group: 384 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_7_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_7_depthwise_depthwise" top: "MobilenetV1_Conv2d_7_pointwise_Conv2D" convolution_param { num_output: 384 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_7_pointwise_Conv2D" top: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_7_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_8_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_8_depthwise_depthwise" convolution_param { num_output: 384 bias_term: false group: 384 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_8_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_8_depthwise_depthwise" top: "MobilenetV1_Conv2d_8_pointwise_Conv2D" convolution_param { num_output: 384 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_8_pointwise_Conv2D" top: 
"MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_8_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_9_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_9_depthwise_depthwise" convolution_param { num_output: 384 bias_term: false group: 384 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_9_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_9_depthwise_depthwise" top: "MobilenetV1_Conv2d_9_pointwise_Conv2D" convolution_param { num_output: 384 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_9_pointwise_Conv2D" top: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_9_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_10_depthwise_depthwise" type: "Convolution" bottom: 
"MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_10_depthwise_depthwise" convolution_param { num_output: 384 bias_term: false group: 384 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_10_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_10_depthwise_depthwise" top: "MobilenetV1_Conv2d_10_pointwise_Conv2D" convolution_param { num_output: 384 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_10_pointwise_Conv2D" top: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_10_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "MobilenetV1_Conv2d_11_depthwise_depthwise" type: "Convolution" bottom: "MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_11_depthwise_depthwise" convolution_param { num_output: 384 bias_term: false group: 384 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "MobilenetV1_Conv2d_11_pointwise_Conv2D" type: "Convolution" bottom: "MobilenetV1_Conv2d_11_depthwise_depthwise" top: "MobilenetV1_Conv2d_11_pointwise_Conv2D" convolution_param { num_output: 384 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "MobilenetV1_Conv2d_11_pointwise_Conv2D" top: 
"MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "MobilenetV1_Conv2d_11_pointwise_Relu" type: "ReLU" bottom: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" top: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "feat_concat" type: "Concat" bottom: "Conv2d_3_pool" bottom: "MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm" bottom: "MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm" top: "feat_concat" concat_param { axis: 1 } } layer { name: "Openpose_MConv_Stage1_L2_1_depthwise_depthwise" type: "Convolution" bottom: "feat_concat" top: "Openpose_MConv_Stage1_L2_1_depthwise_depthwise" convolution_param { num_output: 864 bias_term: false group: 864 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "Openpose_MConv_Stage1_L1_1_depthwise_depthwise" type: "Convolution" bottom: "feat_concat" top: "Openpose_MConv_Stage1_L1_1_depthwise_depthwise" convolution_param { num_output: 864 bias_term: false group: 864 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "Openpose_MConv_Stage1_L2_1_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_1_depthwise_depthwise" top: "Openpose_MConv_Stage1_L2_1_pointwise_Conv2D" convolution_param { num_output: 64 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L1_1_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_1_depthwise_depthwise" top: "Openpose_MConv_Stage1_L1_1_pointwise_Conv2D" convolution_param { num_output: 64 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { 
name: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L2_1_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L1_1_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L2_1_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L1_1_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L2_2_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_1_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_2_depthwise_depthwise" convolution_param { num_output: 64 bias_term: false group: 64 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "Openpose_MConv_Stage1_L1_2_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_1_pointwise_BatchNorm_FusedBatchNorm" 
top: "Openpose_MConv_Stage1_L1_2_depthwise_depthwise" convolution_param { num_output: 64 bias_term: false group: 64 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "Openpose_MConv_Stage1_L2_2_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_2_depthwise_depthwise" top: "Openpose_MConv_Stage1_L2_2_pointwise_Conv2D" convolution_param { num_output: 64 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L1_2_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_2_depthwise_depthwise" top: "Openpose_MConv_Stage1_L1_2_pointwise_Conv2D" convolution_param { num_output: 64 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L2_2_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L1_2_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L2_2_pointwise_Relu" type: "ReLU" bottom: 
"Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L1_2_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L2_3_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_2_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_3_depthwise_depthwise" convolution_param { num_output: 64 bias_term: false group: 64 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "Openpose_MConv_Stage1_L1_3_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_2_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_3_depthwise_depthwise" convolution_param { num_output: 64 bias_term: false group: 64 stride: 1 pad_h: 1 pad_w: 1 kernel_h: 3 kernel_w: 3 } } layer { name: "Openpose_MConv_Stage1_L2_3_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_3_depthwise_depthwise" top: "Openpose_MConv_Stage1_L2_3_pointwise_Conv2D" convolution_param { num_output: 64 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L1_3_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_3_depthwise_depthwise" top: "Openpose_MConv_Stage1_L1_3_pointwise_Conv2D" convolution_param { num_output: 64 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L2_3_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: 
"Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L1_3_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L2_3_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L1_3_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L2_4_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_3_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_4_depthwise_depthwise" convolution_param { num_output: 64 bias_term: false group: 64 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L1_4_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_3_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_4_depthwise_depthwise" convolution_param { num_output: 64 bias_term: false group: 64 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L2_4_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_4_depthwise_depthwise" top: "Openpose_MConv_Stage1_L2_4_pointwise_Conv2D" 
convolution_param { num_output: 256 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L1_4_pointwise_Conv2D" type: "Convolution" bottom: "Openpose_MConv_Stage1_L1_4_depthwise_depthwise" top: "Openpose_MConv_Stage1_L1_4_pointwise_Conv2D" convolution_param { num_output: 256 bias_term: false group: 1 stride: 1 pad_h: 0 pad_w: 0 kernel_h: 1 kernel_w: 1 } } layer { name: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L2_4_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm" type: "BatchNorm" bottom: "Openpose_MConv_Stage1_L1_4_pointwise_Conv2D" top: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm" batch_norm_param { use_global_stats: true eps: 0.001 } } layer { name: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm_scale" type: "Scale" bottom: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm" scale_param { bias_term: true } } layer { name: "Openpose_MConv_Stage1_L2_4_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" } layer { name: "Openpose_MConv_Stage1_L1_4_pointwise_Relu" type: "ReLU" bottom: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L1_4_pointwise_BatchNorm_FusedBatchNorm" } layer { name: 
"Openpose_MConv_Stage1_L2_5_depthwise_depthwise" type: "Convolution" bottom: "Openpose_MConv_Stage1_L2_4_pointwise_BatchNorm_FusedBatchNorm" top: "Openpose_MConv_Stage1_L2_5_depthwise_depthwise" convolution_param { num_output: 256 bias_term: false group: 256 stride
I0726 16:15:38.268064 10279 layer_factory.hpp:77] Creating layer Placeholder
I0726 16:15:38.268103 10279 net.cpp:100] Creating Layer Placeholder
I0726 16:15:38.268113 10279 net.cpp:408] Placeholder -> Placeholder
I0726 16:15:38.268151 10279 net.cpp:150] Setting up Placeholder
I0726 16:15:38.268162 10279 net.cpp:157] Top shape: 1 3 368 432 (476928)
I0726 16:15:38.268167 10279 net.cpp:165] Memory required for data: 1907712
I0726 16:15:38.268172 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_0_Conv2D
I0726 16:15:38.268182 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_0_Conv2D
I0726 16:15:38.268187 10279 net.cpp:434] MobilenetV1_Conv2d_0_Conv2D <- Placeholder
I0726 16:15:38.268194 10279 net.cpp:408] MobilenetV1_Conv2d_0_Conv2D -> MobilenetV1_Conv2d_0_Conv2D
I0726 16:15:38.268225 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_0_Conv2D
I0726 16:15:38.268234 10279 net.cpp:157] Top shape: 1 24 183 215 (944280)
I0726 16:15:38.268239 10279 net.cpp:165] Memory required for data: 5684832
I0726 16:15:38.268247 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268257 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268262 10279 net.cpp:434] MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_0_Conv2D
I0726 16:15:38.268268 10279 net.cpp:408] MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268313 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268324 10279 net.cpp:157] Top shape: 1 24 183 215 (944280)
I0726 16:15:38.268328 10279 net.cpp:165] Memory required for data: 9461952
I0726 16:15:38.268339 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268348 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268353 10279 net.cpp:434] MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268360 10279 net.cpp:395] MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.268373 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268448 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268460 10279 net.cpp:157] Top shape: 1 24 183 215 (944280)
I0726 16:15:38.268465 10279 net.cpp:165] Memory required for data: 13239072
I0726 16:15:38.268473 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_0_Relu
I0726 16:15:38.268482 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_0_Relu
I0726 16:15:38.268487 10279 net.cpp:434] MobilenetV1_Conv2d_0_Relu <- MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268493 10279 net.cpp:395] MobilenetV1_Conv2d_0_Relu -> MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.268501 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_0_Relu
I0726 16:15:38.268507 10279 net.cpp:157] Top shape: 1 24 183 215 (944280)
I0726 16:15:38.268512 10279 net.cpp:165] Memory required for data: 17016192
I0726 16:15:38.268517 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_1_depthwise_depthwise
I0726 16:15:38.268525 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_1_depthwise_depthwise
I0726 16:15:38.268530 10279 net.cpp:434] MobilenetV1_Conv2d_1_depthwise_depthwise <- MobilenetV1_Conv2d_0_BatchNorm_FusedBatchNorm
I0726 16:15:38.268537 10279 net.cpp:408] MobilenetV1_Conv2d_1_depthwise_depthwise -> MobilenetV1_Conv2d_1_depthwise_depthwise
I0726 16:15:38.268553 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_1_depthwise_depthwise
I0726 16:15:38.268559 10279 net.cpp:157] Top shape: 1 24 183 215 (944280)
I0726 16:15:38.268564 10279 net.cpp:165] Memory required for data: 20793312
I0726 16:15:38.268570 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_1_pointwise_Conv2D
I0726 16:15:38.268577 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_1_pointwise_Conv2D
I0726 16:15:38.268582 10279 net.cpp:434] MobilenetV1_Conv2d_1_pointwise_Conv2D <- MobilenetV1_Conv2d_1_depthwise_depthwise
I0726 16:15:38.268587 10279 net.cpp:408] MobilenetV1_Conv2d_1_pointwise_Conv2D -> MobilenetV1_Conv2d_1_pointwise_Conv2D
I0726 16:15:38.268602 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_1_pointwise_Conv2D
I0726 16:15:38.268609 10279 net.cpp:157] Top shape: 1 48 183 215 (1888560)
I0726 16:15:38.268612 10279 net.cpp:165] Memory required for data: 28347552
I0726 16:15:38.268618 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268625 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268630 10279 net.cpp:434] MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_1_pointwise_Conv2D
I0726 16:15:38.268636 10279 net.cpp:408] MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268682 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268689 10279 net.cpp:157] Top shape: 1 48 183 215 (1888560)
I0726 16:15:38.268692 10279 net.cpp:165] Memory required for data: 35901792
I0726 16:15:38.268700 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268707 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268712 10279 net.cpp:434] MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268716 10279 net.cpp:395] MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.268725 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268775 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.268782 10279 net.cpp:157] Top shape: 1 48 183 215 (1888560)
I0726 16:15:38.268786 10279 net.cpp:165] Memory required for data: 43456032
I0726 16:15:38.268791 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_1_pointwise_Relu
I0726 16:15:38.268797 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_1_pointwise_Relu
I0726 16:15:38.268801 10279 net.cpp:434] MobilenetV1_Conv2d_1_pointwise_Relu <- MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268805 10279 net.cpp:395] MobilenetV1_Conv2d_1_pointwise_Relu -> MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.268822 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_1_pointwise_Relu
I0726 16:15:38.268828 10279 net.cpp:157] Top shape: 1 48 183 215 (1888560)
I0726 16:15:38.268832 10279 net.cpp:165] Memory required for data: 51010272
I0726 16:15:38.268836 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_2_depthwise_depthwise
I0726 16:15:38.268843 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_2_depthwise_depthwise
I0726 16:15:38.268848 10279 net.cpp:434] MobilenetV1_Conv2d_2_depthwise_depthwise <- MobilenetV1_Conv2d_1_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268854 10279 net.cpp:408] MobilenetV1_Conv2d_2_depthwise_depthwise -> MobilenetV1_Conv2d_2_depthwise_depthwise
I0726 16:15:38.268870 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_2_depthwise_depthwise
I0726 16:15:38.268877 10279 net.cpp:157] Top shape: 1 48 91 107 (467376)
I0726 16:15:38.268882 10279 net.cpp:165] Memory required for data: 52879776
I0726 16:15:38.268887 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_2_pointwise_Conv2D
I0726 16:15:38.268893 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_2_pointwise_Conv2D
I0726 16:15:38.268898 10279 net.cpp:434] MobilenetV1_Conv2d_2_pointwise_Conv2D <- MobilenetV1_Conv2d_2_depthwise_depthwise
I0726 16:15:38.268904 10279 net.cpp:408] MobilenetV1_Conv2d_2_pointwise_Conv2D -> MobilenetV1_Conv2d_2_pointwise_Conv2D
I0726 16:15:38.268919 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_2_pointwise_Conv2D
I0726 16:15:38.268926 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.268930 10279 net.cpp:165] Memory required for data: 56618784
I0726 16:15:38.268936 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268944 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268949 10279 net.cpp:434] MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_2_pointwise_Conv2D
I0726 16:15:38.268954 10279 net.cpp:408] MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268977 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.268985 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.268990 10279 net.cpp:165] Memory required for data: 60357792
I0726 16:15:38.269001 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269008 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269013 10279 net.cpp:434] MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269019 10279 net.cpp:395] MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.269031 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269057 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269065 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269069 10279 net.cpp:165] Memory required for data: 64096800
I0726 16:15:38.269076 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_2_pointwise_Relu
I0726 16:15:38.269083 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_2_pointwise_Relu
I0726 16:15:38.269088 10279 net.cpp:434] MobilenetV1_Conv2d_2_pointwise_Relu <- MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269094 10279 net.cpp:395] MobilenetV1_Conv2d_2_pointwise_Relu -> MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.269100 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_2_pointwise_Relu
I0726 16:15:38.269106 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269110 10279 net.cpp:165] Memory required for data: 67835808
I0726 16:15:38.269114 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_depthwise_depthwise
I0726 16:15:38.269122 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_3_depthwise_depthwise
I0726 16:15:38.269126 10279 net.cpp:434] MobilenetV1_Conv2d_3_depthwise_depthwise <- MobilenetV1_Conv2d_2_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269132 10279 net.cpp:408] MobilenetV1_Conv2d_3_depthwise_depthwise -> MobilenetV1_Conv2d_3_depthwise_depthwise
I0726 16:15:38.269147 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_3_depthwise_depthwise
I0726 16:15:38.269155 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269158 10279 net.cpp:165] Memory required for data: 71574816
I0726 16:15:38.269165 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_pointwise_Conv2D
I0726 16:15:38.269174 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_3_pointwise_Conv2D
I0726 16:15:38.269179 10279 net.cpp:434] MobilenetV1_Conv2d_3_pointwise_Conv2D <- MobilenetV1_Conv2d_3_depthwise_depthwise
I0726 16:15:38.269186 10279 net.cpp:408] MobilenetV1_Conv2d_3_pointwise_Conv2D -> MobilenetV1_Conv2d_3_pointwise_Conv2D
I0726 16:15:38.269206 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_3_pointwise_Conv2D
I0726 16:15:38.269214 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269218 10279 net.cpp:165] Memory required for data: 75313824
I0726 16:15:38.269224 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269232 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269237 10279 net.cpp:434] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_3_pointwise_Conv2D
I0726 16:15:38.269243 10279 net.cpp:408] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269263 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269271 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269275 10279 net.cpp:165] Memory required for data: 79052832
I0726 16:15:38.269284 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269290 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269295 10279 net.cpp:434] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269301 10279 net.cpp:395] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.269312 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269338 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.269347 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269351 10279 net.cpp:165] Memory required for data: 82791840
I0726 16:15:38.269358 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_pointwise_Relu
I0726 16:15:38.269366 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_3_pointwise_Relu
I0726 16:15:38.269371 10279 net.cpp:434] MobilenetV1_Conv2d_3_pointwise_Relu <- MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269376 10279 net.cpp:395] MobilenetV1_Conv2d_3_pointwise_Relu -> MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.269383 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_3_pointwise_Relu
I0726 16:15:38.269389 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269393 10279 net.cpp:165] Memory required for data: 86530848
I0726 16:15:38.269397 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split
I0726 16:15:38.269405 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split
I0726 16:15:38.269410 10279 net.cpp:434] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split <- MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.269417 10279 net.cpp:408] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split -> MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split_0
I0726 16:15:38.269425 10279 net.cpp:408] MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split -> MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split_1
I0726 16:15:38.269435 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split
I0726 16:15:38.269441 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269446 10279 net.cpp:157] Top shape: 1 96 91 107 (934752)
I0726 16:15:38.269450 10279 net.cpp:165] Memory required for data: 94008864
I0726 16:15:38.269454 10279 layer_factory.hpp:77] Creating layer Conv2d_3_pool
I0726 16:15:38.269462 10279 net.cpp:100] Creating Layer Conv2d_3_pool
I0726 16:15:38.269466 10279 net.cpp:434] Conv2d_3_pool <- MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split_0
I0726 16:15:38.269474 10279 net.cpp:408] Conv2d_3_pool -> Conv2d_3_pool
I0726 16:15:38.269488 10279 net.cpp:150] Setting up Conv2d_3_pool
I0726 16:15:38.269496 10279 net.cpp:157] Top shape: 1 96 46 54 (238464)
I0726 16:15:38.269500 10279 net.cpp:165] Memory required for data: 94962720
I0726 16:15:38.269505 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_4_depthwise_depthwise
I0726 16:15:38.269512 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_4_depthwise_depthwise
I0726 16:15:38.269517 10279 net.cpp:434] MobilenetV1_Conv2d_4_depthwise_depthwise <- MobilenetV1_Conv2d_3_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_3_pointwise_Relu_0_split_1
I0726 16:15:38.269523 10279 net.cpp:408] MobilenetV1_Conv2d_4_depthwise_depthwise -> MobilenetV1_Conv2d_4_depthwise_depthwise
I0726 16:15:38.269541 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_4_depthwise_depthwise
I0726 16:15:38.269546 10279 net.cpp:157] Top shape: 1 96 45 53 (228960)
I0726 16:15:38.269551 10279 net.cpp:165] Memory required for data: 95878560
I0726 16:15:38.277405 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_4_pointwise_Conv2D
I0726 16:15:38.277427 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_4_pointwise_Conv2D
I0726 16:15:38.277434 10279 net.cpp:434] MobilenetV1_Conv2d_4_pointwise_Conv2D <- MobilenetV1_Conv2d_4_depthwise_depthwise
I0726 16:15:38.277443 10279 net.cpp:408] MobilenetV1_Conv2d_4_pointwise_Conv2D -> MobilenetV1_Conv2d_4_pointwise_Conv2D
I0726 16:15:38.277483 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_4_pointwise_Conv2D
I0726 16:15:38.277494 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277499 10279 net.cpp:165] Memory required for data: 97710240
I0726 16:15:38.277505 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277513 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277518 10279 net.cpp:434] MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_4_pointwise_Conv2D
I0726 16:15:38.277525 10279 net.cpp:408] MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277544 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277550 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277554 10279 net.cpp:165] Memory required for data: 99541920
I0726 16:15:38.277562 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277571 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277575 10279 net.cpp:434] MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277582 10279 net.cpp:395] MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.277595 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277612 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277621 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277624 10279 net.cpp:165] Memory required for data: 101373600
I0726 16:15:38.277637 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_4_pointwise_Relu
I0726 16:15:38.277645 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_4_pointwise_Relu
I0726 16:15:38.277650 10279 net.cpp:434] MobilenetV1_Conv2d_4_pointwise_Relu <- MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277657 10279 net.cpp:395] MobilenetV1_Conv2d_4_pointwise_Relu -> MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.277664 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_4_pointwise_Relu
I0726 16:15:38.277670 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277675 10279 net.cpp:165] Memory required for data: 103205280
I0726 16:15:38.277679 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_5_depthwise_depthwise
I0726 16:15:38.277688 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_5_depthwise_depthwise
I0726 16:15:38.277693 10279 net.cpp:434] MobilenetV1_Conv2d_5_depthwise_depthwise <- MobilenetV1_Conv2d_4_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277699 10279 net.cpp:408] MobilenetV1_Conv2d_5_depthwise_depthwise -> MobilenetV1_Conv2d_5_depthwise_depthwise
I0726 16:15:38.277716 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_5_depthwise_depthwise
I0726 16:15:38.277724 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277727 10279 net.cpp:165] Memory required for data: 105036960
I0726 16:15:38.277734 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_5_pointwise_Conv2D
I0726 16:15:38.277740 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_5_pointwise_Conv2D
I0726 16:15:38.277746 10279 net.cpp:434] MobilenetV1_Conv2d_5_pointwise_Conv2D <- MobilenetV1_Conv2d_5_depthwise_depthwise
I0726 16:15:38.277752 10279 net.cpp:408] MobilenetV1_Conv2d_5_pointwise_Conv2D -> MobilenetV1_Conv2d_5_pointwise_Conv2D
I0726 16:15:38.277793 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_5_pointwise_Conv2D
I0726 16:15:38.277803 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277807 10279 net.cpp:165] Memory required for data: 106868640
I0726 16:15:38.277814 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277822 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277827 10279 net.cpp:434] MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_5_pointwise_Conv2D
I0726 16:15:38.277834 10279 net.cpp:408] MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277851 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277858 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277863 10279 net.cpp:165] Memory required for data: 108700320
I0726 16:15:38.277869 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277878 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277884 10279 net.cpp:434] MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277889 10279 net.cpp:395] MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.277900 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277917 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.277925 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277930 10279 net.cpp:165] Memory required for data: 110532000
I0726 16:15:38.277936 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_5_pointwise_Relu
I0726 16:15:38.277943 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_5_pointwise_Relu
I0726 16:15:38.277948 10279 net.cpp:434] MobilenetV1_Conv2d_5_pointwise_Relu <- MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.277954 10279 net.cpp:395] MobilenetV1_Conv2d_5_pointwise_Relu -> MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.277961 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_5_pointwise_Relu
I0726 16:15:38.277967 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.277972 10279 net.cpp:165] Memory required for data: 112363680
I0726 16:15:38.277976 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_6_depthwise_depthwise
I0726 16:15:38.277992 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_6_depthwise_depthwise
I0726 16:15:38.277997 10279 net.cpp:434] MobilenetV1_Conv2d_6_depthwise_depthwise <- MobilenetV1_Conv2d_5_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.278003 10279 net.cpp:408] MobilenetV1_Conv2d_6_depthwise_depthwise -> MobilenetV1_Conv2d_6_depthwise_depthwise
I0726 16:15:38.278020 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_6_depthwise_depthwise
I0726 16:15:38.278028 10279 net.cpp:157] Top shape: 1 192 45 53 (457920)
I0726 16:15:38.278033 10279 net.cpp:165] Memory required for data: 114195360
I0726 16:15:38.278038 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_6_pointwise_Conv2D
I0726 16:15:38.278046 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_6_pointwise_Conv2D
I0726 16:15:38.278051 10279 net.cpp:434] MobilenetV1_Conv2d_6_pointwise_Conv2D <- MobilenetV1_Conv2d_6_depthwise_depthwise
I0726 16:15:38.278057 10279 net.cpp:408] MobilenetV1_Conv2d_6_pointwise_Conv2D -> MobilenetV1_Conv2d_6_pointwise_Conv2D
I0726 16:15:38.281857 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_6_pointwise_Conv2D
I0726 16:15:38.281893 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.281905 10279 net.cpp:165] Memory required for data: 117858720
I0726 16:15:38.281918 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.281934 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.281947 10279 net.cpp:434] MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_6_pointwise_Conv2D
I0726 16:15:38.281965 10279 net.cpp:408] MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282002 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282017 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282021 10279 net.cpp:165] Memory required for data: 121522080
I0726 16:15:38.282032 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282044 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282052 10279 net.cpp:434] MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282061 10279 net.cpp:395] MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.282078 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282101 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282111 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282117 10279 net.cpp:165] Memory required for data: 125185440
I0726 16:15:38.282138 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_6_pointwise_Relu
I0726 16:15:38.282146 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_6_pointwise_Relu
I0726 16:15:38.282151 10279 net.cpp:434] MobilenetV1_Conv2d_6_pointwise_Relu <- MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282160 10279 net.cpp:395] MobilenetV1_Conv2d_6_pointwise_Relu -> MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.282167 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_6_pointwise_Relu
I0726 16:15:38.282174 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282179 10279 net.cpp:165] Memory required for data: 128848800
I0726 16:15:38.282184 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_depthwise_depthwise
I0726 16:15:38.282193 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_7_depthwise_depthwise
I0726 16:15:38.282199 10279 net.cpp:434] MobilenetV1_Conv2d_7_depthwise_depthwise <- MobilenetV1_Conv2d_6_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282207 10279 net.cpp:408] MobilenetV1_Conv2d_7_depthwise_depthwise -> MobilenetV1_Conv2d_7_depthwise_depthwise
I0726 16:15:38.282225 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_7_depthwise_depthwise
I0726 16:15:38.282233 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282238 10279 net.cpp:165] Memory required for data: 132512160
I0726 16:15:38.282244 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_pointwise_Conv2D
I0726 16:15:38.282264 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_7_pointwise_Conv2D
I0726 16:15:38.282271 10279 net.cpp:434] MobilenetV1_Conv2d_7_pointwise_Conv2D <- MobilenetV1_Conv2d_7_depthwise_depthwise
I0726 16:15:38.282280 10279 net.cpp:408] MobilenetV1_Conv2d_7_pointwise_Conv2D -> MobilenetV1_Conv2d_7_pointwise_Conv2D
I0726 16:15:38.282452 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_7_pointwise_Conv2D
I0726 16:15:38.282464 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282469 10279 net.cpp:165] Memory required for data: 136175520
I0726 16:15:38.282475 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282483 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282490 10279 net.cpp:434] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_7_pointwise_Conv2D
I0726 16:15:38.282500 10279 net.cpp:408] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282519 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282527 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282531 10279 net.cpp:165] Memory required for data: 139838880
I0726 16:15:38.282539 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282546 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282553 10279 net.cpp:434] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282562 10279 net.cpp:395] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.282575 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282595 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.282605 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282611 10279 net.cpp:165] Memory required for data: 143502240
I0726 16:15:38.282619 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_pointwise_Relu
I0726 16:15:38.282629 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_7_pointwise_Relu
I0726 16:15:38.282634 10279 net.cpp:434] MobilenetV1_Conv2d_7_pointwise_Relu <- MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282642 10279 net.cpp:395] MobilenetV1_Conv2d_7_pointwise_Relu -> MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.282651 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_7_pointwise_Relu
I0726 16:15:38.282660 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282665 10279 net.cpp:165] Memory required for data: 147165600
I0726 16:15:38.282670 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split
I0726 16:15:38.282680 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split
I0726 16:15:38.282685 10279 net.cpp:434] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split <- MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.282694 10279 net.cpp:408] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split -> MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split_0
I0726 16:15:38.282704 10279 net.cpp:408] MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split -> MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split_1
I0726 16:15:38.282716 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split
I0726 16:15:38.282724 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282730 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282737 10279 net.cpp:165] Memory required for data: 154492320
I0726 16:15:38.282742 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_8_depthwise_depthwise
I0726 16:15:38.282752 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_8_depthwise_depthwise
I0726 16:15:38.282759 10279 net.cpp:434] MobilenetV1_Conv2d_8_depthwise_depthwise <- MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split_0
I0726 16:15:38.282768 10279 net.cpp:408] MobilenetV1_Conv2d_8_depthwise_depthwise -> MobilenetV1_Conv2d_8_depthwise_depthwise
I0726 16:15:38.282788 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_8_depthwise_depthwise
I0726 16:15:38.282799 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.282804 10279 net.cpp:165] Memory required for data: 158155680
I0726 16:15:38.282811 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_8_pointwise_Conv2D
I0726 16:15:38.282843 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_8_pointwise_Conv2D
I0726 16:15:38.282853 10279 net.cpp:434] MobilenetV1_Conv2d_8_pointwise_Conv2D <- MobilenetV1_Conv2d_8_depthwise_depthwise
I0726 16:15:38.282862 10279 net.cpp:408] MobilenetV1_Conv2d_8_pointwise_Conv2D -> MobilenetV1_Conv2d_8_pointwise_Conv2D
I0726 16:15:38.283036 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_8_pointwise_Conv2D
I0726 16:15:38.283051 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283056 10279 net.cpp:165] Memory required for data: 161819040
I0726 16:15:38.283064 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283074 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283082 10279 net.cpp:434] MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_8_pointwise_Conv2D
I0726 16:15:38.283090 10279 net.cpp:408] MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283110 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283119 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283125 10279 net.cpp:165] Memory required for data: 165482400
I0726 16:15:38.283135 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283144 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283150 10279 net.cpp:434] MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283159 10279 net.cpp:395] MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.283172 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283195 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283205 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283210 10279 net.cpp:165] Memory required for data: 169145760
I0726 16:15:38.283218 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_8_pointwise_Relu
I0726 16:15:38.283228 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_8_pointwise_Relu
I0726 16:15:38.283234 10279 net.cpp:434] MobilenetV1_Conv2d_8_pointwise_Relu <- MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283242 10279 net.cpp:395] MobilenetV1_Conv2d_8_pointwise_Relu -> MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.283252 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_8_pointwise_Relu
I0726 16:15:38.283260 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283265 10279 net.cpp:165] Memory required for data: 172809120
I0726 16:15:38.283272 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_9_depthwise_depthwise
I0726 16:15:38.283282 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_9_depthwise_depthwise
I0726 16:15:38.283288 10279 net.cpp:434] MobilenetV1_Conv2d_9_depthwise_depthwise <- MobilenetV1_Conv2d_8_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283296 10279 net.cpp:408] MobilenetV1_Conv2d_9_depthwise_depthwise -> MobilenetV1_Conv2d_9_depthwise_depthwise
I0726 16:15:38.283325 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_9_depthwise_depthwise
I0726 16:15:38.283335 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283341 10279 net.cpp:165] Memory required for data: 176472480
I0726 16:15:38.283349 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_9_pointwise_Conv2D
I0726 16:15:38.283360 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_9_pointwise_Conv2D
I0726 16:15:38.283366 10279 net.cpp:434] MobilenetV1_Conv2d_9_pointwise_Conv2D <- MobilenetV1_Conv2d_9_depthwise_depthwise
I0726 16:15:38.283375 10279 net.cpp:408] MobilenetV1_Conv2d_9_pointwise_Conv2D -> MobilenetV1_Conv2d_9_pointwise_Conv2D
I0726 16:15:38.283547 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_9_pointwise_Conv2D
I0726 16:15:38.283560 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283566 10279 net.cpp:165] Memory required for data: 180135840
I0726 16:15:38.283574 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283584 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283591 10279 net.cpp:434] MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_9_pointwise_Conv2D
I0726 16:15:38.283601 10279 net.cpp:408] MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283620 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283628 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283634 10279 net.cpp:165] Memory required for data: 183799200
I0726 16:15:38.283655 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283668 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283674 10279 net.cpp:434] MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283680 10279 net.cpp:395] MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.283694 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283715 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm_scale
I0726 16:15:38.283723 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283730 10279 net.cpp:165] Memory required for data: 187462560
I0726 16:15:38.283740 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_9_pointwise_Relu
I0726 16:15:38.283748 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_9_pointwise_Relu
I0726 16:15:38.283756 10279 net.cpp:434] MobilenetV1_Conv2d_9_pointwise_Relu <- MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm
I0726 16:15:38.283763 10279 net.cpp:395] MobilenetV1_Conv2d_9_pointwise_Relu -> MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm (in-place)
I0726 16:15:38.283772 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_9_pointwise_Relu
I0726 16:15:38.283782 10279 net.cpp:157] Top shape: 1 384 45 53 (915840)
I0726 16:15:38.283787 10279 net.cpp:165] Memory required for data: 191125920
I0726 16:15:38.283793 10279 layer_factory.hpp:77] Creating
layer MobilenetV1_Conv2d_10_depthwise_depthwise I0726 16:15:38.283802 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_10_depthwise_depthwise I0726 16:15:38.283809 10279 net.cpp:434] MobilenetV1_Conv2d_10_depthwise_depthwise <- MobilenetV1_Conv2d_9_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.283818 10279 net.cpp:408] MobilenetV1_Conv2d_10_depthwise_depthwise -> MobilenetV1_Conv2d_10_depthwise_depthwise I0726 16:15:38.283838 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_10_depthwise_depthwise I0726 16:15:38.283848 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.283852 10279 net.cpp:165] Memory required for data: 194789280 I0726 16:15:38.283859 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_10_pointwise_Conv2D I0726 16:15:38.283865 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_10_pointwise_Conv2D I0726 16:15:38.283872 10279 net.cpp:434] MobilenetV1_Conv2d_10_pointwise_Conv2D <- MobilenetV1_Conv2d_10_depthwise_depthwise I0726 16:15:38.283881 10279 net.cpp:408] MobilenetV1_Conv2d_10_pointwise_Conv2D -> MobilenetV1_Conv2d_10_pointwise_Conv2D I0726 16:15:38.284054 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_10_pointwise_Conv2D I0726 16:15:38.284066 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284072 10279 net.cpp:165] Memory required for data: 198452640 I0726 16:15:38.284080 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284090 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284097 10279 net.cpp:434] MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_10_pointwise_Conv2D I0726 16:15:38.284106 10279 net.cpp:408] MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284127 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 
16:15:38.284135 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284142 10279 net.cpp:165] Memory required for data: 202116000 I0726 16:15:38.284158 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284168 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284175 10279 net.cpp:434] MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284184 10279 net.cpp:395] MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm (in-place) I0726 16:15:38.284198 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284219 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284229 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284235 10279 net.cpp:165] Memory required for data: 205779360 I0726 16:15:38.284243 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_10_pointwise_Relu I0726 16:15:38.284253 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_10_pointwise_Relu I0726 16:15:38.284260 10279 net.cpp:434] MobilenetV1_Conv2d_10_pointwise_Relu <- MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284267 10279 net.cpp:395] MobilenetV1_Conv2d_10_pointwise_Relu -> MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm (in-place) I0726 16:15:38.284276 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_10_pointwise_Relu I0726 16:15:38.284284 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284291 10279 net.cpp:165] Memory required for data: 209442720 I0726 16:15:38.284296 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_11_depthwise_depthwise I0726 16:15:38.284307 10279 net.cpp:100] 
Creating Layer MobilenetV1_Conv2d_11_depthwise_depthwise I0726 16:15:38.284313 10279 net.cpp:434] MobilenetV1_Conv2d_11_depthwise_depthwise <- MobilenetV1_Conv2d_10_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284322 10279 net.cpp:408] MobilenetV1_Conv2d_11_depthwise_depthwise -> MobilenetV1_Conv2d_11_depthwise_depthwise I0726 16:15:38.284343 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_11_depthwise_depthwise I0726 16:15:38.284353 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284358 10279 net.cpp:165] Memory required for data: 213106080 I0726 16:15:38.284365 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_11_pointwise_Conv2D I0726 16:15:38.284375 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_11_pointwise_Conv2D I0726 16:15:38.284382 10279 net.cpp:434] MobilenetV1_Conv2d_11_pointwise_Conv2D <- MobilenetV1_Conv2d_11_depthwise_depthwise I0726 16:15:38.284391 10279 net.cpp:408] MobilenetV1_Conv2d_11_pointwise_Conv2D -> MobilenetV1_Conv2d_11_pointwise_Conv2D I0726 16:15:38.284569 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_11_pointwise_Conv2D I0726 16:15:38.284580 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284586 10279 net.cpp:165] Memory required for data: 216769440 I0726 16:15:38.284595 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284605 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284611 10279 net.cpp:434] MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm <- MobilenetV1_Conv2d_11_pointwise_Conv2D I0726 16:15:38.284621 10279 net.cpp:408] MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm -> MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284641 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284649 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284656 
10279 net.cpp:165] Memory required for data: 220432800 I0726 16:15:38.284665 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284675 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284682 10279 net.cpp:434] MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale <- MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284689 10279 net.cpp:395] MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale -> MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm (in-place) I0726 16:15:38.284703 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284723 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm_scale I0726 16:15:38.284732 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284737 10279 net.cpp:165] Memory required for data: 224096160 I0726 16:15:38.284746 10279 layer_factory.hpp:77] Creating layer MobilenetV1_Conv2d_11_pointwise_Relu I0726 16:15:38.284754 10279 net.cpp:100] Creating Layer MobilenetV1_Conv2d_11_pointwise_Relu I0726 16:15:38.284760 10279 net.cpp:434] MobilenetV1_Conv2d_11_pointwise_Relu <- MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284770 10279 net.cpp:395] MobilenetV1_Conv2d_11_pointwise_Relu -> MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm (in-place) I0726 16:15:38.284777 10279 net.cpp:150] Setting up MobilenetV1_Conv2d_11_pointwise_Relu I0726 16:15:38.284785 10279 net.cpp:157] Top shape: 1 384 45 53 (915840) I0726 16:15:38.284791 10279 net.cpp:165] Memory required for data: 227759520 I0726 16:15:38.284797 10279 layer_factory.hpp:77] Creating layer feat_concat I0726 16:15:38.284807 10279 net.cpp:100] Creating Layer feat_concat I0726 16:15:38.284813 10279 net.cpp:434] feat_concat <- Conv2d_3_pool I0726 16:15:38.284821 10279 
net.cpp:434] feat_concat <- MobilenetV1_Conv2d_7_pointwise_BatchNorm_FusedBatchNorm_MobilenetV1_Conv2d_7_pointwise_Relu_0_split_1 I0726 16:15:38.284829 10279 net.cpp:434] feat_concat <- MobilenetV1_Conv2d_11_pointwise_BatchNorm_FusedBatchNorm I0726 16:15:38.284837 10279 net.cpp:408] feat_concat -> feat_concat F0726 16:15:38.284852 10279 concat_layer.cpp:42] Check failed: top_shape[j] == bottom[i]->shape(j) (46 vs. 45) All inputs must have the same shape, except at concat_axis. *** Check failure stack trace: ***

iChiaGuo avatar Jul 26 '18 10:07 iChiaGuo

Hi @iChiaGuo, the crash when converting TensorFlow MobileNet models to Caffe is a known issue; you can refer to the test to reproduce it. We don't yet know the cause of the crash and will look into it when we have bandwidth. Thanks.
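For context on the "46 vs. 45" check failure: TensorFlow's SAME padding, Caffe's Convolution layer, and Caffe's Pooling layer each round output sizes differently, so a converted graph can feed the concat layer blobs of mismatched height. A minimal sketch of the arithmetic, assuming a hypothetical 92-pixel feature map entering a stride-2, 3x3 stage (the value is for illustration, not taken from the model):

```python
import math

def tf_same_out(n, stride):
    # TensorFlow SAME padding: output = ceil(input / stride),
    # achieved with asymmetric padding when needed.
    return math.ceil(n / stride)

def caffe_conv_out(n, kernel, stride, pad):
    # Caffe Convolution rounds down (floor).
    return (n + 2 * pad - kernel) // stride + 1

def caffe_pool_out(n, kernel, stride, pad):
    # Caffe Pooling rounds up (ceil).
    return math.ceil((n + 2 * pad - kernel) / stride) + 1

n = 92  # hypothetical feature-map height
print(tf_same_out(n, 2))           # 46 in TensorFlow
print(caffe_conv_out(n, 3, 2, 0))  # 45 in Caffe (converter emitted pad 0)
print(caffe_pool_out(n, 3, 2, 0))  # 46 in Caffe's pooling branch
```

With pad 0, the convolution branch shrinks to 45 while a parallel pooling branch stays at 46, which matches the shapes in the failed check.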

kitstar avatar Jul 27 '18 09:07 kitstar

Hi, I am facing a different issue with OpenPose using a VGG base network. Can you suggest what the cause might be? Please find the error I receive below.

Traceback (most recent call last):
  File "/usr/local/bin/mmconvert", line 11, in <module>
    sys.exit(_main())
  File "/usr/local/lib/python3.5/dist-packages/mmdnn/conversion/_script/convert.py", line 102, in _main
    ret = convertToIR._convert(ir_args)
  File "/usr/local/lib/python3.5/dist-packages/mmdnn/conversion/_script/convertToIR.py", line 117, in _convert
    parser.run(args.dstPath)
  File "/usr/local/lib/python3.5/dist-packages/mmdnn/conversion/common/DataStructure/parser.py", line 22, in run
    self.gen_IR()
  File "/usr/local/lib/python3.5/dist-packages/mmdnn/conversion/tensorflow/tensorflow_frozenparser.py", line 360, in gen_IR
    func(current_node)
  File "/usr/local/lib/python3.5/dist-packages/mmdnn/conversion/tensorflow/tensorflow_frozenparser.py", line 668, in rename_Sub
    if scopes[-2].startswith('Assign') or scopes[-1].startswith('Assign'):
IndexError: list index out of range
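The IndexError occurs when the node's scope path has fewer than two components, so indexing `scopes[-2]` falls off the list. A minimal reproduction of the pattern (the scope-splitting logic is assumed from the traceback, not copied from MMdnn's source):

```python
def is_assign_node(node_name):
    # rename_Sub inspects the last two scope components of the op name;
    # a name with a single component makes scopes[-2] raise IndexError.
    scopes = node_name.split('/')
    return scopes[-2].startswith('Assign') or scopes[-1].startswith('Assign')

is_assign_node('MobilenetV1/Assign_1/Sub')  # two or more components: works
try:
    is_assign_node('sub')                   # single component
except IndexError as e:
    print(e)                                # list index out of range
```

A guard such as `len(scopes) >= 2 and ...` avoids the crash, which is presumably the kind of fix the maintainers applied.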

shreyasrajesh avatar Aug 22 '18 13:08 shreyasrajesh

Hi @shreyasrajesh, thanks for reporting this issue; we have fixed it!

namizzz avatar Aug 23 '18 08:08 namizzz

Hi, I also hit the TensorFlow-to-Caffe issue with the error 'All inputs must have the same shape, except at concat_axis'. Has it been fixed yet?

jycccccccc avatar Oct 15 '18 04:10 jycccccccc

> Hi @shreyasrajesh, thanks for reporting this issue; we have fixed it!

How did you solve it? Thanks.

starsky68 avatar Dec 28 '21 03:12 starsky68