
difference from the standard caffe

Open wishinger-li opened this issue 6 years ago • 8 comments

Is the Caffe used in this project different from standard Caffe? (Did you add some custom layers?) Can I replace the Caffe in this project with the newest Caffe?

wishinger-li avatar Apr 04 '18 05:04 wishinger-li

@wishinger-li There are customized layers such as data layer and tiling layer.

SeokjuLee avatar Apr 04 '18 06:04 SeokjuLee
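For readers unfamiliar with the fork: the custom Tiling layer rearranges channels into spatial tiles (a depth-to-space operation), which is how the low-resolution grid outputs get back to label resolution. A minimal NumPy sketch of the assumed `(N, C*t*t, H, W) -> (N, C, H*t, W*t)` layout — the real layer lives in the fork's C++ sources, so treat this as illustrative only:

```python
import numpy as np

def tile(x: np.ndarray, tile_dim: int) -> np.ndarray:
    """Depth-to-space rearrangement, assumed analogous to the custom Tiling layer:
    (N, C*t*t, H, W) -> (N, C, H*t, W*t)."""
    n, c, h, w = x.shape
    t = tile_dim
    assert c % (t * t) == 0, "channel count must be divisible by tile_dim^2"
    x = x.reshape(n, c // (t * t), t, t, h, w)
    x = x.transpose(0, 1, 4, 2, 5, 3)          # (N, C', H, t, W, t)
    return x.reshape(n, c // (t * t), h * t, w * t)

# e.g. a 128-channel 15x20 "pixel-conv" map with tile_dim 8
x = np.zeros((1, 128, 15, 20), dtype=np.float32)
print(tile(x, 8).shape)  # -> (1, 2, 120, 160)
```

With `tile_dim: 8`, the 128-channel 15×20 `pixel-conv` output becomes a 2-channel 120×160 map, matching the 120×160 `pixel-label` blob that `SoftmaxWithLoss` compares it against.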

I have added an upsampling layer. I will release it.

hexiangquan avatar Apr 19 '18 05:04 hexiangquan

@hexiangquan Have you released it?

daixiaogang avatar Apr 23 '18 12:04 daixiaogang

@SeokjuLee @wishinger-li Did you compile this Caffe successfully? My build always fails with an error like this:

```
./include/caffe/layers/pooling_layer.hpp:54:3: error: ‘PoolingParameter_RoundMode’ does not name a type
   PoolingParameter_RoundMode round_mode_;
   ^
Makefile:524: recipe for target '.build_release/src/caffe/layers/cudnn_pooling_layer.o' failed
make: *** [.build_release/src/caffe/layers/cudnn_pooling_layer.o] Error 1
```

Can anyone help me?

screamdw avatar Mar 13 '19 12:03 screamdw

@screamdw Did you uncomment `# USE_CUDNN := 1` in Makefile.config?

SeokjuLee avatar Mar 13 '19 13:03 SeokjuLee
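(For anyone hitting the same error: the suggestion above amounts to a one-line edit of Makefile.config before rebuilding. A small hypothetical Python helper for that edit, assuming a BVLC-style Makefile.config — editing the file by hand works just as well:)

```python
import re

def enable_cudnn(config_text: str) -> str:
    """Uncomment '# USE_CUDNN := 1' in a BVLC-style Makefile.config."""
    return re.sub(r"^#\s*USE_CUDNN\s*:=\s*1", "USE_CUDNN := 1",
                  config_text, flags=re.M)

# Typical use on a checkout (then rebuild with `make clean && make -j8`):
# path = "Makefile.config"
# open(path, "w").write(enable_cudnn(open(path).read()))
```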

@SeokjuLee Can you help me? I compiled it successfully, and make_lmdb.sh runs without errors, but when I run train.sh I get:

```
➜ vpgnet-novp bash train.sh
train.sh: line 1:  6452 Segmentation fault (core dumped) ../../build/tools/caffe train --solver=./solver.prototxt >> ./output/output.log 2>&1
```

The contents of output.log are as follows:

```
I0314 15:42:40.633046  6452 caffe.cpp:183] Using GPUs 0
I0314 15:42:40.858393  6452 solver.cpp:54] Initializing solver from parameters:
test_iter: 20 test_interval: 100 base_lr: 0.005 display: 10 max_iter: 100000 lr_policy: "step" gamma: 0.1 momentum: 0.9 weight_decay: 0.0005 stepsize: 100000 snapshot: 2500 snapshot_prefix: "./snapshots/split" solver_mode: GPU device_id: 0 test_compute_loss: true net: "./train_val.prototxt"
I0314 15:42:40.858433  6452 solver.cpp:96] Creating training net from net file: ./train_val.prototxt
I0314 15:42:40.858779  6452 net.cpp:339] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
I0314 15:42:40.858803  6452 net.cpp:339] The NetState phase (0) differed from the phase (1) specified by a rule in layer pixel-acc
I0314 15:42:40.858808  6452 net.cpp:339] The NetState phase (0) differed from the phase (1) specified by a rule in layer type-acc
I0314 15:42:40.858986  6452 net.cpp:50] Initializing net from parameters:
name: "VPGNet-noVP"
state { phase: TRAIN }
layer { name: "data" type: "DriveData" top: "data" top: "label" top: "type" include { phase: TRAIN } transform_param { mean_file: "./driving_mean_train.binaryproto" } data_param { source: "./LMDB_train" batch_size: 24 backend: LMDB } drive_data_param { shrink_prob_factor: 1 unrecognize_factor: 0 crop_num: 1 random_crop_ratio: 1 resize: 1 scale: 1 catalog_resolution: 4 reco_min: 4 train_min: 4 } }
layer { name: "slice-label" type: "Slice" bottom: "label" top: "pixel-label" top: "bb-label" top: "size-label" top: "norm-label" slice_param { slice_dim: 1 slice_point: 1 slice_point: 5 slice_point: 7 } }
layer { name: "pixel-block" type: "Concat" bottom: "pixel-label" bottom: "pixel-label" bottom: "pixel-label" bottom: "pixel-label" top: "pixel-block" concat_param { concat_dim: 1 } }
layer { name: "size-block" type: "Concat" bottom: "size-label" bottom: "size-label" top: "size-block" concat_param { concat_dim: 1 } }
layer { name: "norm-block" type: "Concat" bottom: "norm-label" bottom: "norm-label" bottom: "norm-label" bottom: "norm-label" top: "norm-block" concat_param { concat_dim: 1 } }
layer { name: "bb-label-size-normalization" type: "Eltwise" bottom: "bb-label" bottom: "size-block" top: "bb-label-sn" eltwise_param { operation: PROD } }
layer { name: "bb-label-num-pixel-normalization" type: "Eltwise" bottom: "bb-label-sn" bottom: "norm-block" top: "bb-label-sn-nn" eltwise_param { operation: PROD } }
layer { name: "L0" type: "Convolution" bottom: "data" top: "L0" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 96 kernel_size: 11 stride: 4 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } }
layer { name: "relu1" type: "ReLU" bottom: "L0" top: "L0" }
layer { name: "norm1" type: "LRN" bottom: "L0" top: "norm1" lrn_param { local_size: 5 alpha: 0.0005 beta: 0.75 k: 2 } }
layer { name: "pool1" type: "Pooling" bottom: "norm1" top: "pool1" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "L1" type: "Convolution" bottom: "pool1" top: "L1" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 256 pad: 2 kernel_size: 5 group: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } }
layer { name: "relu2" type: "ReLU" bottom: "L1" top: "L1" }
layer { name: "norm2" type: "LRN" bottom: "L1" top: "norm2" lrn_param { local_size: 5 alpha: 0.0005 beta: 0.75 k: 8 } }
layer { name: "pool2" type: "Pooling" bottom: "norm2" top: "pool2" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "L2" type: "Convolution" bottom: "pool2" top: "L2" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu3" type: "ReLU" bottom: "L2" top: "L2" }
layer { name: "L3" type: "Convolution" bottom: "L2" top: "L3" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu4" type: "ReLU" bottom: "L3" top: "L3" }
layer { name: "L4" type: "Convolution" bottom: "L3" top: "L4" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 384 pad: 1 kernel_size: 3 group: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0.1 } } }
layer { name: "relu5" type: "ReLU" bottom: "L4" top: "L4" }
layer { name: "pool5" type: "Pooling" bottom: "L4" top: "pool5" pooling_param { pool: MAX kernel_size: 3 stride: 2 } }
layer { name: "L5" type: "Convolution" bottom: "pool5" top: "L5" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 4096 pad: 3 kernel_size: 6 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } }
layer { name: "relu6" type: "ReLU" bottom: "L5" top: "L5" }
layer { name: "drop6" type: "Dropout" bottom: "L5" top: "L5" dropout_param { dropout_ratio: 0.5 } }
layer { name: "L6a" type: "Convolution" bottom: "L5" top: "L6a" param { lr_mult: 5 decay_mult: 0.1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 4096 kernel_size: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } }
layer { name: "relu7a" type: "ReLU" bottom: "L6a" top: "L6a" }
layer { name: "drop7a" type: "Dropout" bottom: "L6a" top: "L6a" dropout_param { dropout_ratio: 0.5 } }
layer { name: "L6b" type: "Convolution" bottom: "L5" top: "L6b" param { lr_mult: 5 decay_mult: 0.1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 4096 kernel_size: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } }
layer { name: "relu7b" type: "ReLU" bottom: "L6b" top: "L6b" }
layer { name: "drop7b" type: "Dropout" bottom: "L6b" top: "L6b" dropout_param { dropout_ratio: 0.5 } }
layer { name: "L6c" type: "Convolution" bottom: "L5" top: "L6c" param { lr_mult: 5 decay_mult: 0.1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 4096 kernel_size: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 1 } } }
layer { name: "relu7c" type: "ReLU" bottom: "L6c" top: "L6c" }
layer { name: "drop7c" type: "Dropout" bottom: "L6c" top: "L6c" dropout_param { dropout_ratio: 0.5 } }
layer { name: "bb-output" type: "Convolution" bottom: "L6a" top: "bb-output" param { lr_mult: 20 decay_mult: 0.1 } param { lr_mult: 20 decay_mult: 0 } convolution_param { num_output: 256 kernel_size: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0 } } }
layer { name: "pixel-conv" type: "Convolution" bottom: "L6b" top: "pixel-conv" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 128 kernel_size: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0 } } }
layer { name: "type-conv" type: "Convolution" bottom: "L6c" top: "type-conv" param { lr_mult: 1 decay_mult: 1 } param { lr_mult: 2 decay_mult: 0 } convolution_param { num_output: 1024 kernel_size: 1 weight_filler { type: "gaussian" std: 0.01 } bias_filler { type: "constant" value: 0 } } }
layer { name: "pixel-tile" type: "Tiling" bottom: "pixel-conv" top: "pixel-conv-tiled" tiling_param { tile_dim: 8 } }
layer { name: "bb-tile" type: "Tiling" bottom: "bb-output" top: "bb-output-tiled" tiling_param { tile_dim: 8 } }
layer { name: "type-tile" type: "Tiling" bottom: "type-conv" top: "type-conv-tiled" tiling_param { tile_dim: 4 } }
layer { name: "pixel-loss" type: "SoftmaxWithLoss" bottom: "pixel-conv-tiled" bottom: "pixel-label" top: "pixel-loss" loss_weight: 1 }
layer { name: "type-loss" type: "SoftmaxWithLoss" bottom: "type-conv-tiled" bottom: "type" top: "type-loss" loss_weight: 1 }
layer { name: "bb-prob-mask" type: "Eltwise" bottom: "bb-output-tiled" bottom: "pixel-block" top: "bb-masked-output" eltwise_param { operation: PROD } }
layer { name: "bb-size-normalization" type: "Eltwise" bottom: "bb-masked-output" bottom: "size-block" top: "bb-masked-output-sn" eltwise_param { operation: PROD } }
layer { name: "bb-num-pixel-normalization" type: "Eltwise" bottom: "bb-masked-output-sn" bottom: "norm-block" top: "bb-masked-output-sn-nn" eltwise_param { operation: PROD } }
layer { name: "bb-loss" type: "L1Loss" bottom: "bb-masked-output-sn-nn" bottom: "bb-label-sn-nn" top: "bb-loss" loss_weight: 3 }
I0314 15:42:40.859165  6452 layer_factory.hpp:76] Creating layer data
I0314 15:42:40.859302  6452 net.cpp:110] Creating Layer data
I0314 15:42:40.859313  6452 net.cpp:432] data -> data
I0314 15:42:40.859341  6452 net.cpp:432] data -> label
I0314 15:42:40.859350  6452 net.cpp:432] data -> type
I0314 15:42:40.859362  6452 data_transformer.cpp:23] Loading mean file from: ./driving_mean_train.binaryproto
I0314 15:42:40.859751  6456 db_lmdb.cpp:22] Opened lmdb ./LMDB_train
I0314 15:42:40.880087  6452 drive_data_layer.cpp:49] output data size: 24,3,480,640
I0314 15:42:40.983275  6452 net.cpp:155] Setting up data
I0314 15:42:40.983322  6452 net.cpp:163] Top shape: 24 3 480 640 (22118400)
I0314 15:42:40.983328  6452 net.cpp:163] Top shape: 24 8 120 160 (3686400)
I0314 15:42:40.983333  6452 net.cpp:163] Top shape: 24 1 60 80 (115200)
I0314 15:42:40.983340  6452 layer_factory.hpp:76] Creating layer slice-label
I0314 15:42:40.983350  6452 net.cpp:110] Creating Layer slice-label
I0314 15:42:40.983356  6452 net.cpp:476] slice-label <- label
I0314 15:42:40.983366  6452 net.cpp:432] slice-label -> pixel-label
I0314 15:42:40.983376  6452 net.cpp:432] slice-label -> bb-label
I0314 15:42:40.983382  6452 net.cpp:432] slice-label -> size-label
I0314 15:42:40.983387  6452 net.cpp:432] slice-label -> norm-label
I0314 15:42:40.983397  6452 net.cpp:155] Setting up slice-label
I0314 15:42:40.983402  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983405  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983409  6452 net.cpp:163] Top shape: 24 2 120 160 (921600)
I0314 15:42:40.983413  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983417  6452 layer_factory.hpp:76] Creating layer pixel-label_slice-label_0_split
I0314 15:42:40.983422  6452 net.cpp:110] Creating Layer pixel-label_slice-label_0_split
I0314 15:42:40.983427  6452 net.cpp:476] pixel-label_slice-label_0_split <- pixel-label
I0314 15:42:40.983438  6452 net.cpp:432] pixel-label_slice-label_0_split -> pixel-label_slice-label_0_split_0
I0314 15:42:40.983450  6452 net.cpp:432] pixel-label_slice-label_0_split -> pixel-label_slice-label_0_split_1
I0314 15:42:40.983458  6452 net.cpp:432] pixel-label_slice-label_0_split -> pixel-label_slice-label_0_split_2
I0314 15:42:40.983464  6452 net.cpp:432] pixel-label_slice-label_0_split -> pixel-label_slice-label_0_split_3
I0314 15:42:40.983469  6452 net.cpp:432] pixel-label_slice-label_0_split -> pixel-label_slice-label_0_split_4
I0314 15:42:40.983475  6452 net.cpp:155] Setting up pixel-label_slice-label_0_split
I0314 15:42:40.983479  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983484  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983487  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983491  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983495  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983500  6452 layer_factory.hpp:76] Creating layer size-label_slice-label_2_split
I0314 15:42:40.983503  6452 net.cpp:110] Creating Layer size-label_slice-label_2_split
I0314 15:42:40.983507  6452 net.cpp:476] size-label_slice-label_2_split <- size-label
I0314 15:42:40.983511  6452 net.cpp:432] size-label_slice-label_2_split -> size-label_slice-label_2_split_0
I0314 15:42:40.983517  6452 net.cpp:432] size-label_slice-label_2_split -> size-label_slice-label_2_split_1
I0314 15:42:40.983522  6452 net.cpp:155] Setting up size-label_slice-label_2_split
I0314 15:42:40.983527  6452 net.cpp:163] Top shape: 24 2 120 160 (921600)
I0314 15:42:40.983531  6452 net.cpp:163] Top shape: 24 2 120 160 (921600)
I0314 15:42:40.983534  6452 layer_factory.hpp:76] Creating layer norm-label_slice-label_3_split
I0314 15:42:40.983539  6452 net.cpp:110] Creating Layer norm-label_slice-label_3_split
I0314 15:42:40.983542  6452 net.cpp:476] norm-label_slice-label_3_split <- norm-label
I0314 15:42:40.983547  6452 net.cpp:432] norm-label_slice-label_3_split -> norm-label_slice-label_3_split_0
I0314 15:42:40.983552  6452 net.cpp:432] norm-label_slice-label_3_split -> norm-label_slice-label_3_split_1
I0314 15:42:40.983558  6452 net.cpp:432] norm-label_slice-label_3_split -> norm-label_slice-label_3_split_2
I0314 15:42:40.983566  6452 net.cpp:432] norm-label_slice-label_3_split -> norm-label_slice-label_3_split_3
I0314 15:42:40.983572  6452 net.cpp:155] Setting up norm-label_slice-label_3_split
I0314 15:42:40.983575  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983579  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983583  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983587  6452 net.cpp:163] Top shape: 24 1 120 160 (460800)
I0314 15:42:40.983590  6452 layer_factory.hpp:76] Creating layer pixel-block
I0314 15:42:40.983597  6452 net.cpp:110] Creating Layer pixel-block
I0314 15:42:40.983600  6452 net.cpp:476] pixel-block <- pixel-label_slice-label_0_split_0
I0314 15:42:40.983604  6452 net.cpp:476] pixel-block <- pixel-label_slice-label_0_split_1
I0314 15:42:40.983608  6452 net.cpp:476] pixel-block <- pixel-label_slice-label_0_split_2
I0314 15:42:40.983613  6452 net.cpp:476] pixel-block <- pixel-label_slice-label_0_split_3
I0314 15:42:40.983618  6452 net.cpp:432] pixel-block -> pixel-block
I0314 15:42:40.983625  6452 net.cpp:155] Setting up pixel-block
I0314 15:42:40.983629  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983633  6452 layer_factory.hpp:76] Creating layer size-block
I0314 15:42:40.983638  6452 net.cpp:110] Creating Layer size-block
I0314 15:42:40.983641  6452 net.cpp:476] size-block <- size-label_slice-label_2_split_0
I0314 15:42:40.983645  6452 net.cpp:476] size-block <- size-label_slice-label_2_split_1
I0314 15:42:40.983650  6452 net.cpp:432] size-block -> size-block
I0314 15:42:40.983655  6452 net.cpp:155] Setting up size-block
I0314 15:42:40.983660  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983664  6452 layer_factory.hpp:76] Creating layer size-block_size-block_0_split
I0314 15:42:40.983670  6452 net.cpp:110] Creating Layer size-block_size-block_0_split
I0314 15:42:40.983677  6452 net.cpp:476] size-block_size-block_0_split <- size-block
I0314 15:42:40.983682  6452 net.cpp:432] size-block_size-block_0_split -> size-block_size-block_0_split_0
I0314 15:42:40.983688  6452 net.cpp:432] size-block_size-block_0_split -> size-block_size-block_0_split_1
I0314 15:42:40.983693  6452 net.cpp:155] Setting up size-block_size-block_0_split
I0314 15:42:40.983698  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983701  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983705  6452 layer_factory.hpp:76] Creating layer norm-block
I0314 15:42:40.983711  6452 net.cpp:110] Creating Layer norm-block
I0314 15:42:40.983714  6452 net.cpp:476] norm-block <- norm-label_slice-label_3_split_0
I0314 15:42:40.983718  6452 net.cpp:476] norm-block <- norm-label_slice-label_3_split_1
I0314 15:42:40.983723  6452 net.cpp:476] norm-block <- norm-label_slice-label_3_split_2
I0314 15:42:40.983727  6452 net.cpp:476] norm-block <- norm-label_slice-label_3_split_3
I0314 15:42:40.983732  6452 net.cpp:432] norm-block -> norm-block
I0314 15:42:40.983737  6452 net.cpp:155] Setting up norm-block
I0314 15:42:40.983742  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983745  6452 layer_factory.hpp:76] Creating layer norm-block_norm-block_0_split
I0314 15:42:40.983749  6452 net.cpp:110] Creating Layer norm-block_norm-block_0_split
I0314 15:42:40.983753  6452 net.cpp:476] norm-block_norm-block_0_split <- norm-block
I0314 15:42:40.983757  6452 net.cpp:432] norm-block_norm-block_0_split -> norm-block_norm-block_0_split_0
I0314 15:42:40.983762  6452 net.cpp:432] norm-block_norm-block_0_split -> norm-block_norm-block_0_split_1
I0314 15:42:40.983767  6452 net.cpp:155] Setting up norm-block_norm-block_0_split
I0314 15:42:40.983772  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983777  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983779  6452 layer_factory.hpp:76] Creating layer bb-label-size-normalization
I0314 15:42:40.983784  6452 net.cpp:110] Creating Layer bb-label-size-normalization
I0314 15:42:40.983788  6452 net.cpp:476] bb-label-size-normalization <- bb-label
I0314 15:42:40.983793  6452 net.cpp:476] bb-label-size-normalization <- size-block_size-block_0_split_0
I0314 15:42:40.983796  6452 net.cpp:432] bb-label-size-normalization -> bb-label-sn
I0314 15:42:40.983804  6452 net.cpp:155] Setting up bb-label-size-normalization
I0314 15:42:40.983809  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983813  6452 layer_factory.hpp:76] Creating layer bb-label-num-pixel-normalization
I0314 15:42:40.983816  6452 net.cpp:110] Creating Layer bb-label-num-pixel-normalization
I0314 15:42:40.983820  6452 net.cpp:476] bb-label-num-pixel-normalization <- bb-label-sn
I0314 15:42:40.983824  6452 net.cpp:476] bb-label-num-pixel-normalization <- norm-block_norm-block_0_split_0
I0314 15:42:40.983829  6452 net.cpp:432] bb-label-num-pixel-normalization -> bb-label-sn-nn
I0314 15:42:40.983834  6452 net.cpp:155] Setting up bb-label-num-pixel-normalization
I0314 15:42:40.983839  6452 net.cpp:163] Top shape: 24 4 120 160 (1843200)
I0314 15:42:40.983841  6452 layer_factory.hpp:76] Creating layer L0
I0314 15:42:40.983849  6452 net.cpp:110] Creating Layer L0
I0314 15:42:40.983852  6452 net.cpp:476] L0 <- data
I0314 15:42:40.983857  6452 net.cpp:432] L0 -> L0
*** Aborted at 1552549361 (unix time) try "date -d @1552549361" if you are using GNU date ***
PC: @ 0x7f65c6316e7c caffe::CuDNNConvolutionLayer<>::LayerSetUp()
*** SIGSEGV (@0xea) received by PID 6452 (TID 0x7f65c6a09740) from PID 234; stack trace: ***
    @ 0x7f65c4ac4390 (unknown)
    @ 0x7f65c6316e7c caffe::CuDNNConvolutionLayer<>::LayerSetUp()
    @ 0x7f65c63e6614 caffe::Net<>::Init()
    @ 0x7f65c63e8491 caffe::Net<>::Net()
    @ 0x7f65c62644ba caffe::Solver<>::InitTrainNet()
    @ 0x7f65c62657af caffe::Solver<>::Init()
    @ 0x7f65c6265b29 caffe::Solver<>::Solver()
    @ 0x41503d caffe::GetSolver<>()
    @ 0x40b721 train()
    @ 0x408840 main
    @ 0x7f65c4709830 __libc_start_main
    @ 0x408f99 _start
    @ 0x0 (unknown)
```

screamdw avatar Mar 14 '19 07:03 screamdw
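The stack trace above points at `caffe::CuDNNConvolutionLayer<>::LayerSetUp()`: the crash happens while the very first convolution (L0) is being set up, before any training step runs, which usually suggests a CUDA/cuDNN version incompatibility with this old fork rather than a problem in the prototxt. A small hypothetical helper for triaging such logs — it finds the last layer Caffe managed to create before aborting:

```python
import re

def last_layer(log_text):
    """Return the name from the last 'Creating Layer <name>' entry
    in a Caffe glog dump, or None if there is none."""
    names = re.findall(r"Creating Layer (\S+)", log_text)
    return names[-1] if names else None
```

On the output.log above this returns `"L0"`, consistent with the SIGSEGV inside the cuDNN convolution setup.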

@screamdw Same problem here. Did you solve it?

ycdhqzhiai avatar Aug 02 '19 06:08 ycdhqzhiai

@SeokjuLee @screamdw @ycdhqzhiai same problem.

peterlee909 avatar Nov 01 '19 03:11 peterlee909