007

Results: 10 comments of 007

This is the first version. You can do that.

Please paste more of the error information.

Thanks a lot! Regarding "of course except for the missing conv layers", I think it may be "binary layers". Thanks again.

Same question: does THUDM's fastertransformer support the 6B model?

1. No need. Set the "ReLU" layer as a PReLU layer like this: layer { name: "relu4" type: "PReLU" prelu_param { filler { type: "constant" value: 0.1 } } bottom: "conv4" top: "conv4" } 2. All my...
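
For readability, here is the same PReLU definition as a standalone prototxt snippet (content exactly as above):

```
layer {
  name: "relu4"
  type: "PReLU"
  bottom: "conv4"
  top: "conv4"
  prelu_param {
    filler {
      type: "constant"
      value: 0.1
    }
  }
}
```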

1. Use the code in "crop_celeba" to prepare your samples. 2. Use "tools/convert_darknet_model_2_caffe_model.cpp" to transform your data to lmdb.
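
Once the LMDB is built, a minimal sketch of a standard Caffe Data layer that reads it; the source path, batch size, and scale here are placeholders, not values from this repo:

```
layer {
  name: "data"
  type: "Data"
  top: "data"
  top: "label"
  include { phase: TRAIN }
  transform_param {
    # example pixel scaling; adjust to your own preprocessing
    scale: 0.00390625
  }
  data_param {
    source: "path/to/your_train_lmdb"  # placeholder path to the converted LMDB
    batch_size: 64
    backend: LMDB
  }
}
```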

Thanks. 1. It should be "datum_image" instead of "data_image" in line 168. 2. I will update some samples; my data-preparation code is coming soon. 3. I use the CelebA database to...

[L2 normalize layer](https://github.com/happynear/caffe-windows/blob/master/src/caffe/layers/normalize_layer.cpp)
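
A hypothetical usage sketch for that layer; the type string "Normalize" and the blob names are assumptions, so check the linked normalize_layer.cpp for the actual registered name and any parameters:

```
layer {
  name: "norm1"
  type: "Normalize"   # assumed type string; verify against normalize_layer.cpp
  bottom: "fc5"       # placeholder input blob
  top: "norm1"
}
```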

Sorry, I haven't tried this on Windows.