Charrin

Results: 12 comments by Charrin

offset has a default value; you can see it in caffe.proto in the Caffe source.
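For reference, defaults in caffe.proto are declared inline on the field itself. A purely illustrative fragment of that syntax (the message and field names below are generic placeholders, not the actual layer definition — check caffe.proto for the real offset field):

```protobuf
// Illustrative proto2 shape only, not copied from caffe.proto:
// the [default = ...] option is what supplies the value when the
// field is omitted from the prototxt.
message ExampleParameter {
  optional uint32 offset = 1 [default = 0];
}
```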

> The inference cost time is more than 200ms in my device... while I use 4 threads to match the core num... (network input size is 640*480)
> However, ...

If you use Caffe, replace the cube with "caffe::Blob".

Refer to the examples for the MXNet C++ API.

@wudrans This isn't something I submitted. You can check the model structure in the param file or the size of the bin file; I think it should be mnet...

> Hi! Could you provide onnx model for R50?

Sorry, I didn't use ONNX to convert the model.

> inference time 11.890137
> inference time 77.138916
> inference time 14.939941
> inference time 153.685059
> inference time 11.651123
> inference time 11.635986
> inference time 18.880127
> inference...

> Hi Charrin:
>
> ```
> RokidCNN is so fast compared to NCNN, could you introduce it briefly? many thanks.
> ```

@ForestWang The source code will be released...

> Hello!
> I ran into some problems when converting the model. Could you send me the ONNX model you used in the conversion?
> Or, if convenient, send it to my email [[email protected]](mailto:[email protected]). Thanks.

Hello, I didn't use ONNX for the conversion.

uniform_data_layer is modified from image_data_layer; you can use vimdiff to show the differences between them. I don't think the OpenCV version has any impact on it. If you find some...