Anakin
Wrong shape of input tensor.
Thanks for the great library. I want to run my model on ARM, and the model was converted successfully. I followed the ARM inference example classification.cpp. The model loads successfully and graph.get_outs() returns OK, but d_tensor_in_p->valid_shape() returns all zeros. I can see the correct shapes on the web model dashboard.
Net<ARM, AK_FLOAT, Precision::FP32, OpRunType::SYNC> net_executer(graph, ctx1, true);
//! get in
LOG(INFO) << "get input";
auto d_tensor_in_p = net_executer.get_in("input_0");
auto valid_shape_in = d_tensor_in_p->valid_shape();
for (int i = 0; i < valid_shape_in.size(); i++) {
    LOG(INFO) << "detect input dims[" << i << "] " << valid_shape_in[i];
}
I've used exactly the same code as classification.cpp. Any suggestions would be appreciated.
Could you please paste the log from creating the net? It should look like the following:
0| 13:21:58.00045| 0.184s| 10| classification.cpp:115] create net to execute
WAN| 13:21:58.00045| 0.184s| 10| net.cpp:643] Detect and initial 1 lanes.
WAN| 13:21:58.00045| 0.184s| 10| net.cpp:645] Current used device id : 0
WAN| 13:21:58.00046| 0.184s| 10| input.cpp:16] Parsing Input op parameter.
0| 13:21:58.00046| 0.184s| 10| input.cpp:19] |-- shape [0]: 1
0| 13:21:58.00046| 0.184s| 10| input.cpp:19] |-- shape [1]: 3
0| 13:21:58.00046| 0.184s| 10| input.cpp:19] |-- shape [2]: 224
0| 13:21:58.00046| 0.184s| 10| input.cpp:19] |-- shape [3]: 224