
Error on inference

Open foromer4 opened this issue 6 years ago • 3 comments

Hi. Running the script from_mxnet.py on a HiKey 970 board with arm5 and a Mali GPU, I get this error: `LLVM ERROR: Only small and large code models are allowed on AArch64`. If I disable LLVM, I get an error that it is disabled. Any suggestions? Thanks.

foromer4 avatar Oct 08 '18 13:10 foromer4

Can you trace the error with pdb to see which line of the code generates it? On the other hand, I would suggest building an independent TVM runtime on your host and using RPC to communicate with your HiKey 970 device, so that you only need to ensure the OpenCL backend is available on your device.
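The RPC split liangfu describes usually means running a lightweight TVM runtime plus an RPC server on the board, while the host does the heavy compilation and talks to the board over the network. A rough sketch of the two sides, expressed as the commands/calls each would run (the IP address and port here are placeholders, not values from this thread):

```python
# Hedged sketch of the TVM RPC workflow: the device runs only the TVM
# runtime and an RPC server; the host compiles and connects remotely.
# DEVICE_IP is a placeholder, 9090 is TVM's conventional RPC port.

DEVICE_IP = "192.168.1.50"  # placeholder: address of the HiKey 970
RPC_PORT = 9090             # conventional TVM RPC server port

# On the device: start the RPC server (needs only the TVM runtime built
# with OpenCL enabled, not the full compiler stack).
device_cmd = (
    f"python -m tvm.exec.rpc_server --host 0.0.0.0 --port {RPC_PORT}"
)

# On the host: connect, upload the cross-compiled library, and run.
host_snippet = f"remote = tvm.rpc.connect('{DEVICE_IP}', {RPC_PORT})"

print(device_cmd)
print(host_snippet)
```

On the measurement concern raised later in the thread: with RPC, the kernels themselves still execute on the board, so on-device timing (e.g. via TVM's time evaluator) reflects device performance; only compilation happens on the host.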

liangfu avatar Oct 08 '18 15:10 liangfu

Thanks for the quick response. The error is on this line, compiling the graph: https://github.com/liangfu/mxnet-mobilenet-v2/blob/master/from_mxnet.py#75 Regarding your suggestion: on my host machine (Ubuntu 16.04, Intel i7 CPU) the script also fails, because although I have OpenCL installed it can't find any OpenCL devices. Also, since I want to measure performance on the HiKey board, I am afraid that RPC might not give accurate results (?)
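For what it's worth, the `Only small and large code models are allowed on AArch64` error at the graph-compilation step is typically a sign that TVM is generating host-side code with a default (x86) target while the kernels target the board. The usual remedy is to pass an explicit AArch64 `target_host` alongside the device target. A hedged sketch, with typical target strings that are not taken from this repo (older TVM releases spelled the triple flag `-target=` rather than `-mtriple=`):

```python
# Hedged sketch: explicit target strings for a Mali-GPU AArch64 board.
# These are typical values, not confirmed settings for from_mxnet.py.

target = "opencl -device=mali"                    # device kernels for the Mali GPU
target_host = "llvm -mtriple=aarch64-linux-gnu"   # host-side code for the A-class cores

# The compile call would then look roughly like (API of that era):
#   graph, lib, params = nnvm.compiler.build(
#       net, target=target, target_host=target_host, params=params)

print(target, target_host)
```

Passing `target_host` keeps LLVM from emitting host stubs for the machine doing the compilation, which is what triggers the code-model error when cross-compiling for AArch64.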

foromer4 avatar Oct 09 '18 11:10 foromer4

I think the root cause of this issue is your OpenCL driver and compiling TVM correctly with OpenCL enabled; it is not really related to running MobileNet v2 inference. Therefore, I would recommend posting any TVM-related questions on http://discuss.tvm.ai .

liangfu avatar Oct 09 '18 16:10 liangfu