Problem with larger models
I'm trying to load as large a model as possible through the load_flash API call. However, I've noticed that with a Darknet modification of tiny YOLO, which goes through nncase fine (beta2, beta3, and beta4 all work) and produces a 4.4 MB model, I can't get inference to run. Loading from flash returns no error in sipeed_kpu_err_t, but the return from inference is None.

I was also playing with the C calls on the firmware side, and I've noticed that many of the functions (getting the model size, getting layer info, etc.) seem to work with nncase v1 RC5 models but not with v2 models. I was also wondering whether it is possible to get the source code of sipeed_kpu_model_load_flash() to see where the bug is taking place.
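For reference, a minimal sketch of the MaixPy-side flow I'm describing (the flash offset, thresholds, and anchors below are placeholders, not my exact values):

```python
import sensor
import KPU as kpu

# Placeholder flash offset where the kmodel was burned; adjust to your layout
MODEL_FLASH_ADDR = 0x300000

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.run(1)

# Load the model straight from flash (this goes through the load_flash path)
task = kpu.load(MODEL_FLASH_ADDR)
# kpu.netinfo(task)  # layer info -- works for me with v1 RC5 models, not v2

# Placeholder tiny-YOLO anchors and thresholds
anchors = (1.08, 1.19, 3.42, 4.41, 6.63, 11.38, 9.42, 5.11, 16.62, 10.52)
kpu.init_yolo2(task, 0.5, 0.3, 5, anchors)

img = sensor.snapshot()
objects = kpu.run_yolo2(task, img)
print(objects)   # prints None with the 4.4 MB model

kpu.deinit(task)
```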
I think it is related to Darkflow, due to a corrupted .ckpt to .pb file conversion.
P.S. It is definitely a flaw in Darkflow; Darknet works fine.
This part is probably not open source, but that does not prevent you from using the NNCase interface directly.
If you can point out which API has the bug, we can fix it.