ncnn-android-yolov7
onnx2ncnn.exe problem
Thank you for your great work! It is very useful. But when I use onnx2ncnn.exe from ncnn to convert yolov7.onnx into .bin and .param files, I get these errors:
Unsupported slice axes !
ScatterND not supported yet!
Unsupported slice axes !
Expand not supported yet!
ScatterND not supported yet!
Unsupported slice axes !
ScatterND not supported yet!
Unsupported slice axes !
Expand not supported yet!
ScatterND not supported yet!
Unsupported slice axes !
ScatterND not supported yet!
Unsupported slice axes !
Expand not supported yet!
ScatterND not supported yet!
The generated .bin and .param files do not work on my Android phone with your code, but the default .bin and .param files shipped with your code work fine.
I used the method you mentioned in https://github.com/xiang-wuu/ncnn-android-yolov7/issues/2 to convert yolov7.pt into yolov7.onnx.
Thank you for your answer!
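For reference, the conversion step that produces those errors looks roughly like this (a sketch of what I ran; the output names are just whatever I pass as the second and third arguments):
onnx2ncnn.exe yolov7.onnx yolov7.param yolov7.bin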
@kizoooh, did you solve this issue? I have the same problem.
You can use YOLOv7's export.py to export xx.torchscript.pt, and then use pnnx to convert it to ncnn xx.bin and xx.param.
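Roughly, the two steps look like this (a sketch; the exact export.py flags and the pnnx output names depend on your setup, and the file names below assume the stock yolov7.pt weights):

# step 1: export a TorchScript model with YOLOv7's export.py
python export.py --weights yolov7.pt --img-size 640 640

# step 2: convert the TorchScript model with pnnx (writes *.ncnn.param / *.ncnn.bin next to the input)
./pnnx yolov7.torchscript.pt inputshape=[1,3,640,640]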
Hi, I have not solved the issue. I used export.py to get a torchscript.pt and then pnnx, just like @chenzj-king says, but it still doesn't work.
Converting the .pt to TorchScript is fine, but I get this when I run pnnx.exe:
And this is my final .param file; it is a .param file, but I cannot use it on my Android phone.
This is a comparison of my .param (converted from yolov7_tiny.pt) and the author's yolov7_tiny.param:
Mine is on the left, and several lines differ. I think this is because of the "unknown parameter" warnings in the first picture, but I cannot solve it.
@chenzj-king Thank you so much for your reply. I tried your method but got stuck at:
Do you know how to solve it? Thank you again for your advice!
@kizoooh Hi, did you solve it? Same problem here 😭
Not yet 😭 Waiting for a reply.
Did you set the input shapes as arguments? I used commands like
./pnnx best.torchscript.pt inputshape=[1,3,640,640] inputshape2=[1,3,320,320]
and then
./onnx2ncnn best.onnx model.param model.bin
With the commands above, I could see the output bounding boxes successfully on my Android device. Thanks.
Thank you for the reply. I tried this but it didn't work. Maybe the problem is on my side, since a lot of people seem to have used this method successfully.
My pnnx didn't work with the command
./pnnx best.torchscript.pt inputshape=[1,3,640,640] inputshape2=[1,3,320,320]
Maybe I'm not exporting to TorchScript properly. Would you like to share your commands for exporting .pt to .torchscript.pt and to .onnx? Thanks a lot! 😃
Same problem! @mhyeonsoo Waiting for your .pt to .torchscript.pt command, thank you so much!
I tried exporting the weights with the command
python export.py --weights best.pt
and it works, but the detection results don't seem very good. I still don't understand why. Maybe you can give it a try 😄
I tried it and it works, even though I didn't change any command from my last try. I don't know why, but it finally works. Thank you so much!
I used the command below.
python export.py --weights best.pt --grid --end2end --simplify --topk-all 100 --iou-thres 0.65 --conf-thres 0.45 --img-size 640 640 --max-wh 640
The low detection performance might be caused by something else that is not related to this conversion issue. Thanks!
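Putting it together with the conversion step above, the full flow is roughly this (a sketch; it assumes export.py writes best.onnx next to best.pt and reuses the model.param / model.bin names from my earlier comment):

# export best.pt to ONNX with YOLOv7's export.py
python export.py --weights best.pt --grid --end2end --simplify --topk-all 100 --iou-thres 0.65 --conf-thres 0.45 --img-size 640 640 --max-wh 640

# convert the exported ONNX model to ncnn .param/.bin
./onnx2ncnn best.onnx model.param model.bin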
Another question: how can you use pnnx and onnx2ncnn at the same time? My pnnx can't output .onnx for onnx2ncnn 😢