Vito Plantamura
This is the same error we were getting at the beginning. We are stuck. Can you try with the Dockerfile? If you open the Dockerfile, at the beginning, there are...
It will most likely work, but you need to install Docker first. Vito
hi, when you run the script, what error do you get? Vito
mmh, I tried to export the onnx file with transformers==4.46.3 and it gives this error: `please report a bug to PyTorch. We don't have an op for aten::view` In these...
yes, onnxsim_large_model must be used before converting from onnx to the OnnxStream compatible file format. Regarding --no-fp16, your architecture apparently does not support FP16 arithmetic/conversion instructions. Vito
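The ordering described above can be sketched as a small checklist (pure bookkeeping, no real conversion; the step names and their wording are illustrative, not the exact OnnxStream commands):

```python
# Preparation order for an OnnxStream-compatible model, as described above.
# The simplification step must come BEFORE the format conversion; on CPUs
# without FP16 instructions, the conversion is done with --no-fp16.
steps = [
    "export the model to ONNX",
    "simplify with onnxsim_large_model",
    "convert to the OnnxStream file format (add --no-fp16 if the CPU lacks FP16 support)",
]
for i, step in enumerate(steps, 1):
    print(f"{i}. {step}")
```

The point is only the relative order: running the converter on an unsimplified model is what fails.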
Thanks for reporting this. I'll test it with an RPi as soon as possible. Thanks, Vito

I just committed the fix. The problem was caused by one of the latest commits, as part of the XNNPACK version update. Thanks, Vito
hi @DavidBachmann , Your specific error should be caused by an outdated version of GCC. Can you try updating it? Vito
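As a quick sanity check (a generic sketch, not specific to this project; the minimum required GCC version is not stated here), the installed GCC version can be printed and compared against whatever the build requires:

```python
import shutil
import subprocess

# Locate gcc on PATH; if found, print its version banner so it can be
# compared against the compiler version the build requires.
gcc = shutil.which("gcc")
if gcc is None:
    print("gcc not found on PATH")
else:
    banner = subprocess.run([gcc, "--version"],
                            capture_output=True, text=True).stdout
    print(banner.splitlines()[0])
```

On Debian/Ubuntu, a newer compiler can typically be installed via the distribution's `gcc-<N>`/`g++-<N>` packages and selected when configuring the build.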
hi, your error seems to be related to this: https://github.com/google/XNNPACK/issues/1465 Let me know if you need further assistance, Vito
hi, where did you get the .exe file? Did you build it yourself? Vito