
### 📚 The doc issue

Currently, ONNX inference takes 3–4 seconds per image. How can we accelerate it? (We plan to convert the .onnx model to the .om format, aiming for 0.5 seconds per image.)
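For the planned .onnx → .om conversion, Huawei's CANN toolkit provides the `atc` (Ascend Tensor Compiler) command-line tool. A minimal sketch is below; the model filename, output name, and SoC version are placeholders you would replace with your own (check your device with `npu-smi info`), and the exact flags available depend on your installed CANN version:

```shell
# Sketch: compile an ONNX model to an offline .om model for an Ascend NPU.
# --framework=5 selects ONNX as the input framework.
# --soc_version must match your target chip (e.g. Ascend310, Ascend310P3).
atc --model=model.onnx \
    --framework=5 \
    --output=model \
    --soc_version=Ascend310
```

The resulting `model.om` can then be loaded through the ACL/AscendCL runtime (or a wrapper such as `ais_bench`) instead of ONNX Runtime. Measure end-to-end latency on the device itself, since preprocessing and host-to-device copies can dominate once the model itself is fast.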
