Daniel Li

132 comments by Daniel Li

@marcoslucianops I don't know the exact model version, but I downloaded it [here](https://docs.ultralytics.com/models/yolov8/#supported-tasks-and-modes). Export from .pt to ONNX:

```bash
$ yolo export model=yolov8s.pt format=onnx
$ yolo version
8.3.33
```

Attached...
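For reference, a minimal sketch of the same export via the Ultralytics Python API (assuming `yolov8s.pt` is in the working directory and the default export options are acceptable):

```python
# Export yolov8s.pt to ONNX with the Ultralytics Python API,
# equivalent to `yolo export model=yolov8s.pt format=onnx`.
from ultralytics import YOLO

model = YOLO("yolov8s.pt")               # load the pretrained checkpoint
onnx_path = model.export(format="onnx")  # writes yolov8s.onnx next to the .pt
print("exported:", onnx_path)
```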

@marcoslucianops The result is the same (no bounding boxes), but I got the right way to generate the ONNX file, thanks. PS: delete `model_b1_gpu0_fp32.engine` before running `deepstream -c deep*`. - [yolov8s.pt.onnx](https://drive.google.com/file/d/1PqJ0lYyLROLyWYDi-q5KzKUzITTFn_IP/view?usp=sharing)
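A small sketch of that clean-up step, assuming the engine file name from the comment and a hypothetical config path (the actual config name was truncated above):

```python
# Remove the cached TensorRT engine so DeepStream rebuilds it from the new ONNX,
# then launch deepstream-app. The config file name here is only an assumption.
import os
import subprocess

engine = "model_b1_gpu0_fp32.engine"
if os.path.exists(engine):
    os.remove(engine)

subprocess.run(["deepstream-app", "-c", "deepstream_app_config.txt"], check=True)
```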

@marcoslucianops We are trying to reproduce the performance mentioned in the following [link](https://docs.ultralytics.com/guides/deepstream-nvidia-jetson/), which claims to achieve [181 FPS with INT8 precision on Jetson Orin NX](https://github.com/ultralytics/ultralytics/issues/17640#issuecomment-2537927750). However, we are currently...

@marcoslucianops This PyTorch build is from the [NVIDIA binary release](https://developer.download.nvidia.cn/compute/redist/jp/v512/pytorch/torch-2.1.0a0+41361538.nv23.06-cp38-cp38-linux_aarch64.whl)

```
Python Environment: Python 3.8.10
GStreamer: YES (1.16.3)
NVIDIA CUDA: YES (ver 11.4, CUFFT CUBLAS FAST_MATH)
OpenCV version: 4.9.0
CUDA True
YOLO...
```
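A minimal sketch that reproduces this environment listing (assuming OpenCV and the NVIDIA PyTorch wheel are installed as shown):

```python
# Print the runtime versions referenced above: Python, OpenCV, PyTorch, and CUDA availability.
import sys

import cv2
import torch

print("Python:", sys.version.split()[0])
print("OpenCV:", cv2.__version__)
print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```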

> Can you try `v2.0.0` and `v1.14.0`?

No, on this board running JetPack 5.1.3/5.1.4 there is only one release version, [torch-2.1.0a0+41361538.nv23.06-cp38-cp38-linux_aarch64.whl](https://developer.download.nvidia.cn/compute/redist/jp/v512/pytorch/torch-2.1.0a0+41361538.nv23.06-cp38-cp38-linux_aarch64.whl), listed at [developer.download.nvidia.cn/compute/redist/jp/v512/pytorch/](https://developer.download.nvidia.cn/compute/redist/jp/v512/pytorch/). Old versions (links) might have been removed...

@marcoslucianops Do you mean it might be related to the PyTorch version?

- [NG] As I'm using 2.1 (from NVIDIA), I'm NOT sure what version @PaoXi was using.
- [OK] From...

> @marcoslucianops Do you mean it might be related to the PyTorch version?

Well, I don't have much time today. But ... ... **It's great! It works using the ONNX file exported below...

> By the way, could you share more details about how you exported this ONNX version? I'm curious if there are specific steps or tweaks you used that made it...

@sdhamodaran It might have something to do with onnxruntime. Take the onnxruntime version into account as well.
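A quick way to take the onnxruntime build into account; a sketch assuming onnxruntime (or onnxruntime-gpu) is installed:

```python
# Report the installed onnxruntime version and which execution providers it exposes.
# The GPU wheel on Jetson should list CUDAExecutionProvider.
import onnxruntime as ort

print("onnxruntime:", ort.__version__)
print("providers:", ort.get_available_providers())
```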

@sdhamodaran I use onnxruntime_gpu-1.17.0-cp38-cp38-linux_aarch64.whl, which comes from NVIDIA, on Jetson Orin. I have upgraded PyTorch to [pytorch-v2.5.1+l4t35.6-cp38-cp38-aarch64](https://github.com/SnapDragonfly/pytorch/releases/tag/v2.5.1%2Bl4t35.6-cp38-cp38-aarch64) on Jetson Orin L4T 35.6 / JetPack 5.1.4, and it still seems to have trouble...
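For debugging, a minimal smoke test of the exported model with onnxruntime; the file name `yolov8s.onnx` and the 640x640 input shape are assumptions based on the default YOLOv8 export:

```python
# Run one dummy inference through the exported ONNX model to confirm that
# onnxruntime + CUDAExecutionProvider actually work on this Jetson setup.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "yolov8s.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
inp = session.get_inputs()[0]
dummy = np.zeros((1, 3, 640, 640), dtype=np.float32)  # default YOLOv8 input size
outputs = session.run(None, {inp.name: dummy})
print("active providers:", session.get_providers())
print("output shape:", outputs[0].shape)
```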