DeepStream-Yolo
WARNING: [TRT]: Unknown embedded device detected.
Specs: NVIDIA Jetson Orin 64GB, DeepStream version: 6.2, DeepStream-Yolo: latest
When I convert a YOLOv7, YOLOv8, or YOLO-NAS ONNX file to an engine file on the NVIDIA Jetson Orin, I get the following warning printed constantly:
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
WARNING: [TRT]: Unknown embedded device detected. Using 59655MiB as the allocation cap for memory on embedded devices.
This is printed to the terminal continuously until the .engine file is created. Is this anything to be concerned about?
At the end I get the following messages:
WARNING: [TRT]: TensorRT encountered issues when converting weights between types and that could affect accuracy.
WARNING: [TRT]: If this is not the desired behavior, please modify the weights or retrain with regularization to adjust the magnitude of the weights.
WARNING: [TRT]: Check verbose logs for the list of affected weights.
WARNING: [TRT]: - 72 weights are affected by this issue: Detected subnormal FP16 values.
WARNING: [TRT]: - 11 weights are affected by this issue: Detected values less than smallest positive FP16 subnormal value and converted them to the FP16 minimum subnormalized value.
Building complete
Any concerns here?
Thanks
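For reference, the engine file here is built by TensorRT itself (DeepStream's nvinfer plugin triggers the build at pipeline startup), so the warnings above come from the TensorRT builder rather than from DeepStream-Yolo. Below is a minimal standalone sketch of the same ONNX-to-engine conversion using the TensorRT Python API; model.onnx and model.engine are placeholder names, and this is only an illustration of the step that produces these messages, not the actual DeepStream-Yolo build path:

import tensorrt as trt

# Standalone ONNX -> TensorRT engine build (sketch; paths are placeholders).
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("ONNX parse failed")

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.FP16)  # FP16 build, like network-mode=2 in the nvinfer config

# The "Unknown embedded device" and FP16-subnormal warnings are emitted
# by the TensorRT builder during this call.
engine_bytes = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(engine_bytes)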
When I convert a YOLOv7, YOLOv8, or YOLO-NAS ONNX file to an engine file on the NVIDIA Jetson Orin, I get the following warning printed constantly:
This is probably an issue on NVIDIA's side with the JetPack components (mainly TensorRT for the Orin series).
At the end I get the following messages:
These are normal warnings; you can ignore them.
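To add a bit of detail: the FP16 messages only mean that some weights are too small to be represented accurately in FP16 (the smallest FP16 normal value is 2^-14, about 6.1e-5, and the smallest subnormal is 2^-24, about 6e-8), so they become subnormal or get clamped, which can slightly reduce accuracy. If that is a concern, one option is to build the engine in FP32 instead. A minimal sketch, assuming the standard DeepStream-Yolo config_infer_primary.txt layout:

[property]
# 0=FP32, 1=INT8, 2=FP16
network-mode=0

The trade-off is a larger engine and slower inference than FP16.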
How long did it take for your Orin to generate the .engine file? I got exactly the same behavior... it's been printing forever...
I am using the Jetson Orin 64GB and it took about 15-20 minutes. If you are using something smaller (e.g. the Nano or the 32GB model), it could take an hour or so.
It takes a very long time; it's a TensorRT issue.
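One thing that at least avoids repeating the long build: once the .engine file has been generated, point model-engine-file in the nvinfer config at it, so subsequent runs deserialize the cached engine instead of rebuilding it from the ONNX. A minimal sketch, assuming the usual config_infer_primary.txt; the file names below are examples, so adjust them to your setup:

[property]
onnx-file=model.onnx
# If this file already exists and matches the settings above,
# nvinfer loads it instead of rebuilding the engine at startup.
model-engine-file=model_b1_gpu0_fp16.engine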
Same problem with another model. It has been running for an hour now and I don't know when it will end.
When I build on the Jetson Orin 32GB, it takes ~10 minutes to build the engine file for a YOLOv7-small model. What size model are you using?