Zero Zeng
@azhurkevich ^ ^
OneHot is not supported yet, but it could be implemented as a plugin.
@ttyio for viz
You can save the full verbose log with `trtexec --onnx=model.onnx --verbose > log.txt` (see https://askubuntu.com/questions/420981/how-do-i-save-terminal-output-to-a-file for redirecting terminal output to a file).
You can try constant folding the ONNX model: https://github.com/NVIDIA/TensorRT/tree/main/tools/Polygraphy/examples/cli/surgeon/02_folding_constants
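As a concrete starting point, the linked example folds constants with Polygraphy's `surgeon sanitize` subcommand. A minimal sketch (the file names `model.onnx` and `folded.onnx` are placeholders; assumes Polygraphy is installed, e.g. via `pip install polygraphy`):

```shell
# Fold constant subgraphs in the ONNX model so ops like Range get
# initializer inputs instead of computed tensors.
polygraphy surgeon sanitize model.onnx --fold-constants -o folded.onnx
```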
The reason it fails is simple: TRT expects the inputs of your Range_290 node to be initializers (constants), not runtime tensors. You can compare your static model with the dynamic model to see...
TRT 8.4 should be in JetPack 5.0, which will be released soon.
This is indeed a bug, I think upgrading to Jetpack 5.0 is the only option.
Reflash it? But I think you will hit this issue in 4.5 too, and there may not even be a JP4.5 release for this device.
Hi guys, we just released the [TensorRT_8.2.1.9_Patch_for_Jetpack4.6_Jetson_NX_16GB.tar.gz](https://developer.nvidia.com/tensorrt-8219-patch-jetpack46-jetson-nx-16gbtargz) for this issue; please see https://developer.nvidia.com/embedded/linux-tegra-r3272