Zero Zeng

Results 582 comments of Zero Zeng

https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#trtexec

I can't export the ONNX using your script. Can you share the ONNX model here?

```
$ python3 2125.py
Traceback (most recent call last):
  File "2125.py", line 8, in
    torch.onnx.export(model....
```

@YohannXu I still get an error with your export code:

```
import torch
import torch.nn as nn
from pytorch_quantization import quant_modules
import pytorch_quantization.nn as quant_nn

quant_modules.initialize()
model = nn.Conv3d(3, 16, kernel_size=(3, 3, 3), stride=(1, 1, 1),...
```

Looks like they are not modified.

I can't reproduce it in TRT 8.4. Can you try 8.4?

```
TensorRT-8.4.1.5/bin/trtexec --onnx=conv3d_quant.onnx --int8
```

I'm developing a tool that generates a TRT plugin from a simple config file, so users only need to focus on the operator's kernel implementation :)

After checking your model with [polygraphy](https://github.com/NVIDIA/TensorRT/tree/master/tools/Polygraphy), the outputs of Conv_161 and Conv_157 look problematic in TRT. I'll update here if I discover anything.
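A typical way to run this kind of per-layer comparison with Polygraphy is a sketch like the following; the model filename and tolerances are illustrative assumptions, not values from this thread:

```shell
# Run the same ONNX model through TensorRT and ONNX Runtime, mark every
# layer output for comparison, and flag mismatches beyond the tolerances.
polygraphy run model.onnx --trt --onnxrt \
    --trt-outputs mark all --onnx-outputs mark all \
    --atol 1e-3 --rtol 1e-3
```

Comparing all intermediate outputs (rather than only the final one) is what lets you localize the first divergent layer, e.g. a specific Conv node.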

The error message is indicative: delete the plan file and regenerate it on the correct device, e.g. by setting CUDA_VISIBLE_DEVICES.
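Concretely, the recovery might look like this; the file names and device index are assumptions for illustration:

```shell
# TRT engine plans are not portable across GPU models: delete the stale
# plan and rebuild it on the GPU that will actually run inference.
rm model.plan
CUDA_VISIBLE_DEVICES=0 trtexec --onnx=model.onnx --saveEngine=model.plan
```

Pinning CUDA_VISIBLE_DEVICES ensures the engine is serialized for the same GPU that will later deserialize it.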