Zero Zeng
https://netron.app/
> @zerollzeng Is there a way to map the problematic operator in the ONNX graph back to the Torch model code? I tried to find the answer before but failed in the end :-( so...
> @zerollzeng I noticed there is a parameter max_workspace_size, which may be the largest batch size when exporting the model. What determines max_workspace_size? Will fp16 cause max_workspace_size to become...
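To clear up the misconception in the question above: max_workspace_size is the scratch GPU memory budget the builder may use while selecting tactics; it is unrelated to batch size, and enabling FP16 does not change the limit you set. A minimal sketch with trtexec, assuming the TensorRT 8.x flag names (the size is in MiB; newer releases replace `--workspace` with `--memPoolSize=workspace:...`):

```shell
# Give the builder up to 1 GiB of scratch memory (not related to batch size).
# "model.onnx" is a placeholder path; --workspace assumes TensorRT 8.x trtexec.
trtexec --onnx=model.onnx --workspace=1024 --fp16
```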
The error is raised here: https://github.com/onnx/onnx-tensorrt/blob/1da7332349d5b1196ccfa6dc719b839876f1e83e/onnx2trt_utils.cpp#L2265 It happens while parsing the ONNX model; you can check node 4622 in your ONNX model, or share it here so that I...
Do you use dynamic shapes? It looks like your model doesn't support dynamic shapes, or your input dimensions are invalid: ``` [E] 4: [shapeCompiler.cpp::evaluateShapeChecks::911] Error Code 4: Internal Error (kOPT values...
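For a dynamic-shape build, trtexec needs an optimization profile covering the full range of input dimensions. A sketch, assuming the input tensor is named `input` and the min/max sizes shown are illustrative (the ONNX must also have been exported with dynamic axes):

```shell
# Build with an optimization profile: any runtime shape must fall
# between --minShapes and --maxShapes; --optShapes is tuned for speed.
trtexec --onnx=end2end_new.onnx \
        --minShapes=input:1x3x320x320 \
        --optShapes=input:1x3x720x1296 \
        --maxShapes=input:1x3x1080x1920
```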
I can't reproduce your error on my side because your model contains your own plugin: ``` [08/22/2022-15:37:14] [I] [TRT] No importer registered for op: grid_sampler. Attempting to import as plugin....
My command using trtexec: ``` &&&& FAILED TensorRT.trtexec [TensorRT v8401] # trtexec --onnx=end2end_new.onnx --optShapes=input:1x3x720x1296 ```
@nvpohanh @kevinch-nv Is this the best practice?
JetPack 4.4 should already contain TensorRT 7.1.3, so why do you need to install it manually?
You modified our code, so it's your responsibility to make it work. I will first check whether there is a bug in https://github.com/NVIDIA/TensorRT/issues/2302.