TensorRT
NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
## Description
What does "Reformatting CopyNode for Input Tensor" mean in trtexec's dump profile?

## Environment
**TensorRT Version**:
**NVIDIA GPU**:
**NVIDIA Driver Version**:
**CUDA Version**:
**CUDNN Version**:
**Operating System**:
**Python...
## Description
The /TensorRT/samples/python/efficientdet/python/create_onnx.py sample script fails in the update_shapes() function for a checkpoint created by fine-tuning an efficientdet-d4 pretrained model (there is no error when repeating the process with an efficientdet-d7 model). ``` Traceback (most...
## Description
Does anyone know about this issue? I used the two patches provided by the official NVIDIA website for CUDA 10.2, but it only works for converting a model from ONNX to...
## Description
I'm using the PyTorch quantization toolkit to quantize my model, which contains some Conv3d modules. The QAT procedure is OK, but when I use trtexec to convert the ONNX...
Hi, I am trying to convert an ONNX model with dynamic inputs to TensorRT format but encounter an error about "Could not find any implementation for node...". Can someone please...
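With dynamic inputs, "Could not find any implementation for node" often means the builder has no bounds for the dynamic dimensions; trtexec's documented shape flags supply an optimization profile at build time. A minimal sketch, assuming an input tensor named `input` and a model file `model.onnx` (both placeholders; use your network's actual tensor name and path):

```shell
# Build an engine for an ONNX model whose batch dimension is dynamic.
# min/opt/max bound the dynamic axis so tactic selection can succeed.
trtexec --onnx=model.onnx \
        --minShapes=input:1x3x224x224 \
        --optShapes=input:8x3x224x224 \
        --maxShapes=input:32x3x224x224 \
        --saveEngine=model.engine
```

If the error persists with shapes pinned, it may instead indicate insufficient workspace or a genuinely unsupported layer configuration.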
## Description
I tried building tkDNN on WSL2 with TensorRT 8.4 GA, cuDNN 8.4, and CUDA 11.7. The library and its tests compile without any issues, but when I...
## Description
The TF-TRT converter fails to save the model if it uses a lookup table. [This colab](https://colab.research.google.com/drive/1GvJy2rg8S5GJ37xK3Ek9XY0pyi-AfdyN) illustrates how I create a simple model that uses a Keras `IndexLookup`...
## Description
When inserting sleep() between inferences, the inference time becomes much longer.
``` c++
for (int i = 0; i < 10000; i++) {
    auto t1 = high_resolution_clock::now();
    context->executeV2(bindings);
    ...
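A plausible cause (not confirmed in the issue itself) is that the GPU drops to idle clocks while the host sleeps, so each post-sleep inference pays a clock ramp-up penalty; common mitigations are locking clocks (e.g. via `nvidia-smi --lock-gpu-clocks`) or excluding warm-up iterations from timing. A framework-agnostic timing-harness sketch — `benchmark` is a hypothetical helper, not a TensorRT API:

```python
import time

def benchmark(fn, warmup=10, iters=100):
    """Average per-call wall time of fn, discarding warm-up
    iterations so clock ramp-up does not skew the measurement."""
    for _ in range(warmup):
        fn()  # untimed warm-up calls bring clocks to steady state
    total = 0.0
    for _ in range(iters):
        t0 = time.perf_counter()
        fn()
        total += time.perf_counter() - t0
    return total / iters
```

In the issue's loop, `fn` would wrap `context->executeV2(bindings)`; measuring only after warm-up separates steady-state latency from the sleep-induced ramp-up.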
We have converted a Caffe model to a .engine file with the trtexec tool; how do we convert this engine file to a .plan file?
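TensorRT's documentation refers to a serialized engine as a "plan file", so `.engine` and `.plan` are naming conventions for the same byte stream; if that holds here, no conversion step exists, only a copy or rename. A minimal sketch (`engine_to_plan` is a hypothetical helper name):

```python
import shutil

def engine_to_plan(engine_path: str, plan_path: str) -> None:
    """Copy a serialized TensorRT engine to a .plan path.

    A serialized engine and a "plan file" are the same bytes,
    so no format conversion is performed here.
    """
    shutil.copyfile(engine_path, plan_path)
```

The resulting file deserializes exactly like the original, e.g. with the TensorRT runtime's `deserialize_cuda_engine`.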