tensorrt
TensorFlow/TensorRT integration
What is the meaning of [lower_control_flow](https://github.com/tensorflow/tensorflow/blob/765ac8d16eff6d6ff997ee73809b402d8b1194ae/tensorflow/python/framework/convert_to_constants.py#L380) in TensorRT? We're seeing different issues when setting lower_control_flow to True or False.
I'm wondering whether TensorRT supports while loops under TensorFlow 2.4. We want to convert a transformer-based model to TensorRT, but we're having issues with the while loop in the transformer...
Hi, the SSD postprocessor combines a **while loop and TensorArray**. How does TensorRT deal with control-flow ops (Enter, Merge, Switch, etc.), and how is memory allocated for a TensorArray object? These ops are confusing...
Error message:
```
Traceback (most recent call last):
  File "run_webcam.py", line 8, in <module>
    from tf_pose.estimator import TfPoseEstimator
  File "C:\Users\james\Desktop\pose estimation\tf-pose-estimation\tf_pose\__init__.py", line 5, in <module>
    from tf_pose.runner import infer, Estimator, get_estimator
  File "C:\Users\james\Desktop\pose...
```
We tested it following the blog [Leveraging TensorFlow-TensorRT integration for Low latency Inference](https://blog.tensorflow.org/2021/01/leveraging-tensorflow-tensorrt-integration.html) and got **a very large saved model**.

**ENV**
- TensorFlow: 2.4.1
- TensorRT: 6.0.1
- CUDA: 10.1...
Hi, I've used the following code to convert:
```
# Convert SavedModel using TF-TRT
def convert_model_to_trt():
    params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
        precision_mode=trt.TrtPrecisionMode.FP16,
        is_dynamic_op=True)  # is_dynamic_op=True
    converter = trt.TrtGraphConverterV2(
        input_saved_model_dir="./tmp",
        conversion_params=params)
    converter.convert()
    converter.save("./tmp/yolov3_trt")
```
...
Hi, I'm able to convert the model to FP32 TensorRT as well as FP16 TensorRT as long as my batch size is set to 1. But as soon as I...
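A common cause of batch-size problems with TF-TRT is that engines are only built for the input shapes seen at conversion time, and other shapes fall back to native TensorFlow. A sketch of pre-building engines for several batch sizes via an input function; the 416x416x3 input shape and batch sizes are assumptions, and the converter calls themselves are shown as comments because they need a GPU TensorRT build:

```python
import numpy as np

# Hypothetical batch sizes we plan to serve; TF-TRT's converter.build
# creates one engine profile per distinct shape yielded here.
BATCH_SIZES = (1, 4, 8)

def input_fn():
    # Each yielded tuple is one set of dummy inputs for engine building.
    for bs in BATCH_SIZES:
        yield (np.zeros((bs, 416, 416, 3), dtype=np.float32),)

# Sketch of how input_fn plugs into the TF 2.x converter:
# converter = trt.TrtGraphConverterV2(input_saved_model_dir="./tmp")
# converter.convert()
# converter.build(input_fn=input_fn)  # pre-builds engines per batch size
# converter.save("./tmp_trt")
```

Without a `build` step covering the larger batch sizes, the first inference at an unseen shape triggers engine building at runtime (or falls back entirely).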
Why doesn't object detection in TF 2.0 include the force_nms_cpu optimization that was included in the r1.14 branch?
Branch: r1.14

Hi all, I am using the script `https://github.com/tensorflow/tensorrt/tree/r1.14%2B/tftrt/examples/object_detection` to optimize the model faster_rcnn_inception_v2. Although the test completed successfully, I got the warning below:
```
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow_core/python/profiler/internal/flops_registry.py:142: tensor_shape_from_node_def_name...
```
I used TF-TRT to optimize the frozen inference graph of a Faster R-CNN trained model, but there is no improvement in inference speed. Is TensorRT supported with Faster R-CNN? Because in recent...