mkulariya

Results: 3 comments by mkulariya

Facing the same issue: I supply max_batch_size=64 at the time of model conversion from PyTorch to TRT and use batch_size=4 at the time of inference. It used to work fine some time...

@chaoz-dev 1. No, at the time of model conversion I am supplying a tensor of batch size 1: `trt_model = torch2trt(model, [dummy_input], max_batch_size=64)`. Here the shape of the dummy input is 1,...
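A minimal sketch of the conversion pattern described in the comment above: convert with a batch-size-1 dummy input while passing `max_batch_size=64` (a real `torch2trt` keyword argument), then run inference with a larger batch. The `resnet18` model and input shape are stand-ins, not from the original thread; the sketch requires a CUDA GPU with `torch2trt` installed, so it falls back gracefully when they are unavailable.

```python
status = "skipped"  # becomes "ok" only if conversion and inference succeed
try:
    import torch
    import torchvision
    from torch2trt import torch2trt

    # Stand-in model; the original comment does not name the model used.
    model = torchvision.models.resnet18().cuda().eval()

    # Dummy input with batch size 1, as in the comment; max_batch_size
    # tells TensorRT to build an engine that accepts batches up to 64.
    dummy_input = torch.ones((1, 3, 224, 224)).cuda()
    trt_model = torch2trt(model, [dummy_input], max_batch_size=64)

    # Inference with batch_size=4 (<= max_batch_size), as in the comment.
    batch = torch.ones((4, 3, 224, 224)).cuda()
    out = trt_model(batch)
    status = "ok" if out.shape[0] == 4 else "unexpected output shape"
except Exception:
    # torch2trt / CUDA not available in this environment.
    pass
print(status)
```

The point of `max_batch_size` is that the engine's maximum batch dimension is fixed at build time, independent of the dummy input's batch size; inference batches may be smaller than or equal to that maximum.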

@chaoz-dev I meant to say that I am using the old version temporarily for conversion, since the current version is not working. Are you suggesting that the size of the input should be the...