
What is activation memory and why does the engine need to use so much of it?

frankxyy opened this issue on Sep 04 '22

[screenshot: memory usage of the generated engine, showing a large activation memory allocation]

As you can see in the picture, the engine generated by TensorRT uses a huge amount of activation memory. I wonder how I can decrease this usage.

The TensorRT engine generation command is:

trtexec --onnx=det.onnx  --workspace=2000 --explicitBatch --minShapes=x:1x3x96x96 --optShapes=x:2x3x2208x2208 --maxShapes=x:4x3x5184x5184 --shapes=x:1x3x2208x2208 --saveEngine=model.plan --fp16 --tacticSources=-CUDNN,-CUBLAS --memPoolSize=workspace:2000 --device=3
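
For reference, the activation requirement of the built engine can also be queried programmatically. Below is a minimal sketch using the TensorRT Python API, assuming the model.plan produced by the command above (the exact number will depend on the TensorRT version and GPU):

import tensorrt as trt

# Load the serialized engine built by trtexec.
logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)
with open("model.plan", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())

# device_memory_size reports the device scratch ("activation") memory, in bytes,
# that an execution context for this engine will need.
print("Activation memory: %.1f MiB" % (engine.device_memory_size / (1 << 20)))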

frankxyy avatar Sep 04 '22 13:09 frankxyy

@nvpohanh any suggestions here? I would suspect this is due to the model itself and the large input resolution.

zerollzeng avatar Sep 05 '22 08:09 zerollzeng

@frankxyy Could you try adding the --tacticSources=-EDGE_MASK_CONVOLUTIONS flag? Some tactics in TRT pre-compute the edge masks and save them in the engine.
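
Combined with the tactic sources already disabled in your command, that would look, for example, like:

trtexec --onnx=det.onnx --workspace=2000 --explicitBatch --minShapes=x:1x3x96x96 --optShapes=x:2x3x2208x2208 --maxShapes=x:4x3x5184x5184 --shapes=x:1x3x2208x2208 --saveEngine=model.plan --fp16 --tacticSources=-CUDNN,-CUBLAS,-EDGE_MASK_CONVOLUTIONS --memPoolSize=workspace:2000 --device=3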

nvpohanh avatar Dec 02 '22 09:12 nvpohanh

Closing since there has been no activity for more than 3 weeks. Please reopen if you still have questions, thanks!

ttyio avatar Jan 10 '23 02:01 ttyio