
🐛 [Bug] Error Code 2: OutOfMemory (no further information)

Open · wuhongsheng opened this issue 1 year ago · 1 comment

Bug Description

To Reproduce

Steps to reproduce the behavior:

import torch
import torch_tensorrt

# `model` and `device` are defined earlier in the script (not shown here)
input_data = torch.rand([1, 3, 1280, 720]).cuda(device)
print(type(input_data))

# input_data = input_data.to(device)

# Trace the module with example data
traced_model = torch.jit.trace(model, [input_data])
print("torch trace success")

trt_ts_module = torch_tensorrt.compile(
    traced_model, inputs=[input_data], ir="ts", enabled_precisions={torch.half})
print("torch compile success")

Expected behavior

Environment

Build information about Torch-TensorRT can be found by turning on debug messages.

  • Torch-TensorRT Version (e.g. 1.0.0): 2.0.1
  • PyTorch Version (e.g. 1.0): 1.4.0
  • CPU Architecture:
  • OS (e.g., Linux):
  • How you installed PyTorch (conda, pip, libtorch, source):
  • Build command you used (if compiling from source):
  • Are you using local sources or building from archives:
  • Python version: 3.8
  • CUDA version:
  • GPU models and configuration: RTX 3090
  • Any other relevant information:

Additional context

wuhongsheng · Mar 07 '24 09:03

This is model dependent; if you have a large model, you might run out of memory. You can try adjusting the workspace size to see if that helps.
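
For reference, a minimal sketch of capping the TensorRT build workspace, assuming the `workspace_size` keyword argument of `torch_tensorrt.compile` (value chosen here only as an example; tune it to your GPU):

import torch
import torch_tensorrt

# Sketch: limit TensorRT's scratch workspace during engine building.
# `traced_model` and `input_data` are the objects from the reproduction above.
trt_ts_module = torch_tensorrt.compile(
    traced_model,
    ir="ts",
    inputs=[input_data],
    enabled_precisions={torch.half},
    workspace_size=1 << 30,  # in bytes (1 GiB here); raise or lower based on available GPU memory
)

The value is specified in bytes, so 1 << 30 corresponds to 1 GiB.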

narendasan · Mar 19 '24 21:03