
🐛 [Bug] Missing value if a Non-Tensor value is used in a sub-block

Open bowang007 opened this issue 3 years ago • 0 comments

Bug Description

In this graph:

%1 : int[] = prim::ListConstruct(%w, %h)
prim::If(%0):
  block0():
    %3 : str = aten::format(%4, %5, %1)
  block1():
    ...

In the partitioning stage, prim::If is unsupported, so we run a BFS to check whether any of the values consumed by prim::If are non-Tensor. However, prim::If()->inputs() only contains %0 here, so the ListConstruct node is never marked for fallback. Since prim::ListConstruct is a supported op, it ends up in a TRT segment, and %1 then goes missing in the shape-analysis stage because of this line: https://github.com/pytorch/TensorRT/blob/d1768aa3d2c7d7d91d9f061e3e5dc5f976124dfe/core/partitioning/partitioning.cpp#L201 There are two possible fixes for this bug:

  1. Find all values used anywhere in prim::If; if any of them is non-Tensor and consumed inside a prim::If sub-block, the producing node should fall back.
  2. Compute the dependency nodes of the prim::If sub-blocks and insert copies of those nodes into the sub-blocks themselves.
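To illustrate option 1, here is a minimal sketch using a toy IR. The `Node` class and `collect_used_values` helper are hypothetical stand-ins, not the real `torch::jit::Node`/`Block` types or any Torch-TensorRT API; they only model why a check over `prim::If`'s direct inputs misses %1 while a recursive walk over its sub-blocks finds it:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """Hypothetical toy IR node: direct inputs plus optional sub-blocks."""
    kind: str
    inputs: list = field(default_factory=list)   # value names consumed directly
    outputs: list = field(default_factory=list)
    blocks: list = field(default_factory=list)   # sub-blocks, each a list of Nodes

def collect_used_values(node):
    """All value names consumed by `node`, including inside its sub-blocks."""
    used = set(node.inputs)
    for block in node.blocks:
        for inner in block:
            used |= collect_used_values(inner)   # recurse into nested control flow
    return used

# Rebuild the graph from the issue (%4/%5 are placeholder inputs to aten::format).
list_construct = Node("prim::ListConstruct", inputs=["%w", "%h"], outputs=["%1"])
fmt = Node("aten::format", inputs=["%4", "%5", "%1"], outputs=["%3"])
if_node = Node("prim::If", inputs=["%0"], blocks=[[fmt], []])

nontensor = {"%1"}  # %1 is an int[], i.e. a non-Tensor value

# Buggy check: only prim::If's direct inputs -> %1 not seen, no fallback.
direct_hits = nontensor & set(if_node.inputs)
# Fix (option 1): also walk values consumed in sub-blocks -> %1 found.
all_hits = nontensor & collect_used_values(if_node)
print(sorted(direct_hits), sorted(all_hits))  # prints: [] ['%1']
```

The recursive walk is the key difference: the existing BFS stops at the `prim::If` node boundary, while the sub-blocks are where %1 is actually consumed.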

bowang007 · Sep 06 '22 21:09