TensorRT
❓ [Question] Is there support for optional arguments in a model's `forward()`?
❓ Question
Is there support for optional arguments in a model's `forward()`? For example, I have the following signature: `def forward(self, x, y: Optional[Tensor] = None)`, where `y` is an optional tensor. The result is `x + y` if `y` is provided, otherwise just `x`.
What you have already tried
I added a second `torch_tensorrt.Input()` to the input spec, then at inference time got the error:
Expected dimension specifications for all input tensors, but found 1 input tensors and 2 dimension specs
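Roughly, the compile call looked like this (a sketch only; the shapes and dtypes below are placeholders, not my actual values):

```python
import torch
import torch_tensorrt

# scripted_model is the TorchScript module with the optional `y` argument
trt_model = torch_tensorrt.compile(
    scripted_model,
    inputs=[
        torch_tensorrt.Input(shape=[1, 16], dtype=torch.float32),  # spec for x
        torch_tensorrt.Input(shape=[1, 16], dtype=torch.float32),  # spec for y
    ],
)

# Calling trt_model(x) with only one tensor then fails with the error above.
```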
I then removed the `Optional` annotation and just passed in either `None` or the actual tensor for `y`. When `None` is passed in, I got the error: `RuntimeError: forward() Expected a value of type 'Tensor' for argument 'input_1' but instead found type 'NoneType'.`
I also tried passing in just one argument, for `x`, and got:
`RuntimeError: forward() is missing value for argument 'input_1'`
Environment
Build information about Torch-TensorRT can be found by turning on debug messages
- PyTorch Version (e.g., 1.0): 1.10.0+cu113
- CPU Architecture:
- OS (e.g., Linux): Ubuntu 18.04
- How you installed PyTorch (conda, pip, libtorch, source): pip
- Build command you used (if compiling from source):
- Are you using local sources or building from archives:
- Python version: 3.7.11
- CUDA version: 11.1
- GPU models and configuration: Tesla V100 with 32GB memory
- Any other relevant information:
Additional context
@lhai37 I don't think we support optional tensors at the moment. cc @narendasan. We expect the inputs and outputs of a module to be `torch::Tensor`s. Can you share what your TorchScript model looks like? I tried to convert the following to TS, but `torch.jit.script` fails on it:
```python
from typing import Optional

import torch

class Optional(torch.nn.Module):
    def __init__(self):
        super(Optional, self).__init__()

    def forward(self, x, y: Optional[torch.Tensor] = None):
        return x + y

model = Optional()
scripted_model = torch.jit.script(model)
```
@peri044 Your code fails because you are doing `x + None`. If you add an `if` statement to prevent that, it will work in TorchScript.
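A minimal sketch of the guarded version (the class name and tensor types here are illustrative):

```python
from typing import Optional

import torch

class GuardedAdd(torch.nn.Module):
    def forward(self, x: torch.Tensor, y: Optional[torch.Tensor] = None) -> torch.Tensor:
        # Guarding on y keeps TorchScript from ever evaluating `x + None`.
        if y is None:
            return x
        return x + y

scripted_model = torch.jit.script(GuardedAdd())  # scripts without error
```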
This issue has not seen activity for 90 days, Remove stale label or comment or this will be closed in 10 days
This is not currently supported and requires the next phase of the collections feature (#629). The issue is that we need to be able to generate the TorchScript code that manages the mapping from function inputs to TensorRT inputs when potentially any arbitrary input could be None.
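Until then, one possible interim workaround (a sketch, not an official recommendation; all names and shapes below are illustrative) is to make both inputs mandatory tensors and have the caller pass a neutral tensor, such as zeros, in place of None, so every function input maps to a real TensorRT input:

```python
import torch
import torch_tensorrt

class AlwaysTwoInputs(torch.nn.Module):
    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        return x + y

model = AlwaysTwoInputs().eval().cuda()
trt_model = torch_tensorrt.compile(
    model,
    inputs=[
        torch_tensorrt.Input(shape=[1, 16]),  # spec for x
        torch_tensorrt.Input(shape=[1, 16]),  # spec for y (always a real tensor)
    ],
)

x = torch.randn(1, 16, device="cuda")
y_placeholder = torch.zeros_like(x)  # stands in for the "y is None" case
out = trt_model(x, y_placeholder)    # equivalent to just returning x
```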
This issue has not seen activity for 90 days, Remove stale label or comment or this will be closed in 10 days