TypeError: add_shape()

Open elohimarthur opened this issue 2 years ago • 2 comments

Hi John @jaybdub, thank you for your reply about torch.flip. I continued my research, but I ran into another error:

Warning: Encountered known unsupported method torch.is_grad_enabled
Warning: Encountered known unsupported method torch.is_grad_enabled
Warning: Encountered known unsupported method torch.get_default_dtype
Warning: Encountered known unsupported method torch.get_default_dtype
Warning: Encountered known unsupported method torch.Tensor.numel
Warning: Encountered known unsupported method torch.Tensor.numel
Warning: Encountered known unsupported method torch.Tensor.has_names
Warning: Encountered known unsupported method torch.Tensor.numel
Warning: Encountered known unsupported method torch.Tensor.is_neg
Warning: Encountered known unsupported method torch.Tensor.unbind
Warning: Encountered known unsupported method torch.Tensor.__iter__
Traceback (most recent call last):
  File "/home/data1/elohim/code/Tensorrt/benchmark.py", line 50, in <module>
    model_trt = torch2trt(model, [batch])
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch2trt-0.4.0-py3.8-linux-x86_64.egg/torch2trt/torch2trt.py", line 778, in torch2trt
    outputs = module(*inputs)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/home/data1/elohim/code/Tensorrt/model.py", line 153, in forward
    x = self.encoder(x)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/container.py", line 139, in forward
    input = module(input)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/home/data1/elohim/code/Tensorrt/model.py", line 97, in forward
    y, (hn,cn) = self.lstm(x)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1148, in _call_impl
    result = forward_call(*input, **kwargs)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/rnn.py", line 767, in forward
    self.check_forward_args(input, hx, batch_sizes)
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/rnn.py", line 693, in check_forward_args
    self.check_hidden_size(hidden[0], self.get_expected_hidden_size(input, batch_sizes),
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch/nn/modules/rnn.py", line 225, in check_hidden_size
    if hx.size() != expected_hidden_size:
  File "/home/data1/elohim/software/miniconda3/envs/trt/lib/python3.8/site-packages/torch2trt-0.4.0-py3.8-linux-x86_64.egg/torch2trt/torch2trt.py", line 1053, in _size_wrapper
    shape_trt = ctx.network._network.add_shape(input._trt).get_output(0)
TypeError: add_shape(): incompatible function arguments. The following argument types are supported:
    1. (self: tensorrt.tensorrt.INetworkDefinition, input: tensorrt.tensorrt.ITensor) -> tensorrt.tensorrt.IShapeLayer

Invoked with: <tensorrt.tensorrt.INetworkDefinition object at 0x7f7b06b2a270>, <repr raised Error>

elohimarthur · Dec 05 '22 02:12

And here is my LSTM code. I'd like to know what in my model is preventing the conversion:

import torch
from torch.nn import LSTM, Module
from torch.nn.init import orthogonal_


class LSTM_Wrapper(Module):
    def __init__(self, in_size, out_size, reverse = False):
        super().__init__()
        self.reverse = reverse
        self.hidden_size = out_size
        self.lstm = LSTM(in_size, out_size, batch_first=True)
        # self.lstm = LSTM(in_size, out_size)
        self.init_weight()
        self.disable_state_bias()

    def forward(self, x):
        if self.reverse:
            # x = x.flip(0)
            x = torch.flip(x,[0])
            y, (hn,cn) = self.lstm(x)
            # y = y.flip(0)
            y = torch.flip(y,[0])
        else:
            y, (hn,cn) = self.lstm(x)
        return y

    def init_weight(self):
        for name, param in self.lstm.named_parameters():
            if 'bias_ih' in name:
                with torch.no_grad():
                    param.set_(0.5*truncated_normal(param.shape, dtype=param.dtype, device=param.device))
            if 'weight' in name:
                for i in range(0, param.size(0), self.hidden_size):
                    orthogonal_(param[i:i+self.hidden_size])

    def disable_state_bias(self):
        for name, param in self.lstm.named_parameters():
            if 'bias_hh' in name:
                param.requires_grad = False
                param.zero_()


def truncated_normal(size, dtype=torch.float32, device=None, num_resample=5):
    # Draw num_resample candidates per element and keep the first one that
    # falls inside (-2, 2), clamping as a fallback.
    x = torch.empty(size + (num_resample,), dtype=dtype, device=device).normal_()
    i = ((x < 2) & (x > -2)).max(-1, keepdim=True)[1]
    return torch.clamp_(x.gather(-1, i).squeeze(-1), -2, 2)
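
To isolate the failure, here is a minimal sketch (the input/hidden sizes and shapes are placeholders, and it assumes a CUDA device with torch2trt installed) that pushes a bare nn.LSTM through torch2trt. If the LSTM layer itself is what torch2trt cannot trace, this should raise the same add_shape() TypeError:

import torch
from torch import nn
from torch2trt import torch2trt

# Bare LSTM with the same batch_first layout as LSTM_Wrapper above
lstm = nn.LSTM(64, 128, batch_first=True).eval().cuda()
x = torch.randn(1, 16, 64).cuda()

# Expected to fail inside check_forward_args -> hx.size() -> add_shape()
lstm_trt = torch2trt(lstm, [x])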

elohimarthur · Dec 05 '22 02:12

I've also had the same problem trying to convert the easyocr recognizer. It seems torch2trt doesn't support LSTM out of the box. I haven't tried it yet, but this issue could be a good starting point: https://github.com/NVIDIA-AI-IOT/torch2trt/issues/144.
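
One alternative path that may be worth trying for recurrent layers is exporting through ONNX and building the engine with TensorRT's ONNX parser, which does handle LSTM. A rough sketch, assuming the `model` and example `batch` from the benchmark.py call in the traceback above:

import torch

# Trace the PyTorch model to ONNX instead of converting with torch2trt
torch.onnx.export(
    model,                 # module containing the LSTM
    batch,                 # example input used for tracing
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=13,
)

# The ONNX file can then be built into an engine, e.g.:
#   trtexec --onnx=model.onnx --saveEngine=model.plan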

light42 · Mar 08 '23 03:03