Apurba Bose
@Fulitcher I see this error. Please note that I am skipping the lines
```
checkpoint = torch.load(model_path)
# remove key name string of nn.DataParallel
new_state_dict = {}
for k, v in...
```
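For reference, a minimal sketch of the usual nn.DataParallel key-stripping pattern those lines appear to implement; `model_path` and `model` are placeholders here, and the flat checkpoint layout is an assumption (some checkpoints nest the weights under a `"state_dict"` key):
```python
import torch

# minimal sketch; assumes the checkpoint was saved from an nn.DataParallel
# wrapper, so every key carries a "module." prefix that must be stripped
checkpoint = torch.load(model_path)  # model_path is a placeholder
new_state_dict = {}
for k, v in checkpoint.items():
    new_state_dict[k.removeprefix("module.")] = v  # Python 3.9+
model.load_state_dict(new_state_dict)  # model is the unwrapped module
```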
Ok, got it. Looks like I am getting a different error, where the number of outputs differs from the number of output dtypes. Need to look into this further. I commented out...
So when I run your model, I see that there is a mismatch between the number of outputs and the dtypes, since there are SymInts and SymInt-dependent ops which are...
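A minimal sketch of the kind of mismatch described above, assuming a graph that returns SymInts alongside Tensors (the values here are illustrative, not from the model in question):
```python
import torch

# a graph that returns SymInts alongside Tensors has more outputs than
# tensor dtypes, which a converter must account for when aligning the two
outputs = [torch.zeros(2), 5, torch.ones(3)]  # Tensor, SymInt-like int, Tensor
tensor_dtypes = [o.dtype for o in outputs if isinstance(o, torch.Tensor)]
print(len(outputs), len(tensor_dtypes))  # 3 vs 2: the mismatch in question
```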
Ok, you have mentioned it above. Let me try with the above versions.
I could repro with the above versions. Working on a fix.
This PR https://github.com/pytorch/TensorRT/pull/3513/files should address the above. Could you try with this and the latest torchTRT release?
I see a lint error here, `would reformat /home/runner/work/TensorRT/TensorRT/py/torch_tensorrt/dynamo/lowering/passes/constant_folding.py`, which is unrelated to my change. Not sure why it is failing here.
Reproed. Looking into this.
Seems like the cat converter is receiving an empty tensor. In the above code, `sparse_embeddings` is
```
sparse_embeddings = torch.empty(
    (1, 0, self.embed_dim), device=self._get_device()
)
```
which results in...
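A minimal repro sketch of that shape in isolation, with an illustrative `embed_dim` and plain CPU tensors (eager PyTorch handles the zero-length dim fine; the converter is what trips on it):
```python
import torch

embed_dim = 4  # illustrative value, not from the actual model
sparse_embeddings = torch.empty((1, 0, embed_dim))  # dim 1 has zero length
dense = torch.randn((1, 3, embed_dim))
# eager torch.cat accepts the empty tensor along the concat dim,
# but a converter that assumes non-empty inputs can fail on (1, 0, embed_dim)
out = torch.cat([sparse_embeddings, dense], dim=1)
print(out.shape)  # torch.Size([1, 3, 4])
```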
@edition3234 what is the value you are passing? The error seems to be a mismatch between the input dimensions and the reshape dimensions you want. Could you please provide a simple...
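For context, a minimal example of the class of error in question, with made-up shapes (the actual values depend on what is being passed):
```python
import torch

x = torch.randn(2, 3)  # 6 elements total
x.reshape(4, 2)        # raises: shape '[4, 2]' is invalid for input of size 6
```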