
Support PyTorch embedding?

Open · kangkang59812 opened this issue 6 years ago · 6 comments

I get an error in torchsummary when the model contains an embedding layer.

kangkang59812 · May 20 '19 02:05

The RNN example includes an embedding, so this package works for normal embeddings.

If you found an error, perhaps you can expand on what it was and how to replicate it?

wassname · Jun 06 '19 00:06

@wassname

from torch import nn
from torchsummary import summary
embedding = nn.Embedding(10, 3)
summary(embedding.cuda(), (2, 4))  # torchsummary generates a random float input from this shape

RuntimeError: Expected tensor for argument #1 'indices' to have scalar type Long; but got torch.cuda.FloatTensor instead (while checking arguments for embedding)

nn.Embedding in PyTorch needs Long (int64) indices, but summary feeds it a float tensor.

kangkang59812 · Jun 06 '19 02:06

Hi, sorry, I've been very busy recently. I'll look at how to handle this today or tomorrow and let you know once I find a way. Thanks!

nmhkahn · Jun 06 '19 02:06

@nmhkahn no worries, I thought I'd help you handle it.

@kangkang59812 I think it's because you need to provide a tensor, not a shape (unlike torchsummary). So try:

import torch
from torch import nn
from torchsummaryX import summary
embedding = nn.Embedding(10, 3)
summary(embedding.cuda(), torch.zeros((2, 4)).cuda())

And it should work.

wassname · Jun 06 '19 03:06

Got this error when I ran the above piece of code:

TypeError: rand(): argument 'size' must be tuple of ints, but found element of type Tensor at pos 2

sagjounkani · Nov 17 '19 18:11

That error looks like it came from the old torchsummary API, which builds a random input from a shape tuple. Please try this code on the latest version of torchsummaryX:

import torch
from torch import nn
from torchsummaryX import summary
embedding = nn.Sequential(nn.Embedding(10, 3))
summary(embedding, torch.zeros((2, 4)).long())  # .long() gives the int64 indices nn.Embedding expects

If that doesn't work, please post the full error stack and what versions of torch and torchsummaryX you are using, so we can replicate it.

wassname · Nov 17 '19 23:11