Practical_NLP_in_PyTorch

the shape should be (batch_size, seq_len, embedding_dimensions)

Open · JBoRu opened this issue on Oct 11, 2019 · 0 comments

Hello, I want to ask why the input word embedding shape is (seq_len, batch_size, emb_dim)? I think it should be (batch_size, seq_len, emb_dim). The batch_size represents the number of sentences, and seq_len represents the length of each sentence.
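For context (not from the original issue text), a minimal sketch of why both conventions show up: PyTorch recurrent layers such as `nn.LSTM` default to sequence-first input, i.e. (seq_len, batch_size, emb_dim), and only accept (batch_size, seq_len, emb_dim) when constructed with `batch_first=True`. The tensor sizes and layer below are illustrative, not taken from the repository's code.

```python
import torch
import torch.nn as nn

batch_size, seq_len, emb_dim, hidden_dim = 4, 10, 32, 64  # illustrative sizes

# Default convention: input is (seq_len, batch_size, emb_dim).
lstm = nn.LSTM(input_size=emb_dim, hidden_size=hidden_dim)
x = torch.randn(seq_len, batch_size, emb_dim)
output, (h_n, c_n) = lstm(x)
print(output.shape)     # torch.Size([10, 4, 64]) -> (seq_len, batch_size, hidden_dim)

# With batch_first=True, (batch_size, seq_len, emb_dim) is expected instead.
lstm_bf = nn.LSTM(input_size=emb_dim, hidden_size=hidden_dim, batch_first=True)
x_bf = torch.randn(batch_size, seq_len, emb_dim)
output_bf, _ = lstm_bf(x_bf)
print(output_bf.shape)  # torch.Size([4, 10, 64]) -> (batch_size, seq_len, hidden_dim)
```

So either shape can be correct; it depends on whether the model was built with `batch_first=True`.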
