Practical_NLP_in_PyTorch
The shape should be (batch_size, seq_len, embedding_dimensions)
Hello, I want to ask why the input word embedding shape is (seq_len, batch_size, emb_dim). I think it should be (batch_size, seq_len, emb_dim): batch_size represents the number of sentences, and seq_len represents the length of each sentence.
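For context, PyTorch's recurrent layers (nn.RNN, nn.LSTM, nn.GRU) expect input shaped (seq_len, batch_size, input_size) by default, and only switch to (batch_size, seq_len, input_size) when constructed with batch_first=True. A minimal sketch illustrating both conventions (the tensor sizes here are made-up for illustration):

```python
import torch
import torch.nn as nn

emb_dim, hidden_dim = 32, 64   # illustrative sizes, not from the tutorial
batch_size, seq_len = 4, 10

# Default: input is expected as (seq_len, batch_size, emb_dim).
lstm_default = nn.LSTM(emb_dim, hidden_dim)

# With batch_first=True: input is expected as (batch_size, seq_len, emb_dim).
lstm_batch_first = nn.LSTM(emb_dim, hidden_dim, batch_first=True)

# Embeddings laid out batch-first, as the question assumes.
x = torch.randn(batch_size, seq_len, emb_dim)

out_bf, _ = lstm_batch_first(x)               # (batch_size, seq_len, hidden_dim)
out_def, _ = lstm_default(x.transpose(0, 1))  # (seq_len, batch_size, hidden_dim)

print(out_bf.shape)   # torch.Size([4, 10, 64])
print(out_def.shape)  # torch.Size([10, 4, 64])
```

So code that feeds (seq_len, batch_size, emb_dim) is simply following the library default; transposing the first two dimensions (or passing batch_first=True) gives the batch-first layout described above.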