practical-pytorch
What is Batch Size in RNN?
Hello, I just found an interesting thing in the RNN name classification tutorial.
To make a word we join a bunch of those into a 2D matrix <line_length x 1 x n_letters>.
That extra 1 dimension is because PyTorch assumes everything is in batches - we're just using a batch size of 1 here.
print(line_to_tensor('Jones').size())
torch.Size([5, 1, 57])
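For context, a minimal sketch of what `line_to_tensor` does in that tutorial (assuming the tutorial's alphabet of 52 ASCII letters plus `" .,;'"`, giving n_letters = 57):

```python
import torch
import string

# Assumed alphabet from the tutorial: 52 letters + " .,;'" -> 57 characters
all_letters = string.ascii_letters + " .,;'"
n_letters = len(all_letters)  # 57

def line_to_tensor(line):
    # One-hot encode each character; the middle dimension of size 1
    # is the batch dimension PyTorch expects
    tensor = torch.zeros(len(line), 1, n_letters)
    for i, ch in enumerate(line):
        tensor[i][0][all_letters.find(ch)] = 1
    return tensor

print(line_to_tensor('Jones').size())  # torch.Size([5, 1, 57])
```

So the shape is (sequence length, batch size, features): 5 characters, 1 name in the batch, 57 possible characters.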
What is the batch size here, and what is its effect? Is it for faster processing, i.e. with a batch size of 2, two samples are processed in a single run? Also, is there any notion of order? In an HMM (Hidden Markov Model), for example, a higher order means we condition on more than one previous input. Is it possible to do that here, or does the batch size actually specify the order? -Thank you-
Batches are independent - they let you process more samples in parallel and also mitigate the effect of outliers in the dataset (as in batch normalization). They have nothing to do with the order of inputs.
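A small sketch to illustrate the independence: stacking two sequences along the batch dimension of an `nn.RNN` gives the same outputs as running each one separately (the tensors and sizes here are made up for illustration):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
# nn.RNN by default expects input of shape (seq_len, batch, input_size)
rnn = nn.RNN(input_size=57, hidden_size=16)

a = torch.randn(5, 1, 57)  # one sequence, batch size 1
b = torch.randn(5, 1, 57)  # another sequence, batch size 1

# Concatenate along the batch dimension: one forward pass over batch size 2
batch = torch.cat([a, b], dim=1)  # shape (5, 2, 57)
out_batch, _ = rnn(batch)

# Running each sequence on its own gives the same result:
# batch items do not interact with each other
out_a, _ = rnn(a)
out_b, _ = rnn(b)
print(torch.allclose(out_batch[:, 0:1], out_a, atol=1e-6))  # True
print(torch.allclose(out_batch[:, 1:2], out_b, atol=1e-6))  # True
```

Batching only changes how much work is done per forward pass, not what each sequence "sees" - unlike an HMM's order, which changes how many previous steps the model conditions on.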