
Error when batch size is not a factor of the number of samples in transformers_multiclass_classification.ipynb

Open rachel-sorek opened this issue 5 years ago • 3 comments

re: transformers_multiclass_classification.ipynb

Thank you for this helpful tutorial!

It seems to work well when the batch size (either for training or validation) is a factor of the number of examples, but otherwise I get the following error message: IndexError: Dimension out of range (expected to be in range of [-1, 0], but got 1)

For example: with batch_size=4, 3172 samples works, but 3171 or 3173 raises the error.
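For reference, the failing counts are exactly the ones that leave a partial trailing batch; 3172 is a multiple of 4, while 3171 and 3173 are not. A quick check of the final batch size (plain arithmetic, not code from the notebook):

```python
batch_size = 4
for n in (3171, 3172, 3173):
    # Size of the last batch a DataLoader would yield for n samples.
    last = n % batch_size or batch_size
    print(n, "-> last batch size:", last)
```

A trailing batch of size 1 (as with 3173 samples) is a common trigger for this kind of `IndexError`, since a squeezed batch dimension can leave a tensor with fewer dimensions than the code expects.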

rachel-sorek avatar Jul 14 '20 06:07 rachel-sorek

@rachel-sorek, thank you for raising this issue.

I did not run into it myself, but then again I did not tweak the dataset size or the batch size much. I will run the notebook with these changes at my end and let you know the results.

abhimishra91 avatar Aug 12 '20 02:08 abhimishra91

I faced the same issue. It would be good to have a checking mechanism that rounds the dataset down to the closest multiple of the batch size that doesn't crash, with a warning message explaining that part of the data has been discarded.
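The suggested guard could look something like the following sketch (the helper name is illustrative, not part of the notebook):

```python
import warnings

def drop_partial_batch(samples, batch_size):
    # Round the dataset down to the nearest multiple of batch_size,
    # warning the user about any discarded samples.
    remainder = len(samples) % batch_size
    if remainder:
        warnings.warn(
            f"Discarding last {remainder} sample(s) so every batch "
            f"has exactly {batch_size} elements."
        )
        samples = samples[: len(samples) - remainder]
    return samples

data = list(range(3173))
trimmed = drop_partial_batch(data, batch_size=4)
print(len(trimmed))  # 3172
```

In PyTorch the same effect (minus the explicit warning) is available out of the box via `DataLoader(..., drop_last=True)`.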

GalMoore avatar Sep 04 '20 10:09 GalMoore


You may find this link helpful: this error occurs when input_ids and attention_mask end up with different sizes. https://stackoverflow.com/questions/65851195/runtimeerror-stack-expects-each-tensor-to-be-equal-size

I got the same error after I changed pad_to_max_length=True to padding=True in the Triage class.

When I switched back to the original, it worked fine again.
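The difference matters because `padding=True` pads only to the longest sequence in each tokenizer call, while `pad_to_max_length=True` pads every sample to one fixed length, so all `input_ids`/`attention_mask` tensors stack cleanly. A pure-Python sketch of fixed-length padding (the `encode` helper here is a hypothetical stand-in for the tokenizer, not notebook code):

```python
def encode(tokens, max_len, pad_id=0):
    # Pad/truncate to a fixed max_len so every sample's input_ids and
    # attention_mask have identical shapes, regardless of the batch.
    ids = tokens[:max_len]
    mask = [1] * len(ids)
    pad = max_len - len(ids)
    return ids + [pad_id] * pad, mask + [0] * pad

ids, mask = encode([101, 2023, 102], max_len=6)
print(ids)   # [101, 2023, 102, 0, 0, 0]
print(mask)  # [1, 1, 1, 0, 0, 0]
```

With per-batch ("longest") padding, samples from different batches can have different lengths, which is exactly the unequal-tensor situation the Stack Overflow link above describes.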

maheshmechengg avatar Nov 26 '21 09:11 maheshmechengg