pytorch_bert
Fix inverse token mask concatenation error and add support for mps devices
Thank you for your great work and generous contribution. While reimplementing your work, I noticed the following error: in `/bert/dataset.py`, at line 197 in the method `_preprocess_sentence()`, the default value of `inverse_token_mask` is set to `None`, which cannot be directly concatenated with a `List` object. With Python 3.11, running the code produces the following error:

This can be resolved by replacing the variable's default value with an empty list `[]`.
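To illustrate the failure mode, here is a minimal sketch. The function names and signature below are hypothetical simplifications, not the actual code from `bert/dataset.py`; they only reproduce the pattern of concatenating against a `None` default and show the guard-based fix (a `None` check is generally preferred over a bare `[]` default in Python, since mutable defaults are shared across calls):

```python
from typing import List, Optional

def preprocess_broken(tokens: List[str],
                      inverse_token_mask: Optional[List[bool]] = None) -> List[bool]:
    # Concatenating a list onto the None default raises TypeError.
    return inverse_token_mask + [True] * len(tokens)

def preprocess_fixed(tokens: List[str],
                     inverse_token_mask: Optional[List[bool]] = None) -> List[bool]:
    # Guard against the None default before concatenating.
    if inverse_token_mask is None:
        inverse_token_mask = []
    return inverse_token_mask + [True] * len(tokens)
```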
Having noticed that mps device acceleration for Apple Silicon devices was not enabled, I have added support for it as well. It makes full use of the Apple Silicon SoC's GPU for model training and should yield much faster training speed.
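The device selection can be sketched as below. This is a typical pattern rather than the exact diff in this PR: it falls back from CUDA to mps (available since PyTorch 1.12 via `torch.backends.mps.is_available()`) to CPU:

```python
import torch

def pick_device() -> torch.device:
    """Prefer CUDA, then Apple Silicon's Metal backend (mps), then CPU."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    # getattr guards against very old PyTorch builds without the mps backend.
    mps = getattr(torch.backends, "mps", None)
    if mps is not None and mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")
```

Moving the model and batches with `.to(pick_device())` then works unchanged across all three backends.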