BERT-pytorch
Google AI 2018 BERT pytorch implementation
Fix require_grad typos (should be requires_grad). Before the fix, the code raises no errors but does not do what it is supposed to do. Also see https://github.com/pytorch/benchmark/pull/1771
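For context, a small standalone snippet (not from the repo) showing why the typo is silent: assigning to a misspelled attribute just creates a new Python attribute on the tensor, so no error is raised and the gradient flag stays unchanged.

```python
import torch

p = torch.nn.Parameter(torch.zeros(3))

p.require_grad = False    # typo: silently creates a new attribute, no error
print(p.requires_grad)    # True -- gradients are still tracked

p.requires_grad = False   # correct spelling actually disables gradient tracking
print(p.requires_grad)    # False
```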
I want to use my own fine-tuned model to generate vectors, but how should I load the model? When I previously loaded with TF, three files were needed. Thanks!
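Unlike the TensorFlow checkpoints, models trained with this repo are usually saved as a single file via torch.save, so loading and extracting vectors can look roughly like the sketch below; the checkpoint path, token ids, and the (tokens, segment_ids) forward signature are assumptions, not verified against your setup.

```python
import torch

# Hypothetical path to a checkpoint produced by this repo's pretraining script.
bert = torch.load("output/bert.model.ep9", map_location="cpu")
bert.eval()

tokens = torch.tensor([[3, 15, 27, 4]])   # token ids for one sentence (illustrative)
segments = torch.ones_like(tokens)        # single-segment input
with torch.no_grad():
    vectors = bert(tokens, segments)      # (1, seq_len, hidden) hidden states
```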
The GELU class is available in PyTorch (https://pytorch.org/docs/stable/generated/torch.nn.GELU.html), so there is no need to implement it in utils/GELU.py.
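A minimal sketch of the swap; note that the built-in nn.GELU() uses the exact erf formulation by default, while the repo's utils/GELU.py implements a tanh approximation (nn.GELU(approximate='tanh') matches it on PyTorch >= 1.12).

```python
import torch.nn as nn

# Drop-in replacement for the hand-written activation inside the feed-forward block.
feed_forward = nn.Sequential(
    nn.Linear(768, 3072),
    nn.GELU(),              # or nn.GELU(approximate='tanh') to match the approximation
    nn.Linear(3072, 768),
)
```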
Change position.py to handle the case where pe[:, 1::2].size(-1) is less than div_term.size(-1), which happens when d_model is an odd number.
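A sketch of one possible fix, slicing div_term so the cosine assignment matches the number of odd-indexed columns; variable names follow the usual sinusoidal-encoding layout and may differ slightly from the repo's position.py.

```python
import math
import torch

def sinusoidal_pe(max_len: int, d_model: int) -> torch.Tensor:
    pe = torch.zeros(max_len, d_model)
    position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
    div_term = torch.exp(
        torch.arange(0, d_model, 2, dtype=torch.float) * -(math.log(10000.0) / d_model)
    )
    pe[:, 0::2] = torch.sin(position * div_term)
    # For odd d_model, pe[:, 1::2] has one column fewer than div_term,
    # so slice div_term to match before assigning the cosine part.
    pe[:, 1::2] = torch.cos(position * div_term[: d_model // 2])
    return pe
```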
Hello @codertimo, it is a nice repo. If I want to fine-tune a pretrained BERT model, such as [BERT-Base, Uncased](https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip) released by Google, how can I do it?
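Google's TF checkpoint (three files) cannot be loaded into this repo directly without a weight-conversion step; for a model pretrained with this repo itself, fine-tuning can look roughly like the sketch below. The checkpoint path, hidden size, classifier head, and forward signature are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical checkpoint produced by this repo's pretraining script.
bert = torch.load("output/bert.model.ep9", map_location="cpu")

class SentenceClassifier(nn.Module):
    """Wraps the pretrained encoder with a task-specific head for fine-tuning."""
    def __init__(self, encoder, hidden=768, n_classes=2):
        super().__init__()
        self.encoder = encoder
        self.classifier = nn.Linear(hidden, n_classes)

    def forward(self, tokens, segment_ids):
        x = self.encoder(tokens, segment_ids)   # (batch, seq_len, hidden)
        return self.classifier(x[:, 0])         # classify from the first token's vector

model = SentenceClassifier(bert)
optimizer = torch.optim.Adam(model.parameters(), lr=2e-5)  # typical fine-tuning LR
```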
Can you tell me which dataset you used to train the model?
#### 1. **Summary:** This pull request adds several AI features and techniques to the BERT training script to enhance the training process. Some of the changes...
Corrected "Trasnformer" to "Transformer" in [README.md]. This pull request addresses a minor typo found in repository. The typo has been corrected to improve clarity and maintain the quality of the...