
PyTorch implementation of BERT in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"

Results: 4 BERT-pytorch issues

I'm hoping to try out your model with my custom data, but I need to get it converted to ONNX eventually, so I thought I'd try converting the simple examples...
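
For reference, a minimal sketch of exporting a BERT-style PyTorch encoder to ONNX with torch.onnx.export. The tiny stand-in module, the output file name bert_encoder.onnx, and the dummy input shape are illustrative assumptions; in practice the model from this repository would be loaded from a trained checkpoint instead.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a BERT-style encoder, used only to make the
# export call concrete; it is not the model defined in this repository.
class TinyEncoder(nn.Module):
    def __init__(self, vocab_size=1000, hidden_size=128, heads=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)

    def forward(self, token_ids):
        return self.encoder(self.embed(token_ids))

model = TinyEncoder().eval()

# Dummy batch of token ids: (batch, seq_len), integer dtype as ONNX expects.
dummy_ids = torch.randint(0, 1000, (1, 16), dtype=torch.long)

torch.onnx.export(
    model,
    (dummy_ids,),
    "bert_encoder.onnx",
    input_names=["token_ids"],
    output_names=["hidden_states"],
    dynamic_axes={"token_ids": {0: "batch", 1: "seq_len"}},
    opset_version=13,
)
```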

[INFO] 2020-05-04 11:56:22 > Run name : BERT-BERT-{phase}-layers_count={layers_count}-hidden_size={hidden_size}-heads_count={heads_count}-{timestamp}-layers_count=1-hidden_size=128-heads_count=2-2020_05_04_11_56_22
[INFO] 2020-05-04 11:56:22 > {'config_path': None, 'data_dir': None, 'train_path': '/home/ubuntu/ALEX/BERT-pytorch/data/rusbiomed/train.txt', 'val_path': '/home/ubuntu/ALEX/BERT-pytorch/data/rusbiomed/val.txt', 'dictionary_path': '/home/ubuntu/ALEX/BERT-pytorch/data/rusbiomed/dict.txt', 'checkpoint_dir': '/home/ubuntu/ALEX/BERT-pytorch/data/rusbiomed/checkpoints/', 'log_output': None, 'dataset_limit': None, 'epochs': 100,...
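
The logged run name still contains unexpanded placeholders such as {phase} and {layers_count}, which looks like a format-string template that was concatenated rather than filled in. A minimal, hypothetical sketch of how such a run name is normally produced with str.format; the values below are illustrative, not taken from the repository:

```python
# Hypothetical template mirroring the placeholders seen in the log above.
run_name_template = (
    "BERT-{phase}-layers_count={layers_count}-hidden_size={hidden_size}"
    "-heads_count={heads_count}-{timestamp}"
)

# Fill the template with illustrative values; real code would take these
# from its parsed configuration and current time.
run_name = run_name_template.format(
    phase="pretrain",
    layers_count=1,
    hidden_size=128,
    heads_count=2,
    timestamp="2020_05_04_11_56_22",
)
print(run_name)
# BERT-pretrain-layers_count=1-hidden_size=128-heads_count=2-2020_05_04_11_56_22
```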

Hey, you did an awesome job. Can your code be used for training on parallel corpora? Or can you tell me about some other resources, like seq2seq or fairseq, where parallel corpora can be...