Language_Understanding_based_BERT
An implementation of BERT-based pre-trained language models, organized in two steps: pre-training and fine-tuning. Three models are currently included (BERT, RoBERTa, and ALBERT), and all of them support Whole Word Mask mode.
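Whole Word Mask changes only how masking positions are selected: when any WordPiece of a word is chosen, every piece of that word is masked together. A minimal sketch of the grouping step, assuming WordPiece continuation tokens carry the usual "##" prefix (the helper name `group_whole_words` is illustrative, not from this repo):

```
import random

def group_whole_words(tokens):
    """Group WordPiece tokens into whole-word index sets.

    A continuation piece (prefixed with "##") is attached to the
    index set of the word it belongs to.
    """
    cand_indexes = []
    for i, token in enumerate(tokens):
        if token in ("[CLS]", "[SEP]"):
            continue
        if cand_indexes and token.startswith("##"):
            cand_indexes[-1].append(i)   # same word as the previous piece
        else:
            cand_indexes.append([i])     # start of a new word
    return cand_indexes

rng = random.Random(12345)
tokens = ["[CLS]", "un", "##afford", "##able", "prices", "[SEP]"]
index_sets = group_whole_words(tokens)
# Pick one whole word and mask every piece of it together.
index_set = rng.choice(index_sets)
masked = [("[MASK]" if i in index_set else t) for i, t in enumerate(tokens)]
print(masked)  # e.g. ['[CLS]', '[MASK]', '[MASK]', '[MASK]', 'prices', '[SEP]']
```

For Chinese text, where the tokenizer emits single characters without "##" markers, the published WWM variants typically rely on an external word segmenter to decide which characters form a word before grouping them this way.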
Language_Understanding_based_BERT issues (1)
Your implementation for masking Chinese words is as follows:

```
for index in index_set:
    covered_indexes.add(index)
    masked_token = None
    # 80% of the time, replace with [MASK]
    if rng.random() < 0.8:
        masked_token = "[MASK]"
    else:
        # 10% of the...
```
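The issue preview cuts the snippet off mid-branch. In Google's original `create_pretraining_data.py`, which this snippet appears to follow, the remaining 20% is split evenly between keeping the original token and substituting a random vocabulary word. A sketch of that continuation, assuming the same names (`tokens`, `vocab_words`, `output_tokens`, `masked_lms`, `MaskedLmInstance`) as the truncated code:

```
    else:
        # (reconstructed from the original BERT create_pretraining_data.py;
        #  an assumption, since the issue preview is truncated)
        # 10% of the time, keep the original token
        if rng.random() < 0.5:
            masked_token = tokens[index]
        # 10% of the time, replace with a random vocabulary word
        else:
            masked_token = vocab_words[rng.randint(0, len(vocab_words) - 1)]

    output_tokens[index] = masked_token
    # Record the masked position and its original label for the MLM loss.
    masked_lms.append(MaskedLmInstance(index=index, label=tokens[index]))
```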