ALBERT-Pytorch
Add 80% mask, 10% random in n-gram MLM
In ALBERT (Lan et al.), there is no detail about the 80% masking scheme.

But for n-gram masking, SpanBERT (Joshi et al., 2019) describes the 80/10/10 replacement:
> As in BERT, we also mask 15% of the tokens in total: replacing 80% of the masked tokens with [MASK], 10% with random tokens and 10% with the original tokens. However, we perform this replacement at the span level and not for each token individually; i.e. all the tokens in a span are replaced with [MASK] or sampled tokens.
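A minimal sketch of what this span-level decision could look like. The key difference from BERT is that the 80/10/10 draw happens once per span, so every token in the span shares the same fate. Note `apply_span_masking`, the toy `spans`, and `vocab` below are hypothetical names for illustration, not functions from this repo; span selection itself (sampling span lengths up to the 15% budget) is omitted.

```python
import random

MASK_TOKEN = "[MASK]"

def apply_span_masking(tokens, spans, vocab, rng=None):
    """Span-level 80/10/10 replacement: each selected span is replaced as a whole."""
    rng = rng or random.Random()
    tokens = list(tokens)
    labels = [None] * len(tokens)          # MLM targets only at masked positions

    for start, end in spans:               # end is exclusive
        r = rng.random()                   # one draw per span, not per token
        for i in range(start, end):
            labels[i] = tokens[i]          # target: the original token
            if r < 0.8:
                tokens[i] = MASK_TOKEN     # 80%: whole span -> [MASK]
            elif r < 0.9:
                tokens[i] = rng.choice(vocab)  # 10%: sampled random tokens
            # else (10%): keep the original tokens, but still predict them

    return tokens, labels

tokens = "the quick brown fox jumps over the lazy dog".split()
vocab = ["cat", "runs", "blue", "tree", "fast"]
masked, labels = apply_span_masking(tokens, [(1, 3), (6, 8)], vocab, random.Random(0))
print(masked)
print(labels)
```

This matches the quoted description in that a span is never half-masked: either all its tokens become [MASK], all are replaced with sampled tokens, or all are kept.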