AttributeError: 'BertTokenizer' object has no attribute 'tokens_trie'
While loading a trained tokenizer from a pkl file, I am getting this error. The tokenizer is:
```
PreTrainedTokenizer(name_or_path='', vocab_size=50000, model_max_len=1000000000000000019884624838656, is_fast=False, padding_side='right', truncation_side='right', special_tokens={'unk_token': '[UNK]', 'sep_token': '[SEP]', 'pad_token': '[PAD]', 'cls_token': '[CLS]', 'mask_token': '[MASK]'})
```

The failing line is:

```python
tokenizer = pickle.loads(tf.io.gfile.GFile(tokenizer_path, 'rb').read())
```

which raises:

```
AttributeError: 'BertTokenizer' object has no attribute 'tokens_trie'
```
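A likely cause, assuming the pickle was written with an older transformers release: `tokens_trie` is created in `PreTrainedTokenizer.__init__`, and `pickle.loads` restores saved attributes without running `__init__`, so a tokenizer pickled before the attribute existed comes back without it. Below is a minimal, untested workaround sketch; `tokenizer_path` is a hypothetical path, and the trie rebuild mirrors how recent releases construct `tokens_trie` (attribute names such as `unique_no_split_tokens` vary across versions):

```python
import pickle

import tensorflow as tf
from transformers.tokenization_utils import Trie  # present in recent releases

tokenizer_path = "gs://my-bucket/tokenizer.pkl"  # hypothetical path

# Unpickling restores only the attributes that existed when the object was
# saved; __init__ is never called, so newer attributes are missing.
with tf.io.gfile.GFile(tokenizer_path, "rb") as f:
    tokenizer = pickle.loads(f.read())

# Rebuild the trie that newer releases use for special-token splitting.
if not hasattr(tokenizer, "tokens_trie"):
    trie = Trie()
    for token in tokenizer.unique_no_split_tokens:
        trie.add(token)
    tokenizer.tokens_trie = trie
```

A more durable fix is to re-serialize in the original environment with `tokenizer.save_pretrained("tokenizer_dir")` and reload with `BertTokenizer.from_pretrained("tokenizer_dir")`, since that path runs `__init__` and is the format the library keeps backward-compatible.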
Could you maybe share some code on how to reproduce the issue, starting from an existing tokenizer? Currently it's hard to understand what's going on.
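A sketch of the suspected sequence of events, assuming the pickle predates the trie-based special-token splitting. It needs two environments with different transformers versions, so it is not a single runnable script, and `bert-base-uncased` is just a stand-in for the trained tokenizer:

```python
# Step 1: in an environment with an older transformers release
# (before `tokens_trie` was added to PreTrainedTokenizer):
import pickle

from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
with open("tokenizer.pkl", "wb") as f:
    pickle.dump(tokenizer, f)

# Step 2: in an environment with a newer transformers release.
# tokenize() consults `tokens_trie`, which __init__ would normally create
# but pickle.load never calls, so this should raise:
#   AttributeError: 'BertTokenizer' object has no attribute 'tokens_trie'
with open("tokenizer.pkl", "rb") as f:
    tokenizer = pickle.load(f)
tokenizer.tokenize("hello world")
```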