CHINESE-MEDICINE-QUESTION-GENERATION
FileNotFoundError: [Errno 2] No such file or directory: '../user_data/model_data/NEZHA-Large-WWM/vocab.txt'
When I run `from bert4keras.tokenizers import Tokenizer, load_vocab`, the system raises the error: No such file or directory: '../user_data/model_data/NEZHA-Large-WWM/vocab.txt'. Where can I get this file?
Same problem. Where can I get the vocab.txt file?
Easy: just go to Huawei's NEZHA repository and download the TensorFlow pretrained model; vocab.txt ships with it.
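
For reference, here is a minimal sketch of how the tokenizer can be built once the downloaded NEZHA-Large-WWM files are placed under the path the code expects (the path below matches the error message; adjust it to wherever you extract the model):

```python
from bert4keras.tokenizers import Tokenizer, load_vocab

# Path from the error message; assumes the pretrained NEZHA-Large-WWM
# release (including vocab.txt) has been extracted here.
dict_path = '../user_data/model_data/NEZHA-Large-WWM/vocab.txt'

# Load the vocabulary shipped with the pretrained model.
token_dict = load_vocab(dict_path=dict_path)

# Build the tokenizer from the loaded vocabulary.
tokenizer = Tokenizer(token_dict, do_lower_case=True)

# Quick sanity check that the vocab file was found and loaded.
print(tokenizer.tokenize('中医药问题生成'))
```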