
File "/home/admin1/anaconda3/envs/plmee/lib/python3.8/site-packages/allennlp/data/token_indexers/wordpiece_indexer.py", line 345, in __init__ super().__init__(vocab=bert_tokenizer.vocab, AttributeError: 'NoneType' object has no attribute 'vocab'

SnowS13 opened this issue 1 year ago · 1 comment

```
python src/extractor_trainer.py --pretrained_bert /data/wangx/models/chinese_roberta_wwm_ext_large --bert_vocab /data/wangx/models/chinese_roberta_wwm_ext_large/vocab.txt --do_train_trigger --data_meta_dir ./data/DuEE --extractor_origin_trigger_dir ./save/DuEE/bert_large/trigger --extractor_origin_role_dir ./save/DuEE/bert_large/role --extractor_epoc 20 --extractor_batch_size 12 --extractor_train_file ./data/DuEE/train.json --extractor_val_file ./data/DuEE/dev.json
```

This fails with:

```
Model name '/data/wangx/models/chinese_roberta_wwm_ext_large/vocab.txt' was not found in model name list (bert-base-uncased, bert-large-uncased, bert-base-cased, bert-large-cased, bert-base-multilingual-uncased, bert-base-multilingual-cased, bert-base-chinese). We assumed '/data/wangx/models/chinese_roberta_wwm_ext_large/vocab.txt' was a path or url but couldn't find any file associated to this path or url.
Traceback (most recent call last):
  File "src/extractor_trainer.py", line 164, in <module>
    bert_indexer = {'tokens': PretrainedBertIndexer(
  File "/home/admin1/anaconda3/envs/plmee/lib/python3.8/site-packages/allennlp/data/token_indexers/wordpiece_indexer.py", line 345, in __init__
    super().__init__(vocab=bert_tokenizer.vocab,
AttributeError: 'NoneType' object has no attribute 'vocab'
```

What does this error mean?

SnowS13 · Apr 22 '24 14:04
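The "Model name ... was not found" warning comes from `pytorch_pretrained_bert`, which the allennlp `wordpiece_indexer` in the traceback uses to build its tokenizer: when `BertTokenizer.from_pretrained` cannot find a file at the given path, it logs that warning and returns `None` instead of raising, so `bert_tokenizer` is `None` by the time `PretrainedBertIndexer` reads `bert_tokenizer.vocab`. Below is a minimal diagnostic sketch, assuming the value passed as `--bert_vocab` is the path that ends up in that call (the exact wiring inside `extractor_trainer.py` is not shown in this issue):

```python
import os

# pytorch_pretrained_bert is the package that allennlp's old
# PretrainedBertIndexer (wordpiece_indexer.py) delegates tokenizer loading to.
from pytorch_pretrained_bert import BertTokenizer

# Assumed path, taken from the --bert_vocab argument in the command above;
# adjust it to wherever your vocab.txt actually lives.
vocab_path = "/data/wangx/models/chinese_roberta_wwm_ext_large/vocab.txt"

print("vocab file exists:", os.path.isfile(vocab_path))

# from_pretrained logs the "Model name ... was not found" warning and returns
# None (rather than raising) when it cannot resolve the name or path; that
# None is what later triggers "'NoneType' object has no attribute 'vocab'".
tokenizer = BertTokenizer.from_pretrained(vocab_path, do_lower_case=True)
print("tokenizer loaded:", tokenizer is not None)
```

If either check fails, pointing `--bert_vocab` (or `--pretrained_bert`, depending on which argument `extractor_trainer.py` forwards to `PretrainedBertIndexer`) at a location where `vocab.txt` actually exists should make the tokenizer load.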

This issue is stale because it has been open for 30 days with no activity.

github-actions[bot] · May 23 '24 01:05

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions[bot] · Jun 06 '24 01:06