JointIDSF
BERT-based joint intent detection and slot filling with intent-slot attention mechanism (INTERSPEECH 2021)
```
Traceback (most recent call last):
  File "main.py", line 139, in <module>
    main(args)
  File "main.py", line 11, in main
    tokenizer = load_tokenizer(args)
  File "/usr/local/xww/articles/codes/JointIDSF-main/utils.py", line 43, in load_tokenizer
    return MODEL_CLASSES[args.model_type][2].from_pretrained(args.model_name_or_path)
  File "/home/slave4/anaconda3/envs/wpy38/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py",
  ...
```
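The traceback shows the run failing before training even starts, inside `load_tokenizer()`, when `transformers` tries to resolve `args.model_name_or_path`. One way to check the checkpoint independently of the training script is a minimal sketch like the one below; the `"vinai/phobert-base"` value is only an illustrative assumption, so substitute whatever path or name your shell script actually passes as `--model_name_or_path`.

```python
# Minimal sanity check for the tokenizer load that the traceback points at.
# Assumption: the checkpoint name below is illustrative only; use the value
# your run script passes as --model_name_or_path.
from transformers import AutoTokenizer

model_name_or_path = "vinai/phobert-base"  # hypothetical value
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
print(tokenizer.tokenize("xin chào"))  # should print subword tokens, not raise
```

If this snippet raises the same error, the problem is the checkpoint path or network access rather than anything in JointIDSF itself.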
Hi, I ran the JointIDSF shell script with both `use_intent_context_concat` and `use_intent_context_attention`, but both runs yielded poor results. Am I missing something? Thanks.
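For context on what those two flags toggle, here is a rough sketch of the two ways an intent context vector can be fused into the slot-filling features, in the spirit of the paper's intent-slot attention. This is my own paraphrase under stated assumptions, not the repository's actual module or its exact shapes.

```python
# Hedged sketch (not JointIDSF's exact code): fusing a soft intent-label
# context vector into per-token slot features, by concatenation or attention.
import torch
import torch.nn as nn

class IntentSlotFusion(nn.Module):
    def __init__(self, hidden_size: int, num_intents: int, mode: str = "attention"):
        super().__init__()
        self.mode = mode
        # Map the soft intent distribution to a context vector in hidden space.
        self.intent_embed = nn.Linear(num_intents, hidden_size)
        if mode == "concat":
            self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, slot_hidden: torch.Tensor, intent_logits: torch.Tensor) -> torch.Tensor:
        # slot_hidden: (batch, seq_len, hidden); intent_logits: (batch, num_intents)
        intent_ctx = self.intent_embed(torch.softmax(intent_logits, dim=-1))  # (batch, hidden)
        if self.mode == "concat":
            # Append the same intent context to every token, then project back down.
            ctx = intent_ctx.unsqueeze(1).expand_as(slot_hidden)
            return self.proj(torch.cat([slot_hidden, ctx], dim=-1))
        # Attention mode: weight tokens by similarity to the intent context.
        scores = torch.einsum("bsh,bh->bs", slot_hidden, intent_ctx)
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)  # (batch, seq_len, 1)
        return slot_hidden + weights * intent_ctx.unsqueeze(1)
```

Since both fusion modes gave poor results, the fusion choice itself is probably not the culprit; it is worth first confirming the tokenizer/checkpoint issue above is resolved and that the model actually trained.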