BERT-E2E-ABSA
AttributeError: 'XLNetConfig' object has no attribute 'absa_tagger_config'
Hi, thanks for the great repo!
BERT runs very well for me; however, when switching to XLNet I consistently get this error.
error:
Traceback (most recent call last):
File "main.py", line 522, in <module>
main()
File "main.py", line 412, in main
config=config, cache_dir='./cache')
File "C:\Anaconda3\envs\py37_dev\lib\site-packages\transformers\modeling_utils.py", line 342, in from_pretrained
model = cls(config, *model_args, **model_kwargs)
File "\BERT-E2E-ABSA\absa_layer.py", line 488, in __init__
self.tagger_config = xlnet_config.absa_tagger_config
AttributeError: 'XLNetConfig' object has no attribute 'absa_tagger_config'
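For context, the traceback says that absa_layer.py reads a custom attribute, absa_tagger_config, off the XLNetConfig, but nothing has attached it. Since transformers config objects are plain Python objects, such an attribute has to be set on the config before the model is built. The sketch below is dependency-free and illustrative only: the stub classes and their fields are hypothetical stand-ins, not taken from the repo.

```python
class XLNetConfigStub:
    """Stand-in for transformers.XLNetConfig (real attributes trimmed)."""
    def __init__(self):
        self.d_model = 768  # hidden size of xlnet-base-cased

class TaggerConfig:
    """Hypothetical container for the ABSA head settings; the field
    names here are assumptions, not copied from the repo."""
    def __init__(self, absa_type="tfm", hidden_size=768):
        self.absa_type = absa_type
        self.hidden_size = hidden_size

config = XLNetConfigStub()
# Attaching the sub-config before the model is constructed is what
# absa_layer.py appears to expect; without this line, reading
# config.absa_tagger_config raises exactly the AttributeError above.
config.absa_tagger_config = TaggerConfig()
print(config.absa_tagger_config.absa_type)
```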
Here's my train.bat (note: I replaced the bash shebang from the repo's train.sh with @echo off, and dropped the quotes in the set lines, since Windows batch would otherwise pass them literally):
@echo off
set TASK_NAME=rest_total
set ABSA_TYPE=tfm
set CUDA_VISIBLE_DEVICES=0
python main.py --model_type xlnet ^
--absa_type %ABSA_TYPE% ^
--tfm_mode finetune ^
--fix_tfm 0 ^
--model_name_or_path xlnet-base-cased ^
--data_dir ./data/sents ^
--task_name %TASK_NAME% ^
--per_gpu_train_batch_size 16 ^
--per_gpu_eval_batch_size 8 ^
--learning_rate 2e-5 ^
--do_train ^
--do_eval ^
--tagging_schema BIEOS ^
--overfit 0 ^
--overwrite_output_dir ^
--eval_all_checkpoints ^
--MASTER_ADDR localhost ^
--MASTER_PORT 28512 ^
--max_steps 2000
I'm running transformers 2.0.0, as per the requirements:
>>> import transformers as t
>>> t.__version__
'2.0.0'
Thank you for your attention. The implementation of XLNetABSATagger is incomplete, so it cannot run normally. The reason I did not continue adding this feature is that the XLNet-based model performs quite poorly on the ABSA datasets, and I am not sure whether I missed some important details. For your case, you can write a new XLNetABSATagger class following the style of BertABSATagger and check whether it works.
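For anyone attempting this, here is a minimal sketch of what such a class could look like. It is an assumption-laden starting point, not the repo's method: it presumes the training script attaches an absa_tagger_config object to the XLNetConfig beforehand, and it uses only a plain linear tagging head rather than the repo's full set of ABSA heads (linear/GRU/SAN/TFM/CRF).

```python
import torch
import torch.nn as nn
from transformers import XLNetModel, XLNetPreTrainedModel

class XLNetABSATagger(XLNetPreTrainedModel):
    """Hypothetical XLNet tagger written in the style of BertABSATagger.

    Assumes `xlnet_config.absa_tagger_config` was attached by the caller
    before from_pretrained/__init__ runs; only a linear head is sketched.
    """

    def __init__(self, xlnet_config):
        super().__init__(xlnet_config)
        # This is the read that raised the AttributeError in the issue.
        self.tagger_config = xlnet_config.absa_tagger_config
        self.num_labels = xlnet_config.num_labels
        self.xlnet = XLNetModel(xlnet_config)
        self.dropout = nn.Dropout(0.1)
        # XLNet exposes its hidden size as d_model (768 for xlnet-base-cased).
        self.classifier = nn.Linear(xlnet_config.d_model, self.num_labels)
        self.init_weights()

    def forward(self, input_ids, attention_mask=None, labels=None):
        outputs = self.xlnet(input_ids, attention_mask=attention_mask)
        sequence_output = self.dropout(outputs[0])  # (batch, seq_len, d_model)
        logits = self.classifier(sequence_output)   # per-token tag scores
        if labels is not None:
            loss = nn.CrossEntropyLoss()(
                logits.view(-1, self.num_labels), labels.view(-1))
            return loss, logits
        return logits
```

Whether this matches the original model's quality is untested; as the reply above notes, XLNet underperformed on these datasets, so treat it as an experiment.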
Thank you for your feedback!