
ValueError: 'layoutlmv3' is already used by a Transformers config, pick another name.

jack-gits opened this issue on Sep 14, 2022 · 4 comments

I'm trying to fine-tune the LayoutLMv3 model following the instructions at https://github.com/microsoft/unilm/tree/master/layoutlmv3, and I get the following error message:

ValueError: 'layoutlmv3' is already used by a Transformers config, pick another name.

Below is the command I ran and the error stack (working directory: /data/SourceCodes/unilm/layoutlmv3):

python -m torch.distributed.launch \
    --nproc_per_node=8 --master_port 4398 examples/run_xfund.py \
    --data_dir data --language zh \
    --do_train --do_eval \
    --model_name_or_path microsoft/layoutlmv3-base-chinese \
    --output_dir path/to/output \
    --segment_level_layout 1 --visual_embed 1 --input_size 224 \
    --max_steps 1000 --save_steps -1 --evaluation_strategy steps --eval_steps 20 \
    --learning_rate 7e-5 --per_device_train_batch_size 2 --gradient_accumulation_steps 1 \
    --dataloader_num_workers 2

/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/launch.py:186: FutureWarning: The module torch.distributed.launch is deprecated and will be removed in future. Use torchrun. Note that --use_env is set by default in torchrun. If your script expects --local_rank argument to be set, please change it to read from os.environ['LOCAL_RANK'] instead. See https://pytorch.org/docs/stable/distributed.html#launch-utility for further instructions

  FutureWarning,
WARNING:torch.distributed.run:
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.


Each of the 8 worker processes fails with the same traceback (shown once here):

Traceback (most recent call last):
  File "examples/run_xfund.py", line 14, in <module>
    from layoutlmft.data import DataCollatorForKeyValueExtraction
  File "/data/SourceCodes/unilm/layoutlmv3/layoutlmft/__init__.py", line 1, in <module>
    from .models import (
  File "/data/SourceCodes/unilm/layoutlmv3/layoutlmft/models/__init__.py", line 1, in <module>
    from .layoutlmv3 import (
  File "/data/SourceCodes/unilm/layoutlmv3/layoutlmft/models/layoutlmv3/__init__.py", line 16, in <module>
    AutoConfig.register("layoutlmv3", LayoutLMv3Config)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 755, in register
    CONFIG_MAPPING.register(model_type, config)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py", line 465, in register
    raise ValueError(f"'{key}' is already used by a Transformers config, pick another name.")
ValueError: 'layoutlmv3' is already used by a Transformers config, pick another name.
ERROR:torch.distributed.elastic.multiprocessing.api:failed (exitcode: 1) local_rank: 0 (pid: 4018) of binary: /home/ubuntu/anaconda3/envs/layoutlmv3/bin/python
Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/launch.py", line 193, in <module>
    main()
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/launch.py", line 189, in main
    launch(args)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/launch.py", line 174, in launch
    run(args)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/run.py", line 713, in run
    )(*cmd_args)
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/launcher/api.py", line 131, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/home/ubuntu/anaconda3/envs/layoutlmv3/lib/python3.7/site-packages/torch/distributed/launcher/api.py", line 261, in launch_agent
    failures=result.failures,
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:

examples/run_xfund.py FAILED

Failures:
[1]-[7]: time: 2022-09-14_12:18:27, host: ip-172-29-152-106.cn-northwest-1.compute.internal, ranks 1-7 (local_ranks 1-7), exitcode: 1 (pids 4019, 4020, 4021, 4022, 4023, 4026, 4027), error_file: <N/A>, traceback: To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html

Root Cause (first observed failure):
[0]: time: 2022-09-14_12:18:27, host: ip-172-29-152-106.cn-northwest-1.compute.internal, rank: 0 (local_rank: 0), exitcode: 1 (pid: 4018), error_file: <N/A>, traceback: To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html

jack-gits avatar Sep 14 '22 12:09 jack-gits

Are you trying to fine-tune it on a benchmark dataset or on your own dataset?

SaraAmd avatar Sep 23 '22 16:09 SaraAmd

I am fine-tuning it on the benchmark dataset but received exactly the same error that jack-gits reported above. I used the training command for fine-tuning on FUNSD from https://github.com/microsoft/unilm/tree/master/layoutlmv3. Any suggestions?

shenw000 avatar Oct 12 '22 17:10 shenw000

Just dig deeper :) ValueError: 'layoutlmv3' is already used by a Transformers config, pick another name. means that a model/module with the same name is already registered, most likely because the newest versions of Transformers already implement LayoutLMv3: https://huggingface.co/docs/transformers/model_doc/layoutlmv3. If you want to use the version from this repository, you just have to rename LayoutLMv3Tokenizer, LayoutLMv3Model, etc., and everything will work.
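To illustrate why the rename works, here is a simplified, self-contained sketch of the registry check performed in transformers' configuration_auto.py (the file in the traceback above). This is not the real API, just a model of the collision; the name "layoutlmv3_local" is a hypothetical choice for the rename:

```python
# Simplified stand-in for transformers' CONFIG_MAPPING: recent releases
# ship a built-in config already registered under "layoutlmv3".
CONFIG_MAPPING = {"layoutlmv3": "built-in LayoutLMv3Config"}

def register(model_type, config):
    """Mimics CONFIG_MAPPING.register: reject names that are taken."""
    if model_type in CONFIG_MAPPING:
        raise ValueError(
            f"'{model_type}' is already used by a Transformers config, pick another name."
        )
    CONFIG_MAPPING[model_type] = config

# The repo's package tries to register under the name the library now owns:
try:
    register("layoutlmv3", "this repo's LayoutLMv3Config")
except ValueError as e:
    print(e)  # 'layoutlmv3' is already used by a Transformers config, pick another name.

# The workaround from this thread: register the repo's classes under an
# unused name ("layoutlmv3_local" is hypothetical) and use that instead.
register("layoutlmv3_local", "this repo's LayoutLMv3Config")
```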

bernerprzemek avatar Oct 14 '22 16:10 bernerprzemek


A previous version of transformers that does not include LayoutLMv3 works fine.
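The version pin suggested here could be expressed as below; the exact cut-off is an assumption to verify against your environment (LayoutLMv3 appears to have been added to transformers around v4.20, so an earlier release should avoid the name collision):

```shell
# Install a transformers release that predates the built-in LayoutLMv3,
# so the repo's AutoConfig.register("layoutlmv3", ...) call succeeds:
pip install "transformers<4.20"

# Confirm which version is installed:
python -c "import transformers; print(transformers.__version__)"
```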

lxsarker avatar Oct 21 '22 14:10 lxsarker

Thanks @jack-gits and @shenw000 for reporting this. Thanks @bernerprzemek and @lxsarker for the replies! I hope the issue has been addressed.

HYPJUDY avatar Nov 03 '22 15:11 HYPJUDY