Error when training internlm2_5-7b
```
pt-p1wvaor8-worker-0 logs: 07/05 15:59:30 - mmengine - WARNING - Dataset Dataset has no metainfo. ``dataset_meta`` in visualizer will be None.
pt-p1wvaor8-worker-0 logs: [rank2]: Traceback (most recent call last):
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/home/linjunpeng/reranker_llm/xtuner/xtuner/tools/train.py", line 360, in <module>
pt-p1wvaor8-worker-0 logs: [rank2]:     main()
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/home/linjunpeng/reranker_llm/xtuner/xtuner/tools/train.py", line 356, in main
pt-p1wvaor8-worker-0 logs: [rank2]:     runner.train()
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/mmengine/runner/_flexible_runner.py", line 1182, in train
pt-p1wvaor8-worker-0 logs: [rank2]:     self.strategy.prepare(
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/mmengine/_strategy/deepspeed.py", line 381, in prepare
pt-p1wvaor8-worker-0 logs: [rank2]:     model = self.build_model(model)
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/mmengine/_strategy/base.py", line 306, in build_model
pt-p1wvaor8-worker-0 logs: [rank2]:     model = MODELS.build(model)
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/mmengine/registry/registry.py", line 570, in build
pt-p1wvaor8-worker-0 logs: [rank2]:     return self.build_func(cfg, *args, **kwargs, registry=self)
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/mmengine/registry/build_functions.py", line 232, in build_model_from_cfg
pt-p1wvaor8-worker-0 logs: [rank2]:     return build_from_cfg(cfg, registry, default_args)
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/mmengine/registry/build_functions.py", line 121, in build_from_cfg
pt-p1wvaor8-worker-0 logs: [rank2]:     obj = obj_cls(**args)  # type: ignore
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/home/linjunpeng/reranker_llm/xtuner/xtuner/model/sft.py", line 84, in __init__
pt-p1wvaor8-worker-0 logs: [rank2]:     llm = self._dispatch_lm_model_cfg(llm, max_position_embeddings)
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/home/linjunpeng/reranker_llm/xtuner/xtuner/model/sft.py", line 225, in _dispatch_lm_model_cfg
pt-p1wvaor8-worker-0 logs: [rank2]:     llm_cfg = AutoConfig.from_pretrained(
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 974, in from_pretrained
pt-p1wvaor8-worker-0 logs: [rank2]:     config_class = get_class_from_dynamic_module(
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 514, in get_class_from_dynamic_module
pt-p1wvaor8-worker-0 logs: [rank2]:     return get_class_in_module(class_name, final_module)
pt-p1wvaor8-worker-0 logs: [rank2]:   File "/mnt/data/conda/envs/llmreranker/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 213, in get_class_in_module
pt-p1wvaor8-worker-0 logs: [rank2]:     return getattr(module, class_name)
pt-p1wvaor8-worker-0 logs: [rank2]: AttributeError: module 'transformers_modules.internlm2_5-7b.configuration_internlm2' has no attribute 'InternLM2Config'
```
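The failure happens while xtuner's `_dispatch_lm_model_cfg` loads the model config via `AutoConfig.from_pretrained`, and transformers' dynamic-module loader cannot find `InternLM2Config` in the checkpoint's `configuration_internlm2.py`. A minimal sketch to reproduce just the config loading outside of xtuner (the checkpoint path is a placeholder for your local internlm2_5-7b directory):

```python
# Load the remote-code config the same way the traceback does, but without xtuner.
# "/path/to/internlm2_5-7b" is a placeholder; replace it with your checkpoint path.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained(
    "/path/to/internlm2_5-7b",
    trust_remote_code=True,  # imports configuration_internlm2.py from the checkpoint
)
print(type(cfg))  # with transformers 4.42.1 this step raised the AttributeError above
```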
I've confirmed that internlm2_5-7b is the latest version on Hugging Face.
The transformers version was 4.42.1; after switching to 4.41.0, training runs normally.
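Since the workaround was pinning transformers (e.g. `pip install transformers==4.41.0`), a small guard like the one below can catch the problematic version before launching a multi-worker job; the 4.42.0 cutoff is only an assumption based on what failed and worked in this setup:

```python
# Warn early if the installed transformers version is the one that failed here.
# The cutoff is an assumption from this report: 4.42.1 failed, 4.41.0 worked.
import transformers
from packaging import version

if version.parse(transformers.__version__) >= version.parse("4.42.0"):
    print(f"Warning: transformers {transformers.__version__} may fail to load "
          "InternLM2Config via trust_remote_code; 4.41.0 worked in this setup.")
```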