
export problem: get_model_tokenizer_with_flash_attn() got multiple values for keyword argument 'automodel_class'

AlexJJJChen opened this issue · 1 comment

Describe the bug

```shell
CUDA_VISIBLE_DEVICES=0,1,2,3 swift export \
    --ckpt_dir finetune_output/checkpoint-478 --load_dataset_config true \
    --quant_method awq --quant_bits 4 \
    --merge_lora true
```

```
Traceback (most recent call last):
  File "/home/jianc/miniconda3/envs/benchmark-llm/lib/python3.10/site-packages/swift/cli/export.py", line 5, in <module>
    export_main()
  File "/home/jianc/miniconda3/envs/benchmark-llm/lib/python3.10/site-packages/swift/utils/run_utils.py", line 31, in x_main
    result = llm_x(args, **kwargs)
  File "/home/jianc/miniconda3/envs/benchmark-llm/lib/python3.10/site-packages/swift/llm/export.py", line 129, in llm_export
    model, template = prepare_model_template(
  File "/home/jianc/miniconda3/envs/benchmark-llm/lib/python3.10/site-packages/swift/llm/infer.py", line 165, in prepare_model_template
    model, tokenizer = get_model_tokenizer(
  File "/home/jianc/miniconda3/envs/benchmark-llm/lib/python3.10/site-packages/swift/llm/utils/model.py", line 3921, in get_model_tokenizer
    model, tokenizer = get_function(model_dir, torch_dtype, model_kwargs,
  File "/home/jianc/miniconda3/envs/benchmark-llm/lib/python3.10/site-packages/swift/llm/utils/model.py", line 3693, in get_model_tokenizer_llava
    model, tokenizer = get_model_tokenizer_with_flash_attn(
TypeError: swift.llm.utils.model.get_model_tokenizer_with_flash_attn() got multiple values for keyword argument 'automodel_class'
```
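For context, this class of `TypeError` arises whenever a wrapper hard-codes a keyword argument while also forwarding `**kwargs` that already contain the same key. The minimal sketch below (hypothetical function names, not the actual ms-swift code) reproduces the failure mode seen in the traceback:

```python
def inner(model_dir, automodel_class=None, **kwargs):
    """Stand-in for get_model_tokenizer_with_flash_attn."""
    return automodel_class


def outer(model_dir, **kwargs):
    """Stand-in for get_model_tokenizer_llava.

    Bug pattern: automodel_class is set explicitly here, but the incoming
    kwargs may already carry an 'automodel_class' entry, so Python receives
    two values for the same parameter at the inner call site.
    """
    return inner(model_dir, automodel_class="LlavaLlamaForCausalLM", **kwargs)


try:
    # The caller also supplies automodel_class, triggering the conflict.
    outer("finetune_output/checkpoint-478", automodel_class="AutoModelForCausalLM")
except TypeError as e:
    print(e)  # ... got multiple values for keyword argument 'automodel_class'
```

The usual fix on the library side is to pop the key from `kwargs` (or make it an explicit parameter) before forwarding, e.g. `kwargs.pop('automodel_class', None)`.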

AlexJJJChen avatar Apr 29 '24 09:04 AlexJJJChen

me too

wangdong1992 avatar Jul 25 '24 09:07 wangdong1992

Currently, quantization for multimodal models is not yet supported.

Jintao-Huang avatar Aug 28 '24 03:08 Jintao-Huang