Llama-Chinese
Running `python quick_start.py` always fails with an error
```
PS D:\tools\100AIGC\Llama-Chinese> python quick_start.py
Traceback (most recent call last):
  File "D:\tools\100AIGC\Llama-Chinese\quick_start.py", line 4, in <module>
    model = AutoModelForCausalLM.from_pretrained('Llama3-Chinese-8B-Instruct',device_map=device_map,torch_dtype=torch.float16,load_in_8bit=True,trust_remote_code=True)
  File "C:\Users\1\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\models\auto\auto_factory.py", line 550, in from_pretrained
    model_class = get_class_from_dynamic_module(
  File "C:\Users\1\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\dynamic_module_utils.py", line 489, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
  File "C:\Users\1\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\dynamic_module_utils.py", line 315, in get_cached_module_file
    modules_needed = check_imports(resolved_module_file)
  File "C:\Users\1\AppData\Local\Programs\Python\Python312\Lib\site-packages\transformers\dynamic_module_utils.py", line 180, in check_imports
    raise ImportError(
ImportError: This modeling file requires the following packages that were not found in your environment: flash_attn. Run `pip install flash_attn`
```
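The `ImportError` comes from transformers' `check_imports()`, which calls `importlib.import_module` on every package the downloaded modeling file imports. A possible workaround, assuming the repo's remote code only *imports* `flash_attn` and never actually calls into it when FlashAttention is disabled (an assumption, not verified against this repo's code): register a stub module so the import check passes, then load with standard attention. `stub_missing_package` is a hypothetical helper name.

```python
import sys
import types

def stub_missing_package(name: str) -> None:
    """Register an empty placeholder module so `import name` succeeds.

    importlib.import_module consults sys.modules first, so a bare
    ModuleType entry is enough to satisfy transformers' check_imports().
    """
    if name not in sys.modules:
        sys.modules[name] = types.ModuleType(name)

stub_missing_package("flash_attn")
```

With the stub in place, call `from_pretrained(..., attn_implementation="eager")` (available in recent transformers releases) so the FlashAttention code path is never taken. If the remote code does call into `flash_attn` anyway, this will fail later with an `AttributeError` instead.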
Following that hint, I tried to install flash_attn:

```
PS D:\tools\100AIGC\Llama-Chinese> pip install flash-attn
Collecting flash-attn
  Using cached flash_attn-2.5.7.tar.gz (2.5 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error

  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "C:\Users\1\AppData\Local\Programs\Python\Python312\Lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py", line 353, in
      note: This error originates from a subprocess, and is likely not a problem with pip.
      error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
```
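This "Getting requirements to build wheel" failure typically happens because flash-attn's `setup.py` imports torch, which is not present inside pip's isolated build environment. The flash-attn README recommends installing the build prerequisites first and disabling build isolation; a sketch (note that the project's prebuilt wheels target Linux with CUDA, so a source build may still fail on Windows):

```shell
# Prerequisites that flash-attn's setup imports at build time
pip install torch packaging ninja
# Build against the already-installed torch (per the flash-attn README)
pip install flash-attn --no-build-isolation
```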
I asked GPT about it, and it said this is a Python version problem. I'm on Python 3.12.
This reproduces on the Mac M1 platform as well. Even after commenting out `use_flash_attention_2=True`, it still complains that the `flash_attn` package is missing.
How do I get this working on a Mac M1?
Same question: how to do this on a Mac M1?
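On Apple Silicon there is no CUDA, so flash-attn cannot be installed at all; the practical route is to satisfy the bare import that transformers' `check_imports()` performs with a stub module and run the model on the Metal (`mps`) backend instead. A minimal sketch, where `pick_device` is a hypothetical helper that at runtime you would feed `torch.backends.mps.is_available()`:

```python
import sys
import types

# flash_attn cannot be built on Apple Silicon, so register an empty
# placeholder module to get past the remote code's import check.
sys.modules.setdefault("flash_attn", types.ModuleType("flash_attn"))

def pick_device(mps_available: bool) -> str:
    """Hypothetical helper: prefer torch's Metal backend when present."""
    return "mps" if mps_available else "cpu"
```

Then load the model with `attn_implementation="eager"` and move it with `.to(pick_device(torch.backends.mps.is_available()))`. `load_in_8bit=True` likely has to be dropped as well, since bitsandbytes quantization expects a CUDA GPU.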
I managed to download flash_attn, but I'm on a V100, which doesn't support it. After changing `True` to `False` it gets through, it's just extremely slow.
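The slowdown is expected: turning FlashAttention off falls back to the standard attention path. FlashAttention-2 requires an Ampere-or-newer NVIDIA GPU (compute capability 8.0+), and the V100 is compute capability 7.0, so it can never use it regardless of installation. That check can be sketched with a hypothetical helper mirroring transformers' `attn_implementation` argument:

```python
def pick_attn_implementation(compute_capability: tuple) -> str:
    """Hypothetical helper: FlashAttention-2 needs compute capability
    >= 8.0 (Ampere or newer); older GPUs such as the V100 (7.0) must
    fall back to "eager" attention, which is correct but slower."""
    return "flash_attention_2" if tuple(compute_capability) >= (8, 0) else "eager"
```

At runtime the capability tuple comes from `torch.cuda.get_device_capability()`, which returns `(major, minor)` for the current GPU.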