Chinese-LLaMA-Alpaca
Error when running with text-generation-webui: AssertionError("Torch not compiled with CUDA enabled")
I am using decapoda-research/llama-7b-hf. After running python server.py --model llama-7b-hf --lora chinese-alpaca-lora-7b, the following error appears:
raise AssertionError("Torch not compiled with CUDA enabled") AssertionError: Torch not compiled with CUDA enabled
CUDA and PyTorch are already installed. How can I fix this?
This is most likely caused by a mismatch between the CUDA and torch versions; we suggest confirming with torch.cuda.is_available().
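For example, a quick check (a minimal sketch, assuming torch is importable in the same environment that runs server.py):

import torch

print(torch.__version__)          # e.g. "1.13.1+cpu" indicates a CPU-only build
print(torch.version.cuda)         # None on CPU-only builds
print(torch.cuda.is_available())  # False means the webui cannot use the GPU

If this prints False, install a PyTorch build that matches your CUDA version (see the selector at https://pytorch.org/get-started/locally/).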
You can also run on the CPU instead, as follows:
python server.py --model llama-7b-hf --lora chinese-alpaca-lora-7b --cpu
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.
I ran into the same problem. I first manually converted the original LLaMA-7B model to HF format, then followed the documentation section "Setting up the interface with text-generation-webui" and finally ran: python server.py --model llama-7b-hf --lora chinese-alpaca-lora-7b --cpu
The program starts up fine, but when I try to chat there is no response at all, and the terminal reports an error:
raise AssertionError("Torch not compiled with CUDA enabled") AssertionError: Torch not compiled with CUDA enabled
I pulled the latest webui code and ran the same command, but could not reproduce your problem. Please check your environment, or open an issue in the webui repo. My output and results are below:
(torch1.13-cpu) [/text-generation-webui]$ python server.py --model llama-7b-hf --lora chinese-alpaca-lora-7b --cpu
INFO:Gradio HTTP request redirected to localhost :)
INFO:Loading llama-7b-hf...
Loading checkpoint shards: 100%|██████████████████████████████████████████████████████████████████████████| 2/2 [00:13<00:00, 6.74s/it]
INFO:Loaded the model in 13.70 seconds.
INFO:Applying the following LoRAs to llama-7b-hf: chinese-alpaca-lora-7b
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
Output generated in 47.67 seconds (0.99 tokens/s, 47 tokens, context 51, seed 57377600)
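If the error still occurs in your environment, one way to isolate it from webui is to load the model and the LoRA directly on the CPU. A minimal sketch, assuming transformers and peft are installed, using the local directories llama-7b-hf and chinese-alpaca-lora-7b from the commands above:

import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

# Load the tokenizer and base model from the local HF-format directory.
tokenizer = LlamaTokenizer.from_pretrained("llama-7b-hf")
base = LlamaForCausalLM.from_pretrained("llama-7b-hf", torch_dtype=torch.float32)

# Apply the Chinese-Alpaca LoRA weights on top of the base model.
model = PeftModel.from_pretrained(base, "chinese-alpaca-lora-7b")

inputs = tokenizer("你好", return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))

If this generates text without raising the CUDA assertion, the problem most likely lies in the webui configuration rather than in the model files.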
Thanks for the reply. The problem has been solved, and I am now trying it out.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.
Closing the issue since no further updates were observed. Feel free to re-open if you need any further assistance.