Chinese-LLaMA-Alpaca
Error when deploying with text-generation-webui: Can't find config.json at 'loras\['chinese-alpaca-lora-13b']'
The loras directory never contains a config.json in the first place, so why is this error raised?

```
(llama.cpp-master) d:\tools\text-generation-webui>python server.py --model llama_13b_hf --lora chinese-alpaca-lora-13b --gpu-memory 6
Gradio HTTP request redirected to localhost :)
Loading llama_13b_hf...
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 3/3 [03:36<00:00, 72.13s/it]
Loaded the model in 218.80 seconds.
Applying the following LoRAs to llama_13b_hf: chinese-alpaca-lora-13b
---------['chinese-alpaca-lora-13b']-----------
Traceback (most recent call last):
  File "D:\python3\llama.cpp-master\lib\site-packages\peft\utils\config.py", line 99, in from_pretrained
    config_file = hf_hub_download(pretrained_model_name_or_path, CONFIG_NAME)
  File "D:\python3\llama.cpp-master\lib\site-packages\huggingface_hub\utils\_validators.py", line 112, in _inner_fn
    validate_repo_id(arg_value)
  File "D:\python3\llama.cpp-master\lib\site-packages\huggingface_hub\utils\_validators.py", line 157, in validate_repo_id
    raise HFValidationError(f"Repo id must be a string, not {type(repo_id)}: '{repo_id}'.")
huggingface_hub.utils._validators.HFValidationError: Repo id must be a string, not <class 'pathlib.WindowsPath'>: 'loras\['chinese-alpaca-lora-13b']'.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "d:\tools\text-generation-webui\server.py", line 919, in <module>
```
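Not a definitive diagnosis, but the malformed path in the traceback (`loras\['chinese-alpaca-lora-13b']`) is exactly what you get if the *list* of `--lora` names is stringified and joined onto the `loras` directory as a single component, instead of joining each name individually. A minimal sketch of the difference (variable names are hypothetical):

```python
from pathlib import Path

# Hypothetical reproduction of the bad path seen in the traceback:
# joining str(list) appends the list's repr as one path component.
lora_names = ["chinese-alpaca-lora-13b"]

bad_path = Path("loras") / str(lora_names)   # loras/['chinese-alpaca-lora-13b']
good_path = Path("loras") / lora_names[0]    # loras/chinese-alpaca-lora-13b

# On Windows the separator is "\", which matches the error message exactly.
print(bad_path)
print(good_path)
```

Since `loras/['chinese-alpaca-lora-13b']` does not exist on disk, peft falls through to treating it as a Hugging Face repo id, and `validate_repo_id` then rejects the `WindowsPath` object, producing the `HFValidationError` above.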
Same question here.
Which version of peft are you using?
It worked for me after I merged the LoRA into the base model and ran it again.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.
Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.