qingying6

Results: 1 issue by qingying6

**There was never a config.json file in loras to begin with, so why does this error occur??**

(llama.cpp-master) d:\tools\text-generation-webui>python server.py --model llama_13b_hf --lora chinese-alpaca-lora-13b --gpu-memory 6
Gradio HTTP request redirected to localhost :)
Loading llama_13b_hf...
Loading checkpoint shards: 100%|█████████████████████████████████████████████████████████| 3/3 [03:36

stale
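
The issue above is about a LoRA adapter folder rather than a base model folder. As a point of reference, here is a minimal sketch of how such an adapter is commonly applied with the Hugging Face peft library (which text-generation-webui builds on for LoRA loading); the paths are placeholders taken from the command shown and may not match the reporter's actual layout. Note that peft reads adapter_config.json from the LoRA directory, while config.json belongs to the base model directory.

```python
# Minimal sketch: applying a LoRA adapter to a base model with peft.
# Paths below are assumptions for illustration, based on the command in the issue.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_path = "models/llama_13b_hf"        # base model dir: needs config.json + weight shards
lora_path = "loras/chinese-alpaca-lora-13b"    # LoRA dir: needs adapter_config.json + adapter weights

# Load the base model; this is the step that requires config.json.
model = AutoModelForCausalLM.from_pretrained(base_model_path)

# Apply the LoRA adapter; peft reads adapter_config.json here, not config.json.
model = PeftModel.from_pretrained(model, lora_path)

tokenizer = AutoTokenizer.from_pretrained(base_model_path)
```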