LLMLingua
[Question]: How to use a manually downloaded model
Describe the issue
Due to a slow network, LLMLingua failed to download the model from Hugging Face at runtime.
requests.exceptions.ReadTimeout: (ReadTimeoutError("HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Read timed out. (read timeout=10)"), '(Request ID: a664234f-7f09-4a97-966a-aafabde074a1)')
model-00001-of-00002.safetensors: 0%| | 0.00/9.98G [05:17<?, ?B/s]
Is it possible to download the model manually in advance and have LLMLingua load it from a local path? If so, how do I do it?
Thanks in advance!
Hi @Dorish,
Absolutely, you can follow the HF documentation to first download the model to your local machine, and then use the local model path as the model_name when passing it into LLMLingua.
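For example, here is a minimal sketch (assuming the default `NousResearch/Llama-2-7b-hf` model and `huggingface_hub.snapshot_download`; substitute whichever model and download method you prefer, e.g. `huggingface-cli download`):

```python
# Sketch: download the model once, then point LLMLingua at the local copy.
# Assumption: NousResearch/Llama-2-7b-hf is used here only as an example repo.
from huggingface_hub import snapshot_download
from llmlingua import PromptCompressor

local_dir = snapshot_download(
    repo_id="NousResearch/Llama-2-7b-hf",
    local_dir="./models/Llama-2-7b-hf",  # where the files are stored locally
)

# Pass the local directory (ideally an absolute path) as model_name.
llm_lingua = PromptCompressor(model_name=local_dir)
```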
Hi @iofu728, it is not working when I pass a local model path to LLMLingua. What's wrong?
It raises an error:
I meet the same problem, @iofu728. I think PromptCompressor may not know whether model_name is local or remote.
llm_lingua = PromptCompressor(model_name=qwen_path)
ConnectionError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7f07280e31f0>: Failed to establish a new connection: [Errno 101] Network is unreachable'))
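Note: this particular error comes from tiktoken trying to fetch the cl100k_base encoding, not from the model files themselves. One possible offline workaround (a sketch, not an official LLMLingua feature; it assumes current tiktoken releases honor the `TIKTOKEN_CACHE_DIR` environment variable and name cached files by the SHA-1 of the source URL) is to download `cl100k_base.tiktoken` on a machine with network access and place it in a local cache:

```python
# Sketch of an offline workaround for the cl100k_base.tiktoken download.
# Assumption: tiktoken caches encodings under TIKTOKEN_CACHE_DIR, keyed by
# the SHA-1 hash of the blob URL.
import hashlib
import os
import shutil

url = "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken"
cache_dir = "./tiktoken_cache"
os.makedirs(cache_dir, exist_ok=True)

# Copy a manually downloaded cl100k_base.tiktoken to the hashed cache name.
cache_key = hashlib.sha1(url.encode()).hexdigest()
shutil.copy("cl100k_base.tiktoken", os.path.join(cache_dir, cache_key))

# Point tiktoken at the local cache *before* constructing PromptCompressor.
os.environ["TIKTOKEN_CACHE_DIR"] = cache_dir
```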
Have you solved this problem? I want to use LLMLingua-2 locally, so I downloaded the model, but when I set model_name it still hits the /encodings/cl100k_base.tiktoken problem.
I run it smoothly using a local model with an absolute path. You'd better check for other problems instead of focusing on this project's code.
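For reference, a minimal sketch of what that looks like (the directory name below is just a placeholder for wherever your downloaded model lives):

```python
# Sketch: use an absolute path so PromptCompressor clearly sees a local directory.
import os
from llmlingua import PromptCompressor

local_model_path = os.path.abspath("./models/Llama-2-7b-hf")  # hypothetical local dir
llm_lingua = PromptCompressor(model_name=local_model_path)
```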