lightllm
[FEATURE] Load model directly from huggingface
Thanks for the project! We want to run lightllm directly in a cloud container environment, where providing a `model_dir` is harder than providing a Hugging Face model ID and letting the library handle model downloading and loading itself, as vLLM and TGI do via the huggingface_hub library. Hopefully this is not too complicated a code change, but it would boost the portability of the project by a lot!
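A minimal sketch of what this could look like, assuming huggingface_hub is available; the helper name `resolve_model_dir` is hypothetical and not part of lightllm's API:

```python
import os

def resolve_model_dir(model: str) -> str:
    """Return `model` unchanged if it is an existing local directory;
    otherwise treat it as a Hugging Face model ID and download a
    snapshot of the repo, returning the local cache path."""
    if os.path.isdir(model):
        return model
    # Lazy import so local-path usage works without huggingface_hub.
    from huggingface_hub import snapshot_download
    return snapshot_download(repo_id=model)
```

With a wrapper like this in front of the existing loading code, `--model_dir meta-llama/Llama-2-7b-hf` and `--model_dir /data/models/llama` could both just work.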