
Error found when using hf api

Open · Jiopro opened this issue 9 months ago · 1 comment

When I tried to call:

from nano_llm import NanoLLM

llm = NanoLLM.from_pretrained(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    api='hf',
    api_token='mytoken',
    quantization='q4f16_ft',
)

I got:

Traceback (most recent call last):
  File "/root/nanollm.py", line 6, in <module>
    llm = NanoLLM.from_pretrained(
  File "/opt/NanoLLM/nano_llm/nano_llm.py", line 74, in from_pretrained
    model = HFModel(model_path, **kwargs)
  File "/opt/NanoLLM/nano_llm/models/hf.py", line 19, in __init__
    super(HFModel, self).__init__(**kwargs)
TypeError: NanoLLM.__init__() missing 1 required positional argument: 'model_path'
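The traceback shows a common Python pattern: a subclass accepts a required positional argument but never forwards it to its parent's __init__. Here is a minimal standalone sketch of that pattern (Base and Child are illustrative names, not NanoLLM's actual classes):

# Minimal sketch of the failure: the subclass takes model_path
# but only forwards **kwargs to the parent initializer.
class Base:
    def __init__(self, model_path, **kwargs):
        self.model_path = model_path

class Child(Base):
    def __init__(self, model_path, **kwargs):
        super().__init__(**kwargs)  # model_path is dropped here

Child("TinyLlama/TinyLlama-1.1B-Chat-v1.0")
# TypeError: Base.__init__() missing 1 required positional argument: 'model_path'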

Here is the original code from nano_llm/models/hf.py:

class HFModel(NanoLLM):
    """
    Huggingface Transformers model
    """
    def __init__(self, model_path, load=True, init_empty_weights=False, **kwargs):
        """
        Initializer
        """
        super(HFModel, self).__init__(**kwargs)

The issue is that the model_path argument is not passed on to the superclass __init__ method: HFModel accepts it, but only forwards **kwargs. The corrected code snippet ensures that model_path is passed to the NanoLLM class during initialization. Here is the corrected version of the class definition:

class HFModel(NanoLLM):
    """
    Huggingface Transformers model
    """
    def __init__(self, model_path, load=True, init_empty_weights=False, **kwargs):
        """
        Initializer
        """
        super(HFModel, self).__init__(model_path, **kwargs)
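With that fix applied, the call from the top of this issue should get past initialization. Here is a minimal sketch to confirm it; the generate() usage follows the NanoLLM README and may behave differently with the 'hf' backend, and the token is a placeholder:

from nano_llm import NanoLLM

llm = NanoLLM.from_pretrained(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    api='hf',
    api_token='mytoken',   # placeholder Hugging Face token
)

# stream the generated tokens as they arrive
response = llm.generate("Once upon a time,", max_new_tokens=32)
for token in response:
    print(token, end='', flush=True)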

Hope this helps!

Jiopro · May 20 '24 08:05