
AttributeError: module transformers.models.llama has no attribute LLaMATokenizer

Open corranmac opened this issue 3 years ago • 4 comments

Hi, I can load the model fine via model = transformers.LLaMAForCausalLM.from_pretrained("/content/drive/MyDrive/llama-13b-hf/"), but I'm not finding the LLaMATokenizer, so I receive the error AttributeError: module transformers.models.llama has no attribute LLaMATokenizer

corranmac avatar Mar 09 '23 00:03 corranmac

You just need to pip install sentencepiece.

The failure is silently swallowed here:

    try:
        if not is_sentencepiece_available():
            raise OptionalDependencyNotAvailable()
    except OptionalDependencyNotAvailable:
        pass
    else:
        from .tokenization_llama import LLaMATokenizer
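
The snippet above shows why the missing dependency produces an AttributeError rather than an ImportError. A minimal, self-contained sketch of that optional-dependency pattern (not the actual transformers source; the helper names here mirror the quoted code but are reimplemented for illustration):

```python
# Sketch of the optional-dependency pattern quoted above: when the backend
# package (sentencepiece) is missing, the tokenizer import is skipped
# silently, so the attribute is simply never defined on the module.
import importlib.util


class OptionalDependencyNotAvailable(Exception):
    """Raised when an optional backend is absent."""


def is_sentencepiece_available() -> bool:
    # True only if the sentencepiece package can be found on this system.
    return importlib.util.find_spec("sentencepiece") is not None


try:
    if not is_sentencepiece_available():
        raise OptionalDependencyNotAvailable()
except OptionalDependencyNotAvailable:
    # Swallowed: the module still loads fine, but the tokenizer class is
    # never exported, which later surfaces as an AttributeError.
    pass
else:
    print("sentencepiece found; the tokenizer import would proceed")
```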

galatolofederico avatar Mar 09 '23 15:03 galatolofederico

Hi, I installed sentencepiece but I still get the same error.

chlee29 avatar Mar 13 '23 08:03 chlee29

You need this version of transformers; I think a recent update broke something: https://github.com/mbehm/transformers

Alternatively, in whatever file raises the error, change LLaMATokenizer to LlamaTokenizer.
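To cope with the rename without pinning a transformers version, one option is a small lookup helper that tries the new casing first and falls back to the old one. This is a hedged sketch; resolve_llama_tokenizer is a hypothetical helper, not part of transformers:

```python
# Hypothetical helper: resolve the LLaMA tokenizer class under either
# casing. transformers renamed LLaMATokenizer -> LlamaTokenizer, so code
# written against one casing breaks on the other.
def resolve_llama_tokenizer(module):
    """Return the tokenizer class from `module`, trying the current
    name (LlamaTokenizer) before the pre-rename one (LLaMATokenizer)."""
    for name in ("LlamaTokenizer", "LLaMATokenizer"):
        cls = getattr(module, name, None)
        if cls is not None:
            return cls
    raise AttributeError("no LLaMA tokenizer class found in module")


# Usage sketch (assumes transformers and sentencepiece are installed):
#   import transformers
#   Tokenizer = resolve_llama_tokenizer(transformers)
```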

GamerUntouch avatar Mar 17 '23 22:03 GamerUntouch

For now, the model should be loaded as AutoModelForCausalLM and the tokenizer as AutoTokenizer.
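
A sketch of that Auto-class route, wrapped in a function so the heavyweight import only happens on call. This assumes transformers and sentencepiece are installed; the checkpoint path in the usage note is the poster's and should be adjusted to your own layout:

```python
def load_llama(path: str):
    """Load a converted LLaMA checkpoint via the Auto classes, which
    dispatch to the correct model/tokenizer regardless of the
    LLaMATokenizer/LlamaTokenizer naming churn.
    Requires transformers + sentencepiece."""
    # Deferred import so merely defining this function has no dependency.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(path)
    model = AutoModelForCausalLM.from_pretrained(path)
    return model, tokenizer


# Usage (path is the poster's; adjust to your checkpoint):
# model, tokenizer = load_llama("/content/drive/MyDrive/llama-13b-hf/")
```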

discoelysiumLW avatar Mar 22 '23 23:03 discoelysiumLW