
AttributeError: module 'llama' has no attribute 'LLaMATokenizer' when using example

Open benjamin32561 opened this issue 1 year ago • 7 comments

I am using Google Colab to run your example code. When I run tokenizer = llama.LLaMATokenizer.from_pretrained(MODEL) I get this error; I tried running the same notebook on my laptop, and the same error occurs.

benjamin32561 avatar Mar 13 '23 10:03 benjamin32561

me too

srulik-ben-david avatar Mar 13 '23 13:03 srulik-ben-david

installing the environment as described in the facebook repo solves the problem: https://github.com/facebookresearch/llama

kriskrisliu avatar Mar 13 '23 15:03 kriskrisliu

That's right! Thanks.

srulik-ben-david avatar Mar 13 '23 19:03 srulik-ben-david

@kriskrisliu But it still requires receiving the weights from Meta via the submission form, am I right?

lovodkin93 avatar Apr 13 '23 09:04 lovodkin93

I am having the same problem. Any solution?

subhashree303 avatar May 22 '23 14:05 subhashree303

installing the environment as described in the facebook repo solves the problem: https://github.com/facebookresearch/llama

I tried but it doesn't.

subhashree303 avatar May 22 '23 14:05 subhashree303

It is because of the imports; they seem messed up in the current version of this repo. Change your import llama to from llama.tokenization_llama import LLaMATokenizer and from llama.modeling_llama import LLaMAForCausalLM.

And then use it like this (without the llama prefix before each class name): tokenizer = LLaMATokenizer.from_pretrained(MODEL) and model = LLaMAForCausalLM.from_pretrained(MODEL, low_cpu_mem_usage=True)

SilacciA avatar Jun 07 '23 22:06 SilacciA
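Taken together, the corrected imports and calls from the comment above can be sketched as follows. This is a minimal, hedged example: the MODEL path is a hypothetical placeholder, and it assumes the repo's llama package is installed and that you have already obtained the converted LLaMA weights from Meta.

```python
# Sketch of the corrected import fix suggested in this thread.
# Assumes the repo's llama package is installed and MODEL points to
# a local directory containing converted LLaMA weights (hypothetical path).
try:
    from llama.tokenization_llama import LLaMATokenizer
    from llama.modeling_llama import LLaMAForCausalLM
except ImportError:
    # If this fails, set up the environment as described in
    # https://github.com/facebookresearch/llama first.
    raise SystemExit("llama package not installed")

MODEL = "/path/to/llama-7b"  # hypothetical checkpoint directory

# Note: no "llama." prefix on the class names, since the classes
# were imported directly from their submodules.
tokenizer = LLaMATokenizer.from_pretrained(MODEL)
model = LLaMAForCausalLM.from_pretrained(MODEL, low_cpu_mem_usage=True)
```

The key point is importing the classes from their submodules (llama.tokenization_llama and llama.modeling_llama) rather than relying on the top-level llama namespace re-exporting them, which it apparently does not in the repo version discussed here.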