AttributeError: module transformers.models.llama has no attribute LLaMATokenizer
Hi, I can load the model fine via model = transformers.LLaMAForCausalLM.from_pretrained("/content/drive/MyDrive/llama-13b-hf/"), but I can't find the LLaMATokenizer class, so I'm getting the error: AttributeError: module transformers.models.llama has no attribute LLaMATokenizer
You just need to run pip install sentencepiece.
The error is silent because of this guard in the llama module's __init__.py: if sentencepiece is missing, the import is skipped without any warning.
try:
    if not is_sentencepiece_available():
        raise OptionalDependencyNotAvailable()
except OptionalDependencyNotAvailable:
    pass
else:
    from .tokenization_llama import LLaMATokenizer
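The effect of that guard can be reproduced with a few lines of plain Python (a toy module, not the actual transformers code): when the dependency check fails, the exception is swallowed, the symbol is simply never defined, and the failure only surfaces later as an AttributeError.

```python
import types

def build_module(dep_available: bool) -> types.ModuleType:
    """Mimic the guard above: define the tokenizer symbol only when
    the optional dependency is present; otherwise fail silently."""
    mod = types.ModuleType("fake_llama")
    try:
        if not dep_available:
            raise ImportError("sentencepiece is not installed")
    except ImportError:
        pass  # swallowed: importing the module still succeeds
    else:
        mod.LLaMATokenizer = object  # stand-in for the real class
    return mod

# Without the dependency, the module imports fine but the attribute
# is missing, which is exactly the AttributeError reported above.
hasattr(build_module(False), "LLaMATokenizer")  # False
```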
Hi, I installed sentencepiece but still get the same error.
You need this version of transformers; I think a recent update broke something: https://github.com/mbehm/transformers
In whatever file you're looking at, change LLaMATokenizer to LlamaTokenizer; the class was renamed (only the capitalization changed) in a recent update of transformers.
For right now, the model should be loaded with AutoModelForCausalLM and the tokenizer with AutoTokenizer, which sidesteps the naming change entirely.
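A minimal sketch of that approach, using the checkpoint directory from the question (the Auto classes look up the correct, renamed Llama classes for you, so the old LLaMATokenizer spelling never comes up):

```python
def load_llama(model_dir: str):
    """Load a Hugging Face-format LLaMA checkpoint via the Auto classes.

    `model_dir` is assumed to contain a converted checkpoint (config,
    weights, and tokenizer files). sentencepiece must still be installed
    for the tokenizer to load.
    """
    # Import inside the function so the sketch stays self-contained.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForCausalLM.from_pretrained(model_dir)
    return tokenizer, model

# Example call with the path from the question:
# tokenizer, model = load_llama("/content/drive/MyDrive/llama-13b-hf/")
```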