stanford_alpaca
Exception: Could not find the transformer layer class to wrap in the model.
Traceback (most recent call last):
File "/root/train.py", line 231, in
transformers was installed from https://github.com/huggingface/transformers/pull/21955
same issue here
I wonder if you can run these two lines in Python, as instructed by https://github.com/huggingface/transformers/pull/21955:
tokenizer = transformers.LLaMATokenizer.from_pretrained("/output/path/tokenizer/")
model = transformers.LLaMAForCausalLM.from_pretrained("/output/path/llama-7b/")
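One thing worth checking before blaming the training script: whether the transformers build you actually have installed still exposes those class names at all. The forked PR used the casing `LLaMATokenizer`/`LLaMAForCausalLM`, while the version later merged into transformers renamed the classes (e.g. `LlamaTokenizer`). Below is a minimal self-contained sketch of that check — `FakeTransformers` is a stand-in so the snippet runs without transformers installed; in practice you would pass the real `transformers` module to `resolve`:

```python
class FakeTransformers:
    # Stand-in for the installed transformers module; the merged
    # release exposes this casing, not the fork's "LLaMATokenizer".
    class LlamaTokenizer: ...

def resolve(mod, *names):
    """Return the first name among `names` that `mod` actually exposes."""
    for name in names:
        if getattr(mod, name, None) is not None:
            return name
    raise AttributeError(f"none of {names} found in {mod!r}")

# Try the fork's casing first, then the merged casing.
print(resolve(FakeTransformers, "LLaMATokenizer", "LlamaTokenizer"))  # → LlamaTokenizer
```

If the fork's casing raises `AttributeError` against your real install, the two lines above will fail for the same reason, independent of the FSDP error.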
I found that I couldn't, so I suspect it is caused by the installation of that forked transformers build...
Found a possible solution here: https://github.com/tatsu-lab/stanford_alpaca/issues/58#issuecomment-1472042086
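For context on why the exception appears at all: the Trainer's FSDP auto-wrap resolves the layer class to wrap by name, and raises this exact message when no module class in the model matches. Since the fork named the decoder layer `LLaMADecoderLayer` and the merged transformers renamed it `LlamaDecoderLayer`, passing the old name finds nothing. This is a simplified self-contained sketch of that lookup, not the actual transformers code:

```python
def find_layer_class(model_module_names, cls_name):
    """Return cls_name if some module class in the model matches it;
    otherwise raise the error quoted at the top of this issue."""
    if cls_name in model_module_names:
        return cls_name
    raise Exception("Could not find the transformer layer class to wrap in the model.")

# Module class names as exposed by the merged transformers release.
modules = {"LlamaModel", "LlamaDecoderLayer", "LlamaAttention"}

print(find_layer_class(modules, "LlamaDecoderLayer"))  # new casing matches
# find_layer_class(modules, "LLaMADecoderLayer") would raise the exception above.
```

So if your checkpoint/config still references the fork's casing, updating it to the merged class names (as the linked comment suggests) makes the wrap lookup succeed.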