
Exception: Could not find the transformer layer class to wrap in the model.

Open Cloopen-ReLiNK opened this issue 1 year ago • 3 comments

Traceback (most recent call last):
  File "/root/train.py", line 231, in <module>
    train()
  File "/root/train.py", line 225, in train
    trainer.train()
  File "/root/anaconda3/envs/test/lib/python3.10/site-packages/transformers/trainer.py", line 1628, in train
    return inner_training_loop(
  File "/root/anaconda3/envs/test/lib/python3.10/site-packages/transformers/trainer.py", line 1715, in _inner_training_loop
    model = self._wrap_model(self.model_wrapped)
  File "/root/anaconda3/envs/test/lib/python3.10/site-packages/transformers/trainer.py", line 1442, in _wrap_model
    raise Exception("Could not find the transformer layer class to wrap in the model.")
Exception: Could not find the transformer layer class to wrap in the model.

transformers installed from https://github.com/huggingface/transformers/pull/21955
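For context, this exception is raised during the Trainer's FSDP auto-wrap step, which looks up the transformer layer class by its name string. A minimal self-contained sketch of that lookup (toy classes and a simplified helper; the real logic lives in transformers' trainer utilities), showing why a name mismatch between the Alpaca training arguments and the installed fork triggers exactly this error:

```python
class Embedding:
    """Toy submodule standing in for an embedding layer."""

class LlamaDecoderLayer:
    """Toy decoder layer; note the newer 'Llama' spelling."""

class ToyModel:
    """Stand-in for a HF model exposing its submodules."""
    def __init__(self):
        self.submodules = [Embedding(), LlamaDecoderLayer(), LlamaDecoderLayer()]

def get_layer_class_by_name(model, name):
    """Simplified version of the Trainer's wrap-class resolution:
    search the model's submodules for a class whose name matches."""
    for module in model.submodules:
        if type(module).__name__ == name:
            return type(module)
    raise Exception("Could not find the transformer layer class to wrap in the model.")

model = ToyModel()
# The spelling that matches the installed classes resolves fine:
print(get_layer_class_by_name(model, "LlamaDecoderLayer").__name__)
# An older spelling (e.g. "LLaMADecoderLayer") would raise the exception above,
# which is the failure mode seen in this issue.
```

If the fork from the PR renamed the LLaMA classes (e.g. `LLaMADecoderLayer` to `LlamaDecoderLayer`), a training command still passing the old spelling would fail this lookup.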

Cloopen-ReLiNK avatar Mar 19 '23 11:03 Cloopen-ReLiNK

same issue here

Vivecccccc avatar Mar 19 '23 16:03 Vivecccccc

I wonder if you could execute these two lines with Python, as instructed by https://github.com/huggingface/transformers/pull/21955:

tokenizer = transformers.LLaMATokenizer.from_pretrained("/output/path/tokenizer/")
model = transformers.LLaMAForCausalLM.from_pretrained("/output/path/llama-7b/")

I found that I couldn't, so I suspect the problem comes from the installation of that forked transformers...

Vivecccccc avatar Mar 19 '23 16:03 Vivecccccc

Found a possible solution here: https://github.com/tatsu-lab/stanford_alpaca/issues/58#issuecomment-1472042086
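As I understand the linked fix (hedged; verify the class name against your installed transformers), the idea is to make the FSDP wrap argument match the class name the installed code actually defines, e.g. in the torchrun command from the Alpaca README:

```shell
# If your installed transformers exposes LlamaDecoderLayer (newer spelling),
# pass that name instead of the older LLaMADecoderLayer:
torchrun --nproc_per_node=4 train.py \
    --fsdp "full_shard auto_wrap" \
    --fsdp_transformer_layer_cls_to_wrap 'LlamaDecoderLayer' \
    ...  # remaining arguments unchanged
```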

Vivecccccc avatar Mar 19 '23 23:03 Vivecccccc