ColossalAI
[BUG]: AttributeError: 'LlamaForCausalLM' object has no attribute 'module'
🐛 Describe the bug
When I save the model, I get the following error:
```
Traceback (most recent call last):
  File "train_sft.py", line 190, in <module>
    train(args)
  File "train_sft.py", line 160, in train
    trainer.save_model(path=args.save_path, only_rank0=True)
  File "/home/ec2-user/ColossalAI/applications/Chat/coati/trainer/sft.py", line 157, in save_model
    self.strategy.save_model(model=self.model, path=path, only_rank0=only_rank0)
  File "/home/ec2-user/ColossalAI/applications/Chat/coati/trainer/strategies/ddp.py", line 83, in save_model
    model = model.model.module
  File "/home/ec2-user/anaconda3/envs/coati/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1269, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'LlamaForCausalLM' object has no attribute 'module'
```
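The crash happens because `save_model` unconditionally reads `.module`, which only exists on a `DistributedDataParallel` wrapper, not on a raw `LlamaForCausalLM`. A defensive unwrap along these lines avoids it (a minimal sketch of the idea; `unwrap_model` is a hypothetical helper, not necessarily the fix applied in the linked PRs):

```python
import torch.nn as nn

def unwrap_model(model: nn.Module) -> nn.Module:
    """Return the underlying module, unwrapping a DDP wrapper if present."""
    # DistributedDataParallel exposes the wrapped model as `.module`;
    # an unwrapped model (e.g. the raw LlamaForCausalLM here) does not,
    # which is what raises the AttributeError in the traceback above.
    if hasattr(model, "module"):
        return model.module
    return model
```

With this helper, saving works whether or not the model was wrapped by the DDP strategy.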
The command I used is as follows:
```shell
torchrun --standalone --nproc_per_node=8 train_sft.py \
    --pretrain "decapoda-research/llama-7b-hf" \
    --model 'llama' \
    --strategy ddp \
    --log_interval 10 \
    --save_path /model/Coati-7B \
    --dataset data/instinwild_en.json \
    --batch_size 4 \
    --accimulation_steps 1 \
    --lr 2e-5 \
    --max_epochs 3 \
    --lora_rank 8
```
Environment
No response
Hi @mynewstart, we have fixed it. Please refer to #3475 and #3334. Thanks.