
How to fine-tune Mistral-7B with FastChat?

Open Jerry-hyl opened this issue 7 months ago • 0 comments

When I use FastChat to fine-tune Llama 2, everything works fine. But when I try to fine-tune Mistral, it fails with "transformer layer not found". I understand the main cause is that `--fsdp_transformer_layer_cls_to_wrap` should not be set to `LlamaDecoderLayer` for Mistral. But what should it be set to? I hope you can help. Many thanks!

Jerry-hyl, Jul 22 '24 10:07