How to fine-tune Mistral-7B with FastChat?
When I use FastChat to fine-tune Llama-2, everything works fine. But when I try to fine-tune Mistral, it fails with a "transformer layer not found" error. I believe the root cause is that `--fsdp_transformer_layer_cls_to_wrap` should not be set to `LlamaDecoderLayer` for Mistral. But what should it be set to? Hope you can solve my problem. Many thanks!
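
For anyone hitting the same error: one way to find the right class name is to load the checkpoint and inspect its decoder blocks, since the flag expects the class name of the transformer layer to wrap. A minimal sketch, assuming you can load the model locally and that FastChat passes this flag straight through to the Hugging Face Trainer's FSDP wrapping:

```python
from transformers import AutoModelForCausalLM

# Load the Mistral checkpoint just to inspect its architecture;
# low_cpu_mem_usage avoids materializing a second full copy in RAM.
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1", low_cpu_mem_usage=True
)

# The class name of one decoder block is the value expected by
# --fsdp_transformer_layer_cls_to_wrap.
print(type(model.model.layers[0]).__name__)  # prints "MistralDecoderLayer"
```

In the transformers implementation of Mistral, the decoder block class is `MistralDecoderLayer`, so that is presumably the value to pass instead of `LlamaDecoderLayer`.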