
Question on Finetune

Open xjmpeter opened this issue 2 years ago • 1 comment

Hi , @Michaelvll

Thank you for releasing the code and model.

I want to fine-tune LLaMA on my local datasets using the FastChat framework.

At the beginning of fine-tuning, the loss is 0.53. Around iteration (step) 100, the loss spikes to nearly 13, then decreases steadily.

Is this normal or abnormal?

xjmpeter avatar Apr 08 '23 10:04 xjmpeter

So long as the loss trends downward, it should be fine. You could try tweaking the learning rate to see if you get better behavior.
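For instance, a common way to tame an early loss spike is to lower the peak learning rate and ramp it up over a warmup phase. Below is a minimal sketch of a linear-warmup plus cosine-decay schedule in plain Python; this is not FastChat's actual trainer, and the peak learning rate and step counts are illustrative values you would tune for your own dataset:

```python
import math

def lr_at_step(step, total_steps=1000, warmup_steps=100, peak_lr=2e-5):
    """Linear warmup to peak_lr, then cosine decay toward 0."""
    if step < warmup_steps:
        # During warmup the LR stays small, which damps early loss spikes.
        return peak_lr * (step + 1) / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(lr_at_step(0))     # tiny LR on the very first step
print(lr_at_step(99))    # reaches peak_lr at the end of warmup
print(lr_at_step(1000))  # decayed to ~0 by the last step
```

In practice, frameworks built on the Hugging Face `Trainer` (as FastChat's training script is) expose this via arguments such as `--learning_rate` and `--warmup_ratio`, so you rarely implement the schedule by hand.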

Aemon-Algiz avatar Apr 08 '23 10:04 Aemon-Algiz

Closing, as this is not related to the development of the repo.

Please try to figure out the appropriate hyperparameters for your own dataset. Some HPO (hyperparameter optimization) is needed!
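As a hypothetical sketch of such HPO, one can grid-search learning rates and keep the one with the lowest final loss. Here `train_and_eval` is a made-up stand-in for a real short fine-tuning run, with a toy loss surface purely for illustration:

```python
def train_and_eval(lr):
    """Stand-in for a real fine-tuning run; returns a mock final loss.
    In practice this would launch a short FastChat training job and
    report the evaluation loss."""
    # Toy loss surface with a minimum near lr = 2e-5 (illustrative only).
    return abs(lr - 2e-5) / 2e-5 + 0.5

# Grid of candidate learning rates to try.
candidates = [5e-6, 1e-5, 2e-5, 5e-5, 1e-4]
results = {lr: train_and_eval(lr) for lr in candidates}
best_lr = min(results, key=results.get)
print(f"best learning rate: {best_lr}")
```

The same loop generalizes to other hyperparameters (warmup ratio, batch size, number of epochs) by searching over tuples instead of single values.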

zhisbug avatar Apr 21 '23 02:04 zhisbug