
{'loss': 4.3273, 'learning_rate': 0.0, 'epoch': 0.13}

Open chenzk1993 opened this issue 1 year ago • 3 comments

With the default parameters, the learning rate becomes 0?

chenzk1993 avatar Apr 02 '23 06:04 chenzk1993

I encountered the same problem. Try adjusting the micro batch size to a larger value.
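For context on what the micro batch size controls: in alpaca-lora's finetune.py, the gradient accumulation steps are derived from the global and micro batch sizes (the values below are the script's defaults as I understand them; treat them as assumptions). A minimal sketch:

```python
# Sketch of how alpaca-lora's finetune.py derives gradient accumulation.
# batch_size and micro_batch_size defaults are assumptions from the script.
batch_size = 128        # effective (global) batch size per optimizer step
micro_batch_size = 4    # examples per forward/backward pass

gradient_accumulation_steps = batch_size // micro_batch_size
print(gradient_accumulation_steps)  # 32

# Raising micro_batch_size (e.g. to 16) reduces the accumulation steps,
# using more GPU memory, while the effective batch size stays the same.
print(batch_size // 16)  # 8
```

Note that this changes how many micro-steps make up one optimizer step, not the effective batch size itself.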

victorzhz111 avatar Apr 04 '23 05:04 victorzhz111

@chenzk1993 the learning rate is not 0; it is just a very small value. The log rounds it, so values below the displayed precision show up as 0.0.
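To illustrate: with a linear warmup schedule (HF Trainer's default "linear" scheduler warms up from 0), early-step learning rates are tiny, and rounding them for a log line hides them entirely. A minimal sketch with assumed hyperparameters (base_lr, warmup_steps, total_steps are illustrative, not the repo's exact values):

```python
# Why a tiny learning rate can display as 0.0 in the training log.
# Linear warmup followed by linear decay, mirroring HF's "linear" schedule.
def linear_warmup_lr(step, base_lr=3e-4, warmup_steps=100, total_steps=1200):
    """Return the scheduled learning rate at a given optimizer step."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

lr = linear_warmup_lr(step=5)      # early in warmup
print(lr)                          # 1.5e-05 -- nonzero
print(round(lr, 1))                # 0.0 -- what a rounded log line shows
```

So a logged `'learning_rate': 0.0` early in training can simply be a warmup value too small to survive the rounding.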

LiuPearl1 avatar Apr 07 '23 09:04 LiuPearl1

This may be due to the hardware. On some GPUs, the quantized model is not compatible with fp16. You can try setting fp16=False.
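A sketch of where that flag goes, assuming the standard Hugging Face Trainer setup (the output_dir value is illustrative; fp16 and bf16 are real TrainingArguments parameters):

```python
# Hedged sketch: disabling half precision in the TrainingArguments
# passed to the HF Trainer, for hardware where the quantized model
# misbehaves under fp16.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./lora-alpaca",  # illustrative path
    fp16=False,                  # disable fp16 mixed precision
    # bf16=True,                 # optional alternative on hardware that supports it
)
```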

lyccyl1 avatar Dec 11 '23 02:12 lyccyl1