alpaca-lora
{'loss': 4.3273, 'learning_rate': 0.0, 'epoch': 0.13}
With the default parameters, the learning rate becomes 0?
I encountered the same problem. Try adjusting the micro batch size to a larger number.
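A minimal sketch of what that could look like, assuming the repo's `finetune.py` is on the path and keeps its usual `train(...)` entry point (the parameter names and the model/data paths below are assumptions, not values from this thread):

```python
# Hedged sketch: in finetune.py, gradient accumulation is derived from
# batch_size // micro_batch_size, so raising micro_batch_size (if memory allows)
# reduces the number of accumulation steps per optimizer update.
from finetune import train  # assumes alpaca-lora's finetune.py is importable

train(
    base_model="decapoda-research/llama-7b-hf",  # placeholder base model
    data_path="alpaca_data.json",                # placeholder dataset
    output_dir="./lora-alpaca",
    batch_size=128,
    micro_batch_size=8,  # default is 4; try a larger value here
)
```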
@chenzk1993 the learning rate is not actually 0; it is just a very small value. The logged number is rounded to a few decimal places, so the exact value cannot be seen in the output.
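As an illustration only (the exact rounding precision of the trainer's log is an assumption here), a tiny learning rate near the end of a decay schedule can print as 0.0 even though it is nonzero:

```python
# A learning rate late in a linear/cosine decay can be far below the log precision.
lr = 3e-4 / 100000          # hypothetical very small learning rate
print(round(lr, 4))         # -> 0.0, although the true value is 3e-09
```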
This may be due to hardware reasons. On some hardware, the quantized model is not compatible with fp16. You can try setting fp16=False.
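A minimal sketch of that change, assuming the training loop builds a `transformers.TrainingArguments` the way alpaca-lora's `finetune.py` does (the other argument values below are placeholders):

```python
import transformers

# Disable mixed-precision training if the quantized model misbehaves on this hardware.
training_args = transformers.TrainingArguments(
    output_dir="./lora-alpaca",          # placeholder output path
    per_device_train_batch_size=4,
    gradient_accumulation_steps=32,
    learning_rate=3e-4,
    fp16=False,                          # was fp16=True by default in the script
)
```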