juanps90

12 comments by juanps90

I understand that alpha=2 should still allow for a 4.5k or 5k token length (which it was failing to do), right? Also, I wonder what the relationship between alpha_value and max_seq_len...
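For anyone landing here with the same question, a minimal sketch of the relationship being asked about: NTK-aware scaling raises the RoPE base by a power of alpha, which is why alpha=2 is usually paired with roughly double the model's native context. This follows the commonly cited NTK-aware RoPE formula; the exact constants inside any particular loader may differ.

```python
# Hedged sketch of NTK-aware RoPE scaling (not exllama's exact code).
def ntk_scaled_base(alpha: float, base: float = 10000.0, head_dim: int = 128) -> float:
    """Scale the rotary embedding base by alpha^(d / (d - 2)).

    Raising the base lowers every rotary frequency, so positions
    beyond the native training length still map to familiar angles.
    """
    return base * alpha ** (head_dim / (head_dim - 2))

# Rule of thumb: alpha ~= target_len / native_len, so alpha = 2 on a
# 4k-native model is usually paired with max_seq_len around 8k.
print(ntk_scaled_base(alpha=2.0))  # ~20221 for head_dim 128
```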

Thank you for your reply. Yes, the LoRA is freshly trained on v2 and works great up to 4k. Have you tried using a LoRA with NTK scaling and exllama?
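For context, here is a rough sketch of what combining a LoRA with NTK alpha scaling in exllama might look like. The class and method names (ExLlamaConfig.alpha_value, calculate_rotary_embedding_base, ExLlamaLora) are based on my reading of the repo's example scripts and may differ across versions; all paths are placeholders.

```python
# Hedged sketch: load a model with NTK alpha scaling, then attach a LoRA.
# Names follow exllama's example scripts as I remember them; double-check
# against your checkout before relying on this.
from model import ExLlama, ExLlamaCache, ExLlamaConfig
from tokenizer import ExLlamaTokenizer
from generator import ExLlamaGenerator
from lora import ExLlamaLora

config = ExLlamaConfig("/models/llama-v2/config.json")    # placeholder path
config.model_path = "/models/llama-v2/model.safetensors"  # placeholder path
config.max_seq_len = 8192         # target context length
config.alpha_value = 2.0          # NTK alpha; pairs with ~2x native length
config.calculate_rotary_embedding_base()  # re-derive RoPE base from alpha

model = ExLlama(config)
tokenizer = ExLlamaTokenizer("/models/llama-v2/tokenizer.model")
cache = ExLlamaCache(model)
generator = ExLlamaGenerator(model, tokenizer, cache)

# Attach the LoRA; the open question above is whether its 4k-trained
# weights still behave once NTK scaling pushes positions past 4k.
lora = ExLlamaLora(model, "/loras/my-lora/adapter_config.json",
                   "/loras/my-lora/adapter_model.bin")
generator.lora = lora

print(generator.generate_simple("Hello", max_new_tokens=20))
```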