huilong-chen

2 issues by huilong-chen

### Is there an existing issue for this?

- [X] I have searched the existing issues

### Current Behavior

I have added the following code starting at line 850 of p-tuning/modeling_chatglm.py: `if self.pre_seq_len is not None: for param in self.parameters():...`
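The snippet in the excerpt is cut off, but it follows the common p-tuning v2 pattern: when a prefix length (`pre_seq_len`) is configured, freeze every backbone parameter and train only the prefix encoder. The sketch below is a hypothetical reconstruction with a toy model, not the actual ChatGLM code; the class and method names are assumptions for illustration.

```python
import torch.nn as nn


class ToyPrefixModel(nn.Module):
    """Minimal stand-in for a model with an optional prefix encoder (p-tuning v2 style)."""

    def __init__(self, pre_seq_len=None):
        super().__init__()
        self.pre_seq_len = pre_seq_len
        self.backbone = nn.Linear(8, 8)  # stands in for the full transformer
        if pre_seq_len is not None:
            self.prefix_encoder = nn.Embedding(pre_seq_len, 8)

    def freeze_for_prefix_tuning(self):
        # If a prefix length is configured, first freeze every parameter...
        if self.pre_seq_len is not None:
            for param in self.parameters():
                param.requires_grad = False
            # ...then re-enable gradients only on the prefix encoder,
            # so only the prefix embeddings receive updates during training.
            for param in self.prefix_encoder.parameters():
                param.requires_grad = True


model = ToyPrefixModel(pre_seq_len=4)
model.freeze_for_prefix_tuning()
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the prefix encoder's parameters remain trainable
```

Freezing first and then selectively unfreezing keeps the logic robust even as the backbone grows new submodules.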

![image](https://github.com/dvlab-research/LongLoRA/assets/147307433/a149481e-9bc5-4389-9058-d5e0dae83aef) My CUDA version is 11.2, so I can't install Flash Attention on my machine. I tried setting use_flash_attn to False when executing fine-tune.py, but I meet this error be...