2 issues of huilong-chen
### Is there an existing issue for this?

- [X] I have searched the existing issues

### Current Behavior

I have added the following code starting from line 850 of p-tuning/modeling_chatglm.py:

`if self.pre_seq_len is not None: for param in self.parameters():...`
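The truncated snippet follows the common P-Tuning v2 freezing pattern: when `pre_seq_len` is set, gradients are disabled for all model parameters, and only the prefix encoder is left trainable. A minimal sketch of that pattern, using a hypothetical toy module (`ToyPrefixModel`, `backbone`, and `prefix_encoder` are illustrative names, not the actual modeling_chatglm.py layout):

```python
import torch.nn as nn


class ToyPrefixModel(nn.Module):
    # Hypothetical stand-in for the real model; shows only the freezing pattern.
    def __init__(self, pre_seq_len=None, hidden=8):
        super().__init__()
        self.pre_seq_len = pre_seq_len
        self.backbone = nn.Linear(hidden, hidden)  # should stay frozen
        self.prefix_encoder = nn.Embedding(pre_seq_len or 1, hidden)  # trainable

        if self.pre_seq_len is not None:
            # Freeze every parameter first...
            for param in self.parameters():
                param.requires_grad = False
            # ...then re-enable gradients only for the prefix encoder.
            for param in self.prefix_encoder.parameters():
                param.requires_grad = True


model = ToyPrefixModel(pre_seq_len=4)
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the prefix encoder's parameters remain trainable
```

With this pattern, the optimizer can be constructed over `filter(lambda p: p.requires_grad, model.parameters())` so that only the prefix encoder is updated during fine-tuning.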
My CUDA version is 11.2, so I can't install Flash Attention on my machine. I tried to set use_flash_attn to False when executing fine-tune.py, but I met this error be...