Xingtai Lv

Results: 5 comments by Xingtai Lv

Sorry, we can't find the library where `FlashAttentionWrapper` is defined. Could you please tell us which library it comes from?

Sorry for not replying in time. FlashAttention cannot easily be made compatible with OpenDelta's LoRA. LoRA essentially splits a parameter matrix of the model (such as q) into two low-rank matrices...
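To illustrate the point about splitting a parameter into two matrices, here is a minimal sketch of a LoRA-style linear layer in PyTorch. This is not OpenDelta's actual implementation; the class name `LoRALinear` and the parameters `r` and `alpha` are illustrative only.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Hypothetical sketch of a LoRA-wrapped linear layer (not OpenDelta's
    actual code): the frozen weight W is augmented with a trainable
    low-rank product, so y = x W^T + scaling * x A^T B^T."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze the original weight (e.g. q)
        # The single weight matrix is "split" into two small matrices A and B.
        self.lora_A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The delta path is a separate matmul added to the frozen projection.
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)
```

The compatibility issue follows from this structure: a fused attention kernel like FlashAttention consumes the already-computed q/k/v tensors, so the extra LoRA matmuls must run before the kernel is invoked rather than inside it.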

Thank you very much for pointing out our problem! We have modified the relevant code, so please try again. If the problem persists, you are very welcome to report it in an issue, and we will fix it as soon as possible.

Sorry, the current version of OpenDelta cannot support training with 3 different deltas on one backbone model. Sorry for not being able to help you; we will conduct further research and try...

Sorry for our negligence. The code you mentioned is indeed unintended. You can comment it out directly, and we will also fix it later. Thank you for pointing this out.