Results: 2 comments of zhaoliang1983x
I ran into a problem: when running, it reports "You are attempting to use Flash Attention 2.0 without specifying a torch dtype." I'd like to ask: which version of flash-attn does qwen2-audio need?
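That message is typically emitted by transformers when Flash Attention 2 is requested while the model is still being loaded in the default float32 dtype. A minimal sketch of one way to avoid it, assuming the standard Hugging Face transformers loading API (the Qwen/Qwen2-Audio-7B-Instruct checkpoint name is used only for illustration):

```python
import torch
from transformers import AutoProcessor, Qwen2AudioForConditionalGeneration

model_id = "Qwen/Qwen2-Audio-7B-Instruct"  # illustrative checkpoint name

processor = AutoProcessor.from_pretrained(model_id)
model = Qwen2AudioForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,                # or torch.bfloat16; FP32 is not supported by Flash Attention 2
    attn_implementation="flash_attention_2",  # requires the flash-attn package to be installed
    device_map="auto",
)
```

Passing an explicit half-precision `torch_dtype` alongside `attn_implementation="flash_attention_2"` is what silences the warning; the flash-attn version itself mainly needs to match your installed torch/CUDA build.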
Sorry, I only just saw the email. The whole model; it's already sorted out now.

On 2024-10-30 09:52:20, "liufeiran" wrote: Which part of qwen2-audio is being fine-tuned, the language_model part or the whole model? @Jintao-Huang