fwbc

Results: 12 comments of fwbc

> SmartEdit, did you make it work?

I'm running into the same problem.

> * `pip install flash-attention==2.7.3 --no-build-isolation`

![Image](https://github.com/user-attachments/assets/9130cd7b-27c3-4b0d-93a5-aa90a8f3f8cc)

I ran `pip install flash-attention==2.7.3 --no-build-isolation`, but got "Could not find a version that satisfies the requirement flash-attention==2.7.3 (from versions: 1.0.0)".
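For reference, that resolver message matches the PyPI naming: Flash Attention 2.x is published as `flash-attn`, while the separate `flash-attention` package only ever released 1.0.0. A likely fix (assuming the 2.7.3 release is what's wanted):

```
pip install flash-attn==2.7.3 --no-build-isolation
```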

> CUDA version: 12.8, system: Ubuntu 20.04.6 LTS
>
> * Problem: when I try to build flash-attention with `pip install . --no-build-isolation`:
>   `note: This error originates from a...`
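The quoted error is truncated, but from-source builds of flash-attn most often fail on missing build tools or a CUDA/PyTorch version mismatch. A hedged checklist, assuming the standard source build rather than this user's exact setup:

```
# Build dependencies flash-attn expects once isolation is disabled
pip install ninja packaging wheel

# The nvcc version should match the CUDA version PyTorch was built with
python -c "import torch; print(torch.version.cuda)"
nvcc --version

pip install . --no-build-isolation
```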

> [@euyis1019](https://github.com/euyis1019) [email protected] Are you still available? I'd like to ask you a few questions.

Received, thank you very much.

> > > I got it working with two 24 GB 4090s.
> >
> > Hi, can you show your training hyperparameter settings? Thank you!
>
> The problem I ran into was mainly a bug when running inference with the fine-tuned model. I put my training and inference code here: https://github.com/mycfhs/lisa-finetune

If I want to fine-tune on top of his LISA model, is it enough to just change `version` in train_ds.py to point at the LISA model?
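Concretely, something like the sketch below is what I have in mind. It assumes LISA's train_ds.py takes the base checkpoint via its `--version` flag, and every path here (`xinlai/LISA-13B-llama2-v1`, the SAM weights, the dataset directory) is a placeholder for whatever a given setup actually uses:

```
deepspeed --master_port=24999 train_ds.py \
  --version="xinlai/LISA-13B-llama2-v1" \
  --vision_pretrained="./sam_vit_h_4b8939.pth" \
  --dataset_dir="./dataset" \
  --exp_name="lisa-finetune"
```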

> Are you using my code? I remember fixing that bug in my code. Search the issues; there is a fix there.

Sorry, I've solved it. Thanks!

> I used an RTX 4090 24 GB, but always got an error when operating multi_tensor, same as you; it's always Out Of Memory when I run the training code. I think maybe his...

> I always encounter assertion errors when trying to reproduce the results of llava_v1.5. In this section, may I ask if there is any solution to this problem? /dataset.py/ if...

This looks like the same issue I asked you about before. In LLaVA's conversation.py, specify llava_v1 as the default conversation template. The cause is that his template is wrong, which makes the seq wrong, so splitting on the seq then fails.
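A minimal sketch of that edit, assuming the templates live in LLaVA's conversation.py with a module-level default and a `conv_templates` registry keyed by name (the layout varies slightly across LLaVA versions):

```python
# llava/conversation.py, near the bottom of the file.
# Point the default at the llava_v1 template so the separator strings
# match the model: a mismatched template produces the wrong seq, and
# splitting the target text on that seq is what trips the assertion.
default_conversation = conv_templates["llava_v1"]
```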