Yuxiang Zhou
Same issue. Does anyone know how to solve this?
> @ZhouyxNLP Did that resolve? I retrained the whole thing and it didn't happen. As you know, threading errors can be like that.

@cmkumar87 No, it didn't. I retrained many times,...
> @ZhouyxNLP Did you try running it with PyTorch 3.6? You may have to update a little bit of syntax in PreSumm/src/prepro/data_builder.py to port the legacy code to recent convention. But...
> I actually successfully installed flash-attn 2.5.7 with vllm 0.4.1, and it is detected by vllm ("Using FlashAttention backend"). But the performance remains the same (there is not...
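As a quick sanity check for environment issues like the one above (this is a generic sketch, not code from the repo), one can verify that the relevant packages are importable in the active environment before debugging performance:

```python
import importlib.util

def is_installed(pkg: str) -> bool:
    """Return True if the top-level package can be found in this environment."""
    return importlib.util.find_spec(pkg) is not None

# Packages mentioned in the thread; adjust to match your setup.
for pkg in ("flash_attn", "vllm"):
    print(pkg, "installed:", is_installed(pkg))
```

If `flash_attn` is importable but vllm still reports a different attention backend, the mismatch is usually a version-compatibility issue rather than a missing install.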