
What effect will the self-extend trick have on Qwen1.5?

Open WeixuanXiong opened this issue 10 months ago • 4 comments

Thanks for your contribution on accommodating Qwen in SelfExtend. Qwen1.5 already has a 32k context length. I'm wondering if I can use SelfExtend to push it to about 100k. Have you ever tested the effect of SelfExtend on Qwen1.5?

WeixuanXiong avatar Mar 28 '24 08:03 WeixuanXiong

We believe how well SelfExtend works depends largely on how well the extended model performs within its original pretraining context window. This means that if Qwen1.5's 32k context window is not well trained, SelfExtend may not work; otherwise, it should work well. [Currently, we have no plans for a serious test, considering the massive computational resource requirements: 32k with an 8x group size -> 256k, 4x -> 128k. We may do serious benchmarking for Qwen1.5 when we have enough resources.]

Mooler0410 avatar Mar 28 '24 18:03 Mooler0410
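The 8x -> 256k and 4x -> 128k figures above use the simple approximation (pretraining length x group size). The SelfExtend paper's exact bound is slightly tighter because positions inside the neighbor-attention window are not grouped: extended context = (L - w_n) * G + w_n. A minimal sketch of that arithmetic, assuming Qwen1.5's 32k window and an illustrative neighbor window of 1024 (the actual value is a tunable hyperparameter, not confirmed in this thread):

```python
def extended_context(pretrain_len: int, group_size: int, neighbor_window: int) -> int:
    """Maximum context SelfExtend can cover without out-of-window positions:
    grouped positions span (pretrain_len - neighbor_window) * group_size,
    plus the ungrouped neighbor window itself."""
    return (pretrain_len - neighbor_window) * group_size + neighbor_window

def min_group_size(pretrain_len: int, target_len: int, neighbor_window: int) -> int:
    """Smallest integer group size whose extended context reaches target_len."""
    g = 1
    while extended_context(pretrain_len, g, neighbor_window) < target_len:
        g += 1
    return g

if __name__ == "__main__":
    L, w = 32_000, 1_024  # 32k pretraining window; neighbor window is an assumed example value
    print(extended_context(L, 4, w))      # 4x grouping: (32000 - 1024) * 4 + 1024 = 124928
    print(min_group_size(L, 100_000, w))  # group size needed to reach ~100k: 4
```

So for the ~100k target asked about above, a group size of 4 would already suffice under these assumptions; larger group sizes trade more extension for coarser relative positions.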


Ahhh, if I test it in future work, I'll share it with you guys. Thanks for your reply~

WeixuanXiong avatar Mar 29 '24 10:03 WeixuanXiong

[attached screenshot: 1789dd1fa9730b8fd239a672c501cab] Results on a 128k-character input (around 70k tokens with the Qwen tokenizer). It seems to work!

WeixuanXiong avatar May 07 '24 10:05 WeixuanXiong

What is the scale base set to?

233function avatar Sep 23 '24 03:09 233function