
Will NTK-Aware Scaled RoPE be supported, to extend the context length to 8k?

Open pikaqqqqqq opened this issue 1 year ago • 1 comment

NTK-Aware Scaled RoPE allows LLaMA models to have an extended (8k+) context size without any fine-tuning, with minimal perplexity degradation.

References: https://www.reddit.com/user/bloc97 https://www.reddit.com/r/LocalLLaMA/comments/14lz7j5/ntkaware_scaled_rope_allows_llama_models_to_have/
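For reference, the core idea from the linked Reddit post is small: instead of interpolating positions, the RoPE base is raised by a factor of `alpha ** (dim / (dim - 2))`, which leaves the high-frequency dimensions nearly untouched while stretching the low-frequency ones. A minimal NumPy sketch (the function names and `alpha=8` default are illustrative, not fastllm's API):

```python
import numpy as np

def ntk_scaled_inv_freq(dim: int, alpha: float = 8.0, base: float = 10000.0):
    # NTK-aware scaling: raise the RoPE base so high-frequency dimensions
    # barely change while low-frequency dimensions are stretched,
    # extending usable context without fine-tuning.
    scaled_base = base * alpha ** (dim / (dim - 2))
    return 1.0 / (scaled_base ** (np.arange(0, dim, 2) / dim))

def rope_tables(max_pos: int, dim: int, alpha: float = 8.0):
    # Precompute cos/sin tables for positions [0, max_pos).
    inv_freq = ntk_scaled_inv_freq(dim, alpha)
    angles = np.outer(np.arange(max_pos), inv_freq)  # shape (max_pos, dim // 2)
    return np.cos(angles), np.sin(angles)
```

The first frequency (exponent 0) is unchanged at 1.0, and each later one is compressed relative to the unscaled base, which is what lets the model see longer positions without retraining.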

pikaqqqqqq avatar Jul 14 '23 03:07 pikaqqqqqq

Seconded! I think this would be a very strong feature; an 8K context is really tempting.

amj12321 avatar Jul 14 '23 03:07 amj12321

I saw a similar modification mentioned in the llama.cpp issues: https://github.com/ymcui/Chinese-LLaMA-Alpaca/discussions/696 . Is this the one?

ztxz16 avatar Jul 21 '23 15:07 ztxz16