LLaMA-Factory
Are there plans to support LoRAMoE?
Reminder
- [X] I have read the README and searched the existing issues.
Reproduction
https://arxiv.org/pdf/2312.09979.pdf "LoRAMoE: Alleviate World Knowledge Forgetting in Large Language Models via MoE-Style Plugin"
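For reference, a minimal sketch of the core idea (several LoRA experts on one frozen linear layer, mixed by a learned token-wise router). This is not LLaMA-Factory code and omits the paper's localized balancing constraint; the class and parameter names are made up for illustration:

```python
import torch
import torch.nn as nn


class LoRAMoELinear(nn.Module):
    """Frozen base linear layer plus several LoRA experts mixed by a router."""

    def __init__(self, base: nn.Linear, num_experts: int = 4, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # keep the pretrained weight frozen
        in_f, out_f = base.in_features, base.out_features
        self.scaling = alpha / r
        # One (A, B) pair per expert, initialized like a standard LoRA (B starts at zero).
        self.lora_A = nn.ParameterList(
            [nn.Parameter(torch.randn(r, in_f) * 0.01) for _ in range(num_experts)]
        )
        self.lora_B = nn.ParameterList(
            [nn.Parameter(torch.zeros(out_f, r)) for _ in range(num_experts)]
        )
        # Token-wise router producing softmax weights over the experts.
        self.router = nn.Linear(in_f, num_experts, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.base(x)
        gates = torch.softmax(self.router(x), dim=-1)  # (..., num_experts)
        for i, (A, B) in enumerate(zip(self.lora_A, self.lora_B)):
            expert_out = (x @ A.t()) @ B.t() * self.scaling  # (..., out_f)
            out = out + gates[..., i : i + 1] * expert_out
        return out
```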
Expected behavior
System Info
No response
Others
No response
If this gets added, I'd also suggest following this work: https://arxiv.org/abs/2402.08562 "Higher Layers Need More LoRA Experts", and allowing the number of experts to be configured per layer.
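A hypothetical sketch of what such a per-layer setting could look like; `experts_per_layer` is not an existing LLaMA-Factory option, it only illustrates the paper's observation that higher decoder layers benefit from more experts:

```python
# Hypothetical config sketch: more LoRA experts in the upper half of the decoder.
num_layers = 32  # e.g. LLaMA-2-7B has 32 decoder layers
experts_per_layer = {
    layer_idx: 2 if layer_idx < num_layers // 2 else 8
    for layer_idx in range(num_layers)
}
assert experts_per_layer[0] == 2 and experts_per_layer[31] == 8
```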
mark:)
Is there any update on adding this method?