EliverQ
Hi, @guedes-joaofelipe! Thank you for your issue, but we can't reproduce the problem here. So could you please check your dataset and your environment again?
Hi, @deklanw. Thanks for your suggestion! We'll consider it in the future version.
It was indeed trained with LoRA. When we first compiled the list, we noted that Chinese LLaMA first performed continued pre-training on Chinese data and then instruction tuning, using LoRA only during the instruction-tuning stage. Since it was not purely a lightweight fine-tuning approach, we did not label it as such. We'll consider revising this in the next update. Thank you for your question!
Thanks for your correction! We'll fix these problems in the next version.
As stated in the paper, our discussion is restricted to models containing over 10 billion parameters due to space limitations. However, we have included a "List of LLMs" section on...
Thanks for your suggestions! We'll add Falcon in the upcoming update, and we will also gradually address the issue of "besides". We would like to include you in the acknowledgments....
Thank you for your suggestion! We'll add it to Figure 1 in the upcoming version.
Thanks for your suggestion! Could you provide us with a link to its paper, technical report, or blog? I'm sorry, but I can't find a model named...
Thank you very much for your recognition of our work. We will make revisions in the next version. We would like to include you in the acknowledgments. Could you please...
Yes, we are preparing these results in a machine-readable format. However, this part of the work is quite tedious, so you may need to wait a while. Thank you...