mindnlp
Easy-to-use and high-performance NLP and LLM framework based on MindSpore, compatible with 🤗 Hugging Face models and datasets.
Implemented a fine-tuning experiment for the bigbird_pegasus model on the databricks/databricks-dolly-15k dataset. Task link: https://gitee.com/mindspore/community/issues/IAUPBF. The transformers+pytorch+4060 benchmark was written by me; its repository is at https://github.com/outbreak-sen/bigbird_pegasus_finetune. The modified code is located under llm/finetune/bigbird_prgasus and contains only the mindnlp+mindspore implementation. The experiment results are as follows.

# bigbird_pegasus fine-tuning comparison

## train loss

Comparison of the training loss during fine-tuning:

| epoch | mindnlp+mindspore | transformer+torch (4060) |
| ----- | ----------------- | ------------------------ |
| 1     | 2.0958            | ...                      |
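For orientation, the mindnlp+mindspore side of such an experiment is typically set up through mindnlp's HuggingFace-compatible interfaces. The sketch below is illustrative only: the checkpoint name, preprocessing choices, and hyperparameters are assumptions, not the exact settings used in llm/finetune/bigbird_prgasus, and depending on the mindnlp version the tokenized 🤗 dataset may need an extra conversion step before being handed to the trainer.

```python
# Minimal sketch of a bigbird_pegasus fine-tuning run with mindnlp + MindSpore.
# Checkpoint, sequence lengths, and hyperparameters are illustrative assumptions.
from mindnlp.transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from mindnlp.engine import Trainer, TrainingArguments
from datasets import load_dataset  # 🤗 datasets, used here to fetch dolly-15k

checkpoint = "google/bigbird-pegasus-large-arxiv"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

raw = load_dataset("databricks/databricks-dolly-15k", split="train")

def preprocess(example):
    # Use instruction (+ optional context) as the source, the response as the target.
    source = example["instruction"] + "\n" + example["context"]
    inputs = tokenizer(source, max_length=1024, truncation=True, padding="max_length")
    labels = tokenizer(example["response"], max_length=128, truncation=True, padding="max_length")
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = raw.map(preprocess, remove_columns=raw.column_names)
# NOTE: some mindnlp versions expect a mindspore.dataset object here; an extra
# conversion of `tokenized` may be required before passing it to Trainer.

args = TrainingArguments(
    output_dir="bigbird_pegasus_dolly",
    num_train_epochs=3,
    per_device_train_batch_size=1,
    learning_rate=5e-5,
    logging_steps=50,
)
trainer = Trainer(model=model, args=args, train_dataset=tokenized, tokenizer=tokenizer)
trainer.train()
```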
My earlier PR went unmerged for a long time, so I have pushed the commits for the subsequent tasks here. The earlier PR has already passed review and can be merged: https://github.com/mindspore-lab/mindnlp/pull/1957 , hence this PR is resubmitted after the changes.
Gitee issues:
- https://gitee.com/mindspore/community/issues/IAADJV
- https://gitee.com/mindspore/community/issues/IAADLW
- https://gitee.com/mindspore/community/issues/IAADCA
There is still a minor issue in the integration-test part: the numerical precision does not yet reach the required level.
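As a rough illustration of the kind of check involved, integration tests for ported models typically compare the MindSpore outputs against the reference transformers/PyTorch outputs within a tolerance. The sketch below is an assumption about the general pattern, not the actual test code; the tolerance values are illustrative.

```python
# Sketch of a typical numerical-parity check between a ported mindnlp model
# and the reference transformers implementation. Tolerances are illustrative.
import numpy as np

def assert_close(ms_output: np.ndarray, pt_output: np.ndarray,
                 rtol: float = 1e-3, atol: float = 1e-5) -> None:
    """Fail if the two output tensors (as numpy arrays) diverge beyond tolerance."""
    max_diff = np.max(np.abs(ms_output - pt_output))
    if not np.allclose(ms_output, pt_output, rtol=rtol, atol=atol):
        raise AssertionError(f"outputs diverge: max abs diff = {max_diff:.2e}")

# Usage: run the same input through both implementations, convert the logits
# to numpy arrays, then call assert_close(ms_logits, pt_logits).
```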
[Open-source internship] SigLIP model porting. Issue: https://gitee.com/mindspore/community/issues/IAZ2TQ
mimi model porting