
Could the pre-training code for NLI be shared?

MingjieWang0606 opened this issue 4 years ago · 6 comments

I want to train more models like ALBERT and will share them later~

MingjieWang0606 · Jan 12 '21 04:01

Wow, super thanks! Looking forward to them!

jind11 · Jan 12 '21 05:01

Hello~ Could you share the code for NLI? I may not have expressed myself clearly~ I will go on to retrain ALBERT and other models and then merge them into your git repo~ Or, if you have the code, you can also send it to me directly~ I'm very glad to see such excellent work~ I can provide machine support~

MingjieWang0606 · Jan 12 '21 05:01

Here are my WeChat and email~ Really looking forward to your reply! wmj745000 [email protected]

MingjieWang0606 · Jan 12 '21 05:01

I see, I need to find the code for NLI and will post it in the next two days. Thank you for your patience!

jind11 · Jan 12 '21 18:01

Thank you for your quick reply! Hope the code is not eaten by mice~

MingjieWang0606 · Jan 18 '21 09:01

Hi, I am so sorry for the delay; I have been busy with a conference deadline. I just spent half an hour digging through my hard drive for the NLI fine-tuning code, but I could not find it, which is strange. I do recall, though, that I used the Hugging Face code to train the NLI model, which can be found here: https://github.com/abidlabs/pytorch-transformers. I am sorry that I cannot offer the exact code, but I believe it is easy to adapt the code in that link to fine-tune an ALBERT model on NLI. Of course, you can also use the latest Hugging Face Transformers source code. Let me know if you have more questions.

jind11 · Jan 31 '21 23:01
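
For readers looking for a concrete starting point, below is a minimal sketch of what such NLI fine-tuning could look like with the current Hugging Face `transformers` and `datasets` libraries. This is not the original script referenced in this thread; the choice of MNLI, the `albert-base-v2` checkpoint, and all hyperparameters are illustrative assumptions.

```python
# Sketch only: fine-tune ALBERT on MNLI (an assumed NLI dataset choice) with
# Hugging Face `transformers`/`datasets`. Hyperparameters are placeholders.
from datasets import load_dataset
from transformers import (
    AlbertForSequenceClassification,
    AlbertTokenizerFast,
    Trainer,
    TrainingArguments,
)

model_name = "albert-base-v2"
tokenizer = AlbertTokenizerFast.from_pretrained(model_name)
# MNLI has three labels: entailment, neutral, contradiction.
model = AlbertForSequenceClassification.from_pretrained(model_name, num_labels=3)

raw = load_dataset("glue", "mnli")

def tokenize(batch):
    # Encode each premise/hypothesis pair as a single sequence-pair input.
    return tokenizer(batch["premise"], batch["hypothesis"],
                     truncation=True, max_length=128)

encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="albert-mnli",
    per_device_train_batch_size=32,
    learning_rate=2e-5,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation_matched"],
    tokenizer=tokenizer,  # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()
```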