siyang liu

Results: 16 comments by siyang liu

I also ran into this problem — hoping someone can answer it.

> > Solved. Sent from my iPhone > > […](#) > > ------------------ Original message ------------------ From: lsy641 [[email protected]](mailto:[email protected]) Sent: July 10, 2019, 19:27 To: macanv/BERT-BiLSTM-CRF-NER [[email protected]](mailto:[email protected]) Cc: charlesfufu [[email protected]](mailto:[email protected]), Author [[email protected]](mailto:[email protected]) Subject: Re: [macanv/BERT-BiLSTM-CRF-NER] How to print the loss while training? (#175)...

OK, I will consider adding the ALBERT model to it.

> ![image](https://user-images.githubusercontent.com/4970790/87944053-79e13780-cad1-11ea-9fc4-70abff761f00.png) > > According to the unit-test report, ALBERT's performance also looks abnormal. It seems turbo's ALBERT is randomly faster or slower than torch's?

How should we fix it? Implement the sum by hand, or update ONNX?

What do you need me to do to help get this merged?

> I read the paper — is there any code available that showcases the algorithm? @RubyBit Hello. I am currently organizing the code, but I am able to add you to...

Thank you for the reminder.