Jianfei Yu

10 comments of Jianfei Yu

Hi there, It is natural that when you run the same code with the same parameter settings on different servers, the results are slightly different, especially on relatively small datasets....
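For what it is worth, a minimal seed-fixing sketch, assuming the repo is PyTorch-based (the helper name set_seed is illustrative). Even with fixed seeds, different GPUs, drivers, and cuDNN versions can still yield slightly different numbers, which is why small datasets show visible variation.

```python
import random

import numpy as np
import torch


def set_seed(seed: int = 42) -> None:
    """Fix the random seeds that drive most run-to-run variance."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # Deterministic cuDNN kernels trade some speed for repeatability on one machine.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


set_seed(42)
```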

Hi there, Thanks for your interest in our paper! I am not sure which MT-BERT-CRF you are referring to. But if it is a pre-trained model, you can first add...
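Not the paper's exact recipe, but a minimal sketch of the usual pattern, assuming a Hugging Face Transformers checkpoint: load the pre-trained encoder first, then add randomly initialised task-specific layers on top. The checkpoint name, the class TaggerWithPretrainedEncoder, and num_labels=9 are illustrative placeholders.

```python
import torch
from transformers import BertModel, BertTokenizer

# Illustrative checkpoint name; substitute the actual pre-trained weights you have.
CHECKPOINT = "bert-base-cased"

tokenizer = BertTokenizer.from_pretrained(CHECKPOINT)
encoder = BertModel.from_pretrained(CHECKPOINT)


class TaggerWithPretrainedEncoder(torch.nn.Module):
    """Pre-trained encoder plus a randomly initialised task-specific head."""

    def __init__(self, encoder: BertModel, num_labels: int):
        super().__init__()
        self.encoder = encoder
        self.classifier = torch.nn.Linear(encoder.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        return self.classifier(hidden)  # (batch, seq_len, num_labels)


model = TaggerWithPretrainedEncoder(encoder, num_labels=9)
```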

Hi there, 'self_attention' stands for the self-attention layer for the auxiliary task (in the left channel of Fig. 2(a)), whereas 'self_attention_v2' stands for the self-attention layer for our main...
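A rough sketch of how the two layers relate, assuming PyTorch's nn.MultiheadAttention as a stand-in for the self-attention used in the paper; the class name, hidden size, and head count are illustrative, not the repo's actual settings.

```python
import torch
import torch.nn as nn


class TwoChannelSelfAttention(nn.Module):
    """Illustrative sketch: one self-attention layer per task over shared features."""

    def __init__(self, hidden_size: int = 768, num_heads: int = 8):
        super().__init__()
        # 'self_attention': used in the auxiliary-task channel.
        self.self_attention = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        # 'self_attention_v2': a separate layer for the main-task channel.
        self.self_attention_v2 = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)

    def forward(self, shared_states: torch.Tensor):
        aux_out, _ = self.self_attention(shared_states, shared_states, shared_states)
        main_out, _ = self.self_attention_v2(shared_states, shared_states, shared_states)
        return aux_out, main_out
```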

It is the concatenation operator.
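In PyTorch terms (assuming the two inputs are feature tensors of the same batch and sequence shape), it simply joins them along the hidden dimension:

```python
import torch

a = torch.randn(4, 16, 768)        # e.g. features from one channel
b = torch.randn(4, 16, 768)        # e.g. features from the other channel
fused = torch.cat([a, b], dim=-1)  # concatenation -> shape (4, 16, 1536)
```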

Hi there, Yes, the dataset provided through the link is constructed for another sentiment analysis task (https://github.com/jefferyYu/TomBERT), which is quite different from the MNER task here. Note that the...

Yes, we will clean up the code and release it soon.

Hi, if you really cannot download it, you can send me a private message and I will share the dataset with you another way.

Hi there, Sorry for the missing file. I just uploaded it to the repo. Best, Jianfei

As far as I remember, it should be around 11GB. If it does not fit, you can set batch_size to a smaller value.
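As a rough illustration (the dataset below is a placeholder standing in for the repo's actual training set), lowering batch_size where the DataLoader is built is usually enough to fit within roughly 11GB of GPU memory:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder data standing in for the repo's actual training set.
train_dataset = TensorDataset(torch.randint(0, 30522, (100, 128)))

# A smaller batch_size (e.g. 8 instead of 32) cuts activation memory;
# shrink it until training no longer runs out of GPU memory.
train_loader = DataLoader(train_dataset, batch_size=8, shuffle=True)
```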