Zhibin (Louis) Lu
Hi, thank you for your attention. If you want to apply it to other BERT variants, I think you need to modify the class inheritance in model_vgcn_bert.py. These changes...
The dataset structure is simple: we only need the text and the label, and every line is one document. For CoLA, text=df[3] and label=df[1]. You can also organize your documents...
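As a minimal sketch of that layout, assuming a CoLA-style tab-separated file with no header row, where column 3 holds the sentence and column 1 the label (the two toy rows below are illustrative, not from the real dataset):

```python
# Reading a CoLA-style TSV where text=df[3] and label=df[1].
import io
import pandas as pd

# Two toy documents standing in for a real CoLA TSV file:
# columns are source, label, author annotation, sentence.
tsv = (
    "gj04\t1\t\tOur friends won't buy this analysis.\n"
    "gj04\t0\t*\tThey drank the pub.\n"
)

df = pd.read_csv(io.StringIO(tsv), sep="\t", header=None)
texts = df[3].tolist()   # one document per line
labels = df[1].tolist()  # the corresponding labels

print(texts[0])  # Our friends won't buy this analysis.
print(labels)    # [1, 0]
```

For your own data, any file with one document per line works, as long as you can point the loader at a text column and a label column.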
For the multi-label, you can just refer to MNIST, using labels like 0,1,2,3,4...
> For the multi-label, you can just refer to MNIST, using labels like 0,1,2,3,4...

But that is multi-class. I mean multi-label, where a text is assigned one...
Hi, I don't know STS-B, but it's not difficult to adapt the model to other loss functions, since the last layer is just a dense layer.
This is also a problem for me.
I also need this feature.
+1, for debugging reasons.
Oh, someone has already asked about this. OK, I'll go take a look. Thanks for the reply.
I implemented a new VGCN-BERT version that is much faster, and this old algorithm is deprecated. The new version is available on the HuggingFace hub: https://huggingface.co/zhibinlu/vgcn-bert-distilbert-base-uncased