Wee Tee, Soh
I have updated the script so that building document-word edges is done on the fly, but this only slightly reduced RAM usage. Graph preprocessing is quite intensive as it's quadratic...
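For illustration, here is a minimal sketch of what "building document-word edges on the fly" can look like: stream the edges from a sparse TF-IDF matrix with a generator instead of materialising the full edge list. The use of `TfidfVectorizer` and the function name `doc_word_edges` are my assumptions for this sketch, not the repo's actual code.

```python
# Minimal sketch: yield document-word edges lazily instead of
# storing them all in a list (names and weighting are assumptions).
from sklearn.feature_extraction.text import TfidfVectorizer

def doc_word_edges(docs):
    """Yield (doc_index, word_index, tfidf_weight) triples one at a time."""
    tfidf = TfidfVectorizer().fit_transform(docs)  # sparse, memory-friendly
    coo = tfidf.tocoo()
    for d, w, weight in zip(coo.row, coo.col, coo.data):
        yield int(d), int(w), float(weight)

docs = ["the cat sat", "the dog barked", "cats and dogs"]
for edge in doc_word_edges(docs):
    print(edge)  # edges are consumed as they are produced, never stored
```

Note that if the graph follows the usual Text-GCN recipe, the quadratic cost mentioned above typically comes from the word-word co-occurrence (PMI) edges, which streaming the document-word edges does not address.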
Yes, if anyone could share a pre-trained model on a large corpus, that would be great. I could only pre-train on a small CNN corpus as a proof of concept. Will leave...
I have added instructions, as suggested, to download & unzip the data.
Yes, the notation has been changed :)
Hi all, if you need to fine-tune on your own dataset other than SemEval, you will have to fork this repo and modify src/tasks/preprocessing_funcs.py to ingest your own data and reformat...
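As a hypothetical sketch of such a modification: read your dataset, insert SemEval-style entity markers, and return it in the shape the SemEval path expects. The CSV column names, the `<e1>/<e2>` marker format, and the output columns `sents`/`relations` are assumptions here; check the repo's actual SemEval preprocessing function for the exact format.

```python
# Hypothetical replacement ingestion function for
# src/tasks/preprocessing_funcs.py (column names and output
# format are assumptions modelled on the SemEval preprocessing).
import pandas as pd

def preprocess_my_dataset(path):
    """Read a CSV of (sentence, e1, e2, relation) rows and tag each
    sentence with SemEval-style <e1>/<e2> entity markers."""
    df = pd.read_csv(path)  # assumed columns: sentence, e1, e2, relation

    def tag(row):
        s = row["sentence"]
        s = s.replace(row["e1"], f"<e1>{row['e1']}</e1>", 1)
        s = s.replace(row["e2"], f"<e2>{row['e2']}</e2>", 1)
        return s

    return pd.DataFrame({
        "sents": df.apply(tag, axis=1),
        "relations": df["relation"],
    })
```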
Hi all, yes, @vabatista is right: the constructor of BertModel has been modified here from the original in order to implement this model. If you want to use this for...
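One general way to deal with a modified constructor is to load the original checkpoint's weights with `strict=False`, so shared layers are restored and any layers the new constructor adds stay at their fresh initialisation. The toy classes below are stand-ins to keep the sketch self-contained, not the repo's actual BertModel.

```python
# Toy demonstration of partially loading weights into a modified
# model (the classes here are illustrative stand-ins, not BertModel).
import torch.nn as nn

class Original(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(8, 8)

class Modified(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(8, 8)   # shared with the original
        self.extra_head = nn.Linear(8, 2)  # added by the new constructor

state = Original().state_dict()
model = Modified()
# strict=False restores matching keys and reports the rest instead of
# raising a key-mismatch error.
missing, unexpected = model.load_state_dict(state, strict=False)
print("missing:", missing, "unexpected:", unexpected)
```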
Hi there, for the SemEval task the model is evaluated on test data. However, for pre-training there is no test data, as the pre-training is self-supervised. You were probably looking at...
Hi, what do you mean by development set?
Yes, FewRel fine-tuning has not yet been implemented; I have added it as a to-do task.