
Code for the ACL 2021 paper "SENT: Sentence-level Distant Relation Extraction via Negative Training"

9 issues

Can you provide the code for using BERT?

I used the code published in this repo to conduct an experiment on the dataset published by ARNOR, and the model adopted in my experiment is SENT (BiLSTM). However, I can...

Hello, in my opinion the negative training phase (random sampling of negative classes plus NLLLoss) could be covered by CrossEntropyLoss (over the positive class and all negative classes). Why use negative training?
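In case it helps frame the question: cross-entropy maximizes the probability of the (possibly noisy) annotated label, -log p_y, while negative training only pushes down the probability of one randomly sampled complementary class, -log(1 - p_k) for k != y, so it never actively fits a wrong annotation. A minimal sketch of that distinction, assuming plain-list probabilities (the function name and interface are illustrative, not from the repo):

```python
import math
import random

def negative_training_loss(probs, label, rng=None):
    """One negative-training step (a sketch of the idea, not the repo's
    actual implementation): sample a complementary class k != label and
    minimize its predicted probability via loss = -log(1 - p_k)."""
    rng = rng or random.Random(0)
    complementary = [k for k in range(len(probs)) if k != label]
    k = rng.choice(complementary)           # randomly sampled negative class
    return -math.log(1.0 - probs[k])

def cross_entropy_loss(probs, label):
    """Standard positive training for contrast: maximize p_label."""
    return -math.log(probs[label])
```

The practical difference shows up under label noise: cross-entropy actively increases the probability of the annotated label even when that label is wrong, whereas negative training only asserts "this sentence is not class k", which is true with high probability even for noisy labels.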

May I know how to implement the BiLSTM+BERT model? The paper mentions using the 'bert-base-uncased' pre-trained model, but I cannot find the corresponding implementation in the code.

Why do labels have to be reassigned to their original labels at each iteration in `filter_relabel`?
`train_batch_data = self.train_dl.dataset`
`for i in range(len(train_batch_data)):`
`train_batch_data[i][-1][:] = self.ori_train_labels[i]`
`self.train_dl = DataLoader(train_batch_data, batch_size=self.batch_size,...`
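A likely rationale, sketched as standalone code (the names mirror the snippet in the issue; the surrounding class is assumed, not quoted from the repo): `filter_relabel` mutates each example's label list in place when it relabels filtered instances, so each new iteration must start from the original distant labels kept in `ori_train_labels`. Otherwise relabeling decisions from one iteration would silently persist into the next and errors would accumulate.

```python
def restore_original_labels(train_data, ori_train_labels):
    """Reset every example's labels to the original distant labels.

    Each example stores its label list in example[-1]; slice assignment
    updates that list object in place, so any DataLoader or dataset
    holding a reference to it sees the restored labels too.
    """
    for i in range(len(train_data)):
        train_data[i][-1][:] = ori_train_labels[i]
    return train_data
```

With this reset, each filtering/relabeling pass operates on the same starting labels rather than on the previous pass's pseudo-labels.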

Hi, thanks for your contribution. However, when I worked with this code, I found there are two files I don't know how to obtain. Although I checked the link of...

The uploaded code currently covers only the "NYT-10" experiment. Could you please upload the other part of the paper's experiments, on "noisy-TACRED"? Thank you!

When I tried to run the code, I found two files didn't exist. Can you tell me how to get "glove.6B.50d_word2id.json" and "glove.6B.50d_mat.npy"?
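Judging by the filenames, these two files look like a preprocessed form of the standard GloVe download (glove.6B.50d.txt): a word-to-index JSON plus the embedding matrix saved with NumPy. A hedged sketch of how one could generate them; the exact on-disk format the repo expects is an assumption inferred from the filenames:

```python
import json
import numpy as np

def build_glove_files(glove_txt, word2id_path, mat_path):
    """Convert a raw GloVe text file (one "word v1 v2 ..." entry per
    line) into a word->row-index JSON and a float32 embedding matrix."""
    word2id, vectors = {}, []
    with open(glove_txt, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word2id[parts[0]] = len(word2id)          # row index in the matrix
            vectors.append([float(v) for v in parts[1:]])
    mat = np.asarray(vectors, dtype=np.float32)
    with open(word2id_path, "w", encoding="utf-8") as f:
        json.dump(word2id, f)
    np.save(mat_path, mat)
    return word2id, mat
```

Usage, assuming the raw vectors have been downloaded into the working directory:

`build_glove_files("glove.6B.50d.txt", "glove.6B.50d_word2id.json", "glove.6B.50d_mat.npy")`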

Hi, is it possible for you to share the checkpoint of the model you used to report the numbers in the paper? Thanks!