
Reproduce arXiv classification task

liuyang148 opened this issue 4 years ago · 1 comment

We are trying to reproduce the arXiv classification task with F1 92 as reported in the paper. We use the default hyperparameters defined in bigbird/classifier/base_size.sh and the pretrained checkpoint here, but with batch size = 2 per GPU due to memory limitations (total batch size = 8 GPUs * 2 = 16). After 16k steps (16000 * 16 / 30034 ≈ 8.5 epochs) we only get F1 84, which is too low compared to the paper, which trains for 10 epochs. Are we missing something, e.g. preprocessing of the arXiv data? Or is it just because the batch size is too small? Will you release the arXiv checkpoint in the future?
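For reference, the epoch arithmetic above as a quick sanity check (all numbers taken from our setup):

```python
# Effective epochs seen during fine-tuning with 8 GPUs at per-GPU batch size 2.
steps = 16_000
global_batch = 8 * 2      # 8 GPUs * batch size 2
train_examples = 30_034   # arXiv train split size

epochs = steps * global_batch / train_examples
print(f"~{epochs:.1f} epochs")  # -> ~8.5, vs. the paper's 10
```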

Regarding dataset differences: we fine-tuned RoBERTa on the same arXiv dataset and got F1 86, pretty close to the paper.
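For context, a minimal sketch of roughly how that RoBERTa baseline looks with Hugging Face Transformers. The hyperparameters, split names, and `text`/`label` field names here are assumptions for illustration, not our exact setup, and the F1 computation is omitted:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Assumed dataset layout: "text" and "label" columns, train/validation splits.
ds = load_dataset("ccdv/arxiv-classification")
tok = AutoTokenizer.from_pretrained("roberta-base")

def encode(batch):
    # RoBERTa is capped at 512 tokens, so long arXiv papers get truncated hard.
    return tok(batch["text"], truncation=True, max_length=512)

ds = ds.map(encode, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base",
    num_labels=ds["train"].features["label"].num_classes,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="roberta-arxiv",          # illustrative values only
        per_device_train_batch_size=16,
        num_train_epochs=3,
        evaluation_strategy="epoch",
    ),
    train_dataset=ds["train"],
    eval_dataset=ds["validation"],
    tokenizer=tok,  # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()
```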

liuyang148 · Aug 18 '21

Yes, I am seeing a similar result. Maybe the authors forgot to remove the leaked labels from the scraped PDF text in the original data? (i.e., some samples contain the label directly in the text, so classifying them is trivial.)

The fixed version (without leaked labels) is on Hugging Face: https://huggingface.co/datasets/ccdv/arxiv-classification, but I'm not sure whether the authors used this version (the no_ref subset).
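A quick way to check the leakage yourself (a sketch, assuming the dataset exposes `text`/`label` columns with a ClassLabel feature whose names look like "cs.AI"):

```python
from datasets import load_dataset

# Count training samples whose text contains their own category label,
# which would make classification trivial.
ds = load_dataset("ccdv/arxiv-classification", split="train")  # default config
label_names = ds.features["label"].names

leaked = sum(
    label_names[ex["label"]].lower() in ex["text"][:2000].lower()
    for ex in ds
)
print(f"{leaked}/{len(ds)} train samples mention their own label")
```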

MonliH · Jan 03 '23