
Very long training time !!!

Al-Dailami opened this issue 4 years ago · 4 comments

Hello, I'm trying to train the model, but it takes 3 days for one epoch! Is that normal?

I'm using a dataset that contains about 100,000 records of short text.

Can you tell me your device configuration?

Al-Dailami avatar Feb 23 '21 13:02 Al-Dailami

Hi,

I don't think that is normal. I used a Tesla K40c for this paper. The training time depends on the size of the dataset, but as I recall, training on SST-2 takes up to 1 day to complete 20 epochs.

Louis-udm avatar Feb 23 '21 14:02 Louis-udm

Thanks for your reply. Can you tell me what's wrong? I just feed my dataset to the model, just like the other datasets such as SST-2.

Al-Dailami avatar Feb 24 '21 00:02 Al-Dailami

It's probably because you have a very big graph. You can find ways to reduce the vocabulary and delete some edges from the graph.
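To illustrate the idea, here is a minimal sketch of pruning a vocabulary graph, assuming it is stored as a SciPy sparse adjacency matrix of word-to-word edge weights (VGCN-BERT builds such a graph from word co-occurrence statistics). The function name, the threshold value, and the `keep_vocab` parameter are illustrative, not part of the repo's API:

```python
import numpy as np
from scipy import sparse

def prune_vocab_graph(adj, edge_threshold=0.2, keep_vocab=None):
    """Drop weak edges (|weight| < edge_threshold) from a vocabulary
    adjacency matrix; optionally restrict it to a reduced vocabulary
    given as an iterable of word indices to keep."""
    adj = adj.tocsr().copy()
    # Zero out weak edges, then remove the explicit zeros from storage.
    adj.data[np.abs(adj.data) < edge_threshold] = 0.0
    adj.eliminate_zeros()
    if keep_vocab is not None:
        idx = np.asarray(sorted(keep_vocab))
        adj = adj[idx][:, idx]
    return adj

# Toy example: a symmetric 4-word vocabulary graph.
dense = np.array([
    [0.0, 0.9, 0.1, 0.0],
    [0.9, 0.0, 0.0, 0.3],
    [0.1, 0.0, 0.0, 0.05],
    [0.0, 0.3, 0.05, 0.0],
])
pruned = prune_vocab_graph(sparse.csr_matrix(dense), edge_threshold=0.2)
print(pruned.nnz)  # 4: only the 0.9 and 0.3 edges (stored twice, symmetric)
```

A smaller, sparser graph directly reduces the cost of the graph-convolution step, which is usually the bottleneck when the vocabulary is large.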

Louis-udm avatar Feb 24 '21 00:02 Louis-udm

@Al-Dailami I implemented a new, much faster VGCN-BERT version: https://huggingface.co/zhibinlu/vgcn-bert-distilbert-base-uncased

Louis-udm avatar Jul 04 '23 00:07 Louis-udm