VGCN-BERT
Very long training time!!!
Hello, I'm trying to train the model, but one epoch takes 3 days! Is that normal?
I'm using a dataset that contains about 100,000 records of short text.
Can you tell me your device configuration?
Hi,
I think that is not normal. I used a Tesla K40c for this paper. It depends on the size of the dataset, but from my experience, training on SST-2 takes up to 1 day to complete 20 epochs.
Thanks for your reply. Can you tell me what's wrong? I just fed my dataset to the model, just like the other datasets such as SST-2.
It's probably that you have a very large graph. You can find ways to reduce the vocabulary and delete some weak edges from the graph.
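As a rough sketch of what edge pruning could look like (the threshold value and the random stand-in weights are purely illustrative, not taken from the paper's code): the word-word graph is a symmetric weighted adjacency matrix, and dropping entries below a cutoff shrinks it before training.

```python
import numpy as np

# Hypothetical stand-in for the word-word graph: in VGCN-BERT the
# weights come from NPMI scores over the corpus vocabulary; here we
# just generate a small random symmetric matrix for illustration.
rng = np.random.default_rng(0)
vocab_size = 6
A = rng.random((vocab_size, vocab_size))
A = np.triu(A, k=1)
A = A + A.T  # symmetric, zero diagonal

# Prune weak edges below an illustrative threshold.
threshold = 0.5
A_pruned = np.where(A >= threshold, A, 0.0)

kept = int(np.count_nonzero(A_pruned) // 2)
total = int(np.count_nonzero(A) // 2)
print(f"kept {kept} of {total} edges")
```

Fewer edges means a sparser graph convolution, which should cut both memory use and per-epoch time; you would tune the threshold so that accuracy does not drop too much.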
@Al-Dailami I implemented a new, much faster version of VGCN-BERT: https://huggingface.co/zhibinlu/vgcn-bert-distilbert-base-uncased