RNNLogic

slow training with GPU

Open nitishajain opened this issue 3 years ago • 2 comments

Hello,

Thank you for providing the code of your paper. As per the instructions, I am running the code for Version 2 of RNNLogic with emb. While the training is running as expected, it is very slow for both wn18rr and FB15K-237 datasets on my GPU server. Could you inform about your experimental setup for these experiments in terms of the underlying hardware and the expected run times? I could estimate the running times for my setup from this information.

Thanks!

nitishajain avatar Sep 04 '21 12:09 nitishajain

Hello, I am facing the same problem when trying to re-implement RNNLogic using the code in the main branch. I found that using the multiprocessing package to concurrently train the model for each relation does not speed things up, since a single process already uses almost 50% of my CPU (Intel Xeon Gold 5220). Did you face the same problem? Approximately how long did it take you to train on FB15k-237, or on much smaller datasets like umls/kinship?

chenxran avatar Jan 10 '22 16:01 chenxran

Thanks for your interest, and very sorry for the late response. We have refactored the code; the new version is in the RNNLogic+ folder and is more readable and easier to run. You might be interested. Thanks!

mnqu avatar May 02 '22 16:05 mnqu