DiGress

How long does the code run on your GPU server?

FairyFali opened this issue 2 years ago (5 comments)

FairyFali (Apr 12 '23)

For what dataset do you want to know the runtime?

cvignac (May 23 '23)

For example QM9, and also the longest runtime among all the datasets you tested.

FairyFali (Jun 17 '23)

It's very approximate, but it's on the order of 6 hours for QM9, 2 days for planar, and one week for SBM, GuacaMol and MOSES.

I'm not sure that all models had converged, though. For example, it was very easy to beat all previous methods on planar, so we did not run the model for as long as the SBM one.

cvignac (Jun 21 '23)
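For readers who want a rough estimate on their own hardware before committing to a full run, a minimal sketch is to time a few epochs and extrapolate. This is not part of the DiGress codebase; the `train_one_epoch` callable below is hypothetical and stands in for whatever your own training loop exposes.

```python
import time

def estimate_total_runtime(train_one_epoch, n_probe_epochs=3, total_epochs=1000):
    """Time a few epochs and extrapolate to a full training run."""
    start = time.perf_counter()
    for _ in range(n_probe_epochs):
        train_one_epoch()  # hypothetical: runs a single training epoch
    per_epoch = (time.perf_counter() - start) / n_probe_epochs
    total_hours = per_epoch * total_epochs / 3600
    print(f"~{per_epoch:.1f} s/epoch, ~{total_hours:.1f} h for {total_epochs} epochs")

# Dummy usage: replace the lambda with your real epoch function.
estimate_total_runtime(lambda: time.sleep(0.1), n_probe_epochs=3, total_epochs=3000)
```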

I seem to have similar statistics after 3k epochs:

Sampling statistics: {'spectre': 0.0158276848560015, 'clustering': 0.213101743441406, 'orbit': 0.05987776607931439, 'planar_acc': 0.0, 'sampling/frac_unique': 1.0, 'sampling/frac_unique_non_iso': 1.0, 'sampling/frac_unic_non_iso_valid': 0.0, 'sampling/frac_non_iso': 1.0}

Here is the training curve: [training-curve image]

If your planar accuracy is still 0 after training for a long time, check that your package versions match the ones in the latest commit.

cvignac (Jun 28 '23)
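A minimal sketch of such a version check, assuming the dependencies are pinned somewhere like a requirements or environment file; the package names and versions below are placeholders, not the ones actually pinned by the repository.

```python
from importlib.metadata import PackageNotFoundError, version

# Placeholder pins: replace with the exact versions listed in the
# repository's requirements/environment file from the latest commit.
pinned = {
    "torch": "2.0.1",
    "pytorch-lightning": "2.0.4",
    "torch-geometric": "2.3.1",
}

for package, expected in pinned.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        print(f"{package}: not installed (expected {expected})")
        continue
    status = "OK" if installed == expected else f"MISMATCH (expected {expected})"
    print(f"{package}: {installed} {status}")
```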

Hi, I would like to know which types of GPUs you used to train on these datasets, including both the small and the large ones. Thanks!

xinyangATK (Apr 26 '24)
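As an aside, a quick way to see which GPU (and how much memory) PyTorch can use on your own machine is the snippet below; it only queries standard torch.cuda properties and says nothing about the hardware the authors used.

```python
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        # total_memory is reported in bytes
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GiB")
else:
    print("No CUDA device visible to PyTorch")
```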