
add megnet and SchNet

Open ardunn opened this issue 4 years ago • 6 comments

The previous results got corrupted, so the experiment needs to be redone.

We should also add SchNet, as others are interested in it.

ardunn avatar Oct 10 '21 07:10 ardunn

tagging @chc273 here based on email thread

ardunn avatar Nov 23 '21 21:11 ardunn

I can add MegNet and SchNet results on the benchmarks using kgcnn. Would that work for MatBench? I opened a pull request with a first training result.

PatReis avatar Sep 13 '22 11:09 PatReis

Yes, that would be great! Especially if you are planning on submitting for multiple tasks! Have you already run the training and benchmarking, or is this prospective?

Thanks, Alex


ardunn avatar Sep 16 '22 18:09 ardunn

Yes, I have run SchNet on all structure benchmarks, but without hyperparameter optimization. I created pull request #184. I will run MegNet and DimeNet++ next.

PatReis avatar Sep 18 '22 15:09 PatReis

I added a MegNet benchmark in pull request #187. Quick question: if I were to rerun the training with optimized hyperparameters but the same model/version, should I overwrite the existing folder or add a completely new benchmark?

PatReis avatar Sep 26 '22 19:09 PatReis

@PatReis I would suggest submitting an entirely new benchmark, but make it very clear in the long description why a new benchmark was submitted and what the difference is between this benchmark and the previous version.

Some other suggestions:

- Make it very clear, both in writing and in dict form, what the hyperparameters are and what method was used to determine them. If that method goes beyond what was already published with MEGNet, make sure to cite it.
- Noting what kind of hardware the model was trained on might be of use as well, and/or just report the time taken for training on each of the training folds as hyperparameter info when you execute `.record`.
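A minimal sketch of how the timing and hardware info could be bundled per fold, assuming the usual `task.record(fold, predictions, params=...)` call accepts a free-form params dict. `make_fold_params` and the trivial `train_fn` stand-in are hypothetical helpers for illustration, not part of Matbench:

```python
import platform
import time


def make_fold_params(hyperparams, train_fn):
    """Train one fold and bundle the hyperparameters, hardware info,
    and wall-clock training time into a single dict, suitable for
    passing to task.record(fold, predictions, params=...)."""
    start = time.perf_counter()
    model = train_fn()  # placeholder for the real per-fold training
    elapsed = time.perf_counter() - start

    params = dict(hyperparams)
    params["hardware"] = platform.processor() or platform.machine()
    params["training_time_s"] = round(elapsed, 2)
    return model, params


# Example with a trivial stand-in for real training:
model, params = make_fold_params(
    {"learning_rate": 1e-3, "batch_size": 32},
    train_fn=lambda: "trained-model",
)
```

This keeps the per-fold training time alongside the actual hyperparameters, so the recorded metadata documents both what was tuned and what it cost to train.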

Also, thank you so much for the work on this! It is much appreciated!

ardunn avatar Sep 26 '22 21:09 ardunn