
About the results on ogbg-molpcba during finetuning

Open zhangdan0602 opened this issue 2 years ago • 3 comments

Hi, thanks for the exciting Graphormer. I have a question about using Graphormer v2 to pretrain on PCQM4Mv2 and finetune on ogbg-molpcba and ogbg-molhiv. The evaluated AP on molpcba after finetuning is '2022-03-13 12:44:53 | INFO | main | ap: 0.02584909547397629', while the leaderboard reports about 0.3. Similarly, the finetuned AUC on molhiv is about 0.3 ('| INFO | main | auc: 0.32'), while it is about 0.8 on the OGB leaderboard. What could be the reason? Perhaps the hyperparameters do not match, or the max-epoch is too small? Thank you very much.

zhangdan0602 avatar Mar 13 '22 04:03 zhangdan0602

Thanks for using Graphormer.

For PCBA: if this feature is urgent for you, please click the thumbs-up reaction on https://github.com/microsoft/Graphormer/issues/70, and we will raise its priority.

For HIV: could you share your Python environment and the exact commands you used to obtain this result? With the correct commands, you should get at least 0.8 AUC on HIV; see #90.

zhengsx avatar Mar 14 '22 08:03 zhengsx

Thank you. I tried pretraining and finetuning with Graphormer v2.0 last week. This week I used v1.0 to pretrain and finetune on ogbg-molhiv and ogbg-molpcba, and obtained the expected results.

zhangdan0602 avatar Mar 18 '22 04:03 zhangdan0602

Hi, thanks for your work. For PCBA, I can only find the 'pcqm4mv1_graphormer_base_for_molhiv' checkpoint for pretraining. However, its num-classes is 1, while PCBA has 128 classes. Can the pretrained 'pcqm4mv1_graphormer_base_for_molhiv' model be used to train on PCBA, and if so, how? Thanks very much.
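A common PyTorch recipe for this kind of head mismatch is to load only the backbone weights whose shapes match and let the new task head stay randomly initialized. A minimal sketch with a toy module (the class and layer names here are illustrative stand-ins, not Graphormer's actual checkpoint layout):

```python
import torch
import torch.nn as nn

# Toy stand-ins for a pretrained 1-class model and a 128-class target model.
class ToyGraphormer(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.encoder = nn.Linear(16, 16)             # shared backbone weights
        self.out_proj = nn.Linear(16, num_classes)   # task-specific head

    def forward(self, x):
        return self.out_proj(self.encoder(x))

pretrained = ToyGraphormer(num_classes=1)    # e.g. a molhiv-style checkpoint
target = ToyGraphormer(num_classes=128)      # e.g. a pcba-style model

# Keep only parameters whose shapes match the target model,
# i.e. drop the 1-class output head before loading.
state = pretrained.state_dict()
target_state = target.state_dict()
filtered = {k: v for k, v in state.items()
            if k in target_state and v.shape == target_state[k].shape}
missing = set(target_state) - set(filtered)  # head params left untouched
target.load_state_dict(filtered, strict=False)
```

After this, the 128-class head is trained from scratch during finetuning while the encoder starts from the pretrained weights.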

LUOyk1999 avatar Nov 07 '22 17:11 LUOyk1999