ML-GCN
Corrected the code
The code wasn't running, so I debugged it and now it's working (tested on torch version 1.7.1).
- async=True is deprecated, therefore replaced it with non_blocking=True.
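A minimal before/after sketch of that change, assuming the usual pattern of moving a batch to the GPU (the tensor here is just a placeholder):

import torch

x = torch.randn(8, 3, 448, 448)        # placeholder image batch
if torch.cuda.is_available():
    # old call (removed in newer PyTorch): x = x.cuda(async=True)
    x = x.cuda(non_blocking=True)      # drop-in replacement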
@take2rohit, hi! Have you managed to reproduce the results of the paper (or at least come close)? I tried several times with this repository and also tried to integrate this approach (the ML-GCN model) into my project, but the model doesn't train well (VOC dataset, the metric stays around 50-60 mAP).
Hi @kprokofi. I haven't tried reproducing the results; I was just using this code as boilerplate for other work.
Hi @kprokofi, did you manage to reproduce the results afterwards? Every time I train, the mAP only rises to 10-20.
I tried to reproduce the result, but the mAP is only about 10. Could you provide me with the training command?
https://github.com/kprokofi/ML-GCN - I couldn't reproduce the author's result exactly, but it got better: 93+ mAP.
Hi @kprokofi,
Any advice apart from the standard configs that you adopted? Any tips appreciated, thanks.
Gradient clipping was the bottleneck. Also, you could play with it and the learning rate.
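A self-contained sketch of what gradient clipping looks like in a training step; the max_norm value below is a placeholder to tune, not necessarily the repository's setting:

import torch
import torch.nn as nn

model = nn.Linear(2048, 20)                        # stand-in for the real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

x = torch.randn(4, 2048)
y = torch.randint(0, 2, (4, 20)).float()           # multi-hot labels
loss = nn.functional.binary_cross_entropy_with_logits(model(x), y)

optimizer.zero_grad()
loss.backward()
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=10.0)  # clip before the update
optimizer.step()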
Could you provide the command for training? Thanks.
pretrained=True, lr=0.01, I got 93.4 mAP.
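For what it's worth, a hedged sketch of those two settings (ImageNet-pretrained backbone plus lr=0.01); the actual script wires them through command-line arguments, so adapt as needed:

import torch
from torchvision import models

backbone = models.resnet101(pretrained=True)       # pretrained=True
# momentum/weight_decay below are common defaults, not necessarily the repo's values
optimizer = torch.optim.SGD(backbone.parameters(), lr=0.01,
                            momentum=0.9, weight_decay=1e-4)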
Hi, I tried to reproduce the VOC2007 result but found that if I train the model from scratch, the mAP rises to about 17 and stays there after 50 epochs. However, if I use the pre-trained model with the command provided on GitHub, the mAP quickly reaches 90 in about 5 epochs. So I wonder: how can I obtain that pretrained model, and is it possible to reproduce the author's result by training from scratch?
Excuse me, I have tried to run this code recently, but there are some problems when I run it on my computer. Can you help me?
I was running this on a server. What kind of problems did you run into?
pretrained=True
Appreciated! Now the training process looks good!
In VOC2007, ResNet101+GMP also achieves desirable results (93.*). In MS-COCO2014, the mAP is 83.0.
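In case it helps, a small sketch of global max pooling (GMP) over the ResNet-101 feature map; how it is wired into GCNResnet may differ from the repository's code:

import torch
import torch.nn as nn

gmp = nn.AdaptiveMaxPool2d(1)                      # global max pooling
features = torch.randn(4, 2048, 14, 14)            # dummy ResNet-101 output at 448x448 input
pooled = gmp(features).flatten(1)                  # shape: (4, 2048)
print(pooled.shape)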
Hi,
How could I load my own dataset to test with the pre-trained model?
Thanks.
Refer to this repository for more information. https://github.com/yu-gi-oh-leilei/ML-GCN_cvpr2019/blob/main/data/init.py
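A rough sketch of a custom dataset that mimics the ((image, filename, word_embeddings), target) tuples the repository's VOC/COCO loaders appear to return; the class and argument names here are hypothetical, so check the linked file for the exact interface:

import pickle
import torch
from PIL import Image
from torch.utils.data import Dataset

class MyMultiLabelDataset(Dataset):
    def __init__(self, samples, num_classes, inp_file, transform=None):
        # samples: list of (image_path, [class_index, ...]) pairs from your own annotations
        self.samples = samples
        self.num_classes = num_classes
        self.transform = transform
        with open(inp_file, 'rb') as f:            # pickled word embeddings (e.g. GloVe vectors)
            self.inp = pickle.load(f)

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, index):
        path, labels = self.samples[index]
        img = Image.open(path).convert('RGB')
        if self.transform is not None:
            img = self.transform(img)
        target = torch.zeros(self.num_classes)
        target[labels] = 1                         # multi-hot labels; the repo may encode negatives as -1
        return (img, path, self.inp), target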
Thanks for your help.
Hello, I am very interested in this project and would like to know how to train and test on my own dataset and, finally, output visualized results.
How do I set pretrained=True?
from torchvision import models

def gcn_resnet101(num_classes, t, pretrained=False, adj_file=None, in_channel=300):
    model = models.resnet101(pretrained=True)  # set pretrained = True here
    return GCNResnet(model, num_classes, t=t, adj_file=adj_file, in_channel=in_channel)
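For context, a rough usage sketch; t=0.4 follows the paper's threshold and the adjacency-file path follows the repository's data layout as I understand it, so verify both against your checkout:

# Assumes the gcn_resnet101 definition above and the repository's GCNResnet class.
model = gcn_resnet101(num_classes=20, t=0.4,
                      adj_file='data/voc/voc_adj.pkl', in_channel=300)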
Thank you so much!
See also https://github.com/yu-gi-oh-leilei/Multi-label-Image-Recognition for more information.