
About homography pretraining for outdoor scenes

Open zenmedou opened this issue 4 years ago • 5 comments

Hi, thank you for your great work!

I'm trying to train SuperGlue + SuperPoint on the MegaDepth dataset. As described in the paper, the weights for outdoor scenes are initialized from the homography model due to the limited number of scenes. I'm wondering how much this homography pretraining affects the final results. Is it possible to obtain similar results if I train the model from scratch on MegaDepth, or on a larger outdoor dataset with more scenes?
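For context, the homography pretraining in the paper generates self-supervised correspondences by warping a single image with random homographies. A rough stdlib-only sketch of sampling such a warp and mapping keypoints through it (the function names and all sampling ranges here are my own illustrative assumptions, not the paper's actual values):

```python
import math
import random

def random_homography(max_angle=0.3, max_scale=0.2, max_trans=0.1, max_persp=1e-3):
    """Sample a random 3x3 homography as rotation+scale, translation,
    and a small perspective component. Ranges are illustrative only."""
    a = random.uniform(-max_angle, max_angle)
    s = 1.0 + random.uniform(-max_scale, max_scale)
    tx = random.uniform(-max_trans, max_trans)
    ty = random.uniform(-max_trans, max_trans)
    p1 = random.uniform(-max_persp, max_persp)
    p2 = random.uniform(-max_persp, max_persp)
    return [
        [s * math.cos(a), -s * math.sin(a), tx],
        [s * math.sin(a),  s * math.cos(a), ty],
        [p1,               p2,              1.0],
    ]

def warp_point(H, x, y):
    """Apply homography H to a 2D point, with projective division."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w
```

Warping the detected keypoints of one image through such a sampled `H` yields exact ground-truth matches for free, which is what makes this a cheap pretraining signal.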

zenmedou avatar Sep 07 '20 03:09 zenmedou

After further investigation, it turns out that the homography pretraining may not be necessary if more scenes are used (with a split similar to DISK), the model is trained for longer with a slower learning rate decay, and positives and negatives are more carefully balanced. As such, I have recently obtained good results training from scratch on MegaDepth. I will be able to release more details later on.
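For reference, a "slower learning-rate decay" of the kind mentioned above could look like the following, a minimal sketch where the base learning rate, per-step decay factor, and decay start are illustrative assumptions rather than the values actually used:

```python
def lr_at_step(step, base_lr=1e-4, decay=0.999992, decay_start=100_000):
    """Constant LR until decay_start, then exponential decay per step.
    All hyperparameter values are illustrative assumptions."""
    if step <= decay_start:
        return base_lr
    return base_lr * decay ** (step - decay_start)

# A decay factor closer to 1 keeps the LR usable for far more steps:
slow = lr_at_step(1_000_000)               # gentle decay, LR still non-trivial
fast = lr_at_step(1_000_000, decay=0.99)   # aggressive decay, LR collapses
```

The point of the gentle schedule is to let the model keep learning over many epochs on the larger scene set instead of freezing early.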

sarlinpe avatar Sep 09 '20 22:09 sarlinpe

That's great! Looking forward to seeing more details.

zenmedou avatar Sep 11 '20 03:09 zenmedou

@zenmedou Hello, I also recently reproduced the training of SuperPoint + SuperGlue on MegaDepth, but ran into some problems. Would it be convenient for you to share an email address or QQ so I can consult with you? Thank you very much.

ZhouAo-ZA avatar Oct 20 '20 02:10 ZhouAo-ZA

I am very interested in training from scratch on MegaDepth. While trying, I ran into the following issues: 1) With 'batch_size = 8', training from scratch does not converge and ends with loss ≈ 0.5, whereas initializing from your released pretrained weights ('superglue_indoor/outdoor') lets the training loss converge normally to ≈ 0.1 (larger batch sizes give similar results). 2) With 'batch_size = 1', training from scratch converges well, ending with loss ≈ 0.1. Since BatchNorm sees only a single sample per mini-batch in that case, training mode ('superglue.train()') has to be used at test time as well. In this setting, is BatchNorm effectively equivalent to InstanceNorm? What causes this behavior? Is it related to the learning rate, or is some special data balancing required?
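As a side note on the BatchNorm/InstanceNorm question above: with a batch of one sample, the batch statistics that BatchNorm computes in training mode are exactly the per-sample statistics InstanceNorm uses, so for that mini-batch size the two normalizations coincide. A stdlib-only sketch of this for a single channel, with features flattened over H×W and illustrative values:

```python
import statistics

def _norm(values, mean, var, eps=1e-5):
    return [(v - mean) / (var + eps) ** 0.5 for v in values]

def batchnorm(batch, eps=1e-5):
    """Training-mode BatchNorm for one channel: statistics over ALL samples."""
    flat = [v for sample in batch for v in sample]
    mean = statistics.fmean(flat)
    var = sum((v - mean) ** 2 for v in flat) / len(flat)
    return [_norm(sample, mean, var, eps) for sample in batch]

def instancenorm(batch, eps=1e-5):
    """InstanceNorm for one channel: statistics per INDIVIDUAL sample."""
    out = []
    for sample in batch:
        mean = statistics.fmean(sample)
        var = sum((v - mean) ** 2 for v in sample) / len(sample)
        out.append(_norm(sample, mean, var, eps))
    return out

a = [0.2, -1.3, 4.0, 0.7]   # flattened (H*W) features of one sample
b = [2.5, -0.6, 1.1, 0.3]

# batch size 1: the two normalizations are identical computations
assert batchnorm([a]) == instancenorm([a])

# batch size 2: batch statistics mix the samples, so the outputs differ
assert batchnorm([a, b]) != instancenorm([a, b])
```

This also explains why evaluating in `train()` mode works with `batch_size = 1`: the model was effectively trained with per-sample statistics, and the accumulated running statistics (which `eval()` would use) never reflected what the network saw during training.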

archershot avatar Apr 03 '21 03:04 archershot

Could you tell me which MegaDepth scenes you use for training and evaluation? And what should the train/evaluation loss be when the network converges? Thanks a lot!

lawsonxwl avatar Aug 12 '21 07:08 lawsonxwl