Training Details
Hello once again,
I tried training a FANet-18 model on the Cityscapes dataset. I replaced the InPlaceABN layers with standard BatchNorm followed by an activation, since I need to export the trained model to ONNX for deployment in my application.
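In case it is relevant, this is roughly how I swap the layers out before export (a minimal sketch under my own assumptions: I match InPlaceABN by class name, assume its default leaky-ReLU activation, and the `FANet18` constructor is just a placeholder for my model code):

```python
import torch
import torch.nn as nn

def replace_inplace_abn(module: nn.Module) -> None:
    """Recursively replace InPlaceABN modules with BatchNorm2d + LeakyReLU."""
    for name, child in module.named_children():
        if child.__class__.__name__ == "InPlaceABN":
            # InPlaceABN fuses BN and activation; I assume its default
            # leaky_relu activation with slope 0.01 here.
            bn = nn.BatchNorm2d(child.num_features)
            act = nn.LeakyReLU(0.01, inplace=True)
            module._modules[name] = nn.Sequential(bn, act)
        else:
            replace_inplace_abn(child)

# model = FANet18(num_classes=19)      # placeholder for my model definition
# replace_inplace_abn(model)
# model.eval()
# dummy = torch.randn(1, 3, 1024, 2048)
# torch.onnx.export(model, dummy, "fanet18.onnx", opset_version=11)
```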
These are my training configurations, closely adapted from the paper (see the sketch after this list):
- Mini-batch SGD with batch size 4 (I only have 8 GB of GPU memory), weight decay = 5e-4, momentum = 0.9
- Initial learning rate (LR) = 1e-2, multiplied each iteration by the factor (1 - iter/max_iter)^2
- Data augmentation: horizontal flipping, random scaling (0.75 to 2)
- Training iterations: 80000
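For concreteness, this is the optimizer/scheduler setup corresponding to the list above (a minimal sketch; the `Conv2d` model is only a stand-in for my actual network and the training loop is elided):

```python
import torch
import torch.nn as nn

max_iter = 80000
model = nn.Conv2d(3, 19, 1)   # stand-in for the actual FANet-18 network

# Mini-batch SGD: lr = 1e-2, momentum = 0.9, weight decay = 5e-4
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2,
                            momentum=0.9, weight_decay=5e-4)

# Per-iteration decay: base LR multiplied by (1 - iter/max_iter)^2
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda it: (1.0 - it / max_iter) ** 2)

# for it in range(max_iter):
#     ...forward pass, OHEM loss, backward pass...
#     optimizer.step()
#     scheduler.step()
```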
I ended up with an OHEM cross-entropy loss of 0.3941 at the final iteration; I have yet to check the mIoU.
As a preliminary comparison, I would like to set this against BiSeNet, which I trained in a similar fashion but with auxiliary losses; it finished with an OHEM cross-entropy loss of 0.2947 and reached an mIoU of 0.63.
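To be explicit about what I mean by OHEM cross-entropy, this is roughly the variant I am using (a minimal sketch; the `thresh` and `min_kept` values are my own choices, not taken from your paper):

```python
import torch
import torch.nn.functional as F

def ohem_cross_entropy(logits, target, ignore_index=255,
                       thresh=0.7, min_kept=100000):
    """Average the cross-entropy loss only over 'hard' pixels, i.e. pixels
    whose predicted probability for the ground-truth class is at most a
    threshold (keeping at least `min_kept` pixels)."""
    # Per-pixel cross entropy, flattened to one value per pixel.
    pixel_losses = F.cross_entropy(logits, target, ignore_index=ignore_index,
                                   reduction="none").view(-1)
    with torch.no_grad():
        prob = F.softmax(logits, dim=1)
        tmp_target = target.clone()
        tmp_target[tmp_target == ignore_index] = 0
        # Probability assigned to the ground-truth class of each pixel.
        gt_prob = prob.gather(1, tmp_target.unsqueeze(1)).view(-1)
        valid = target.view(-1) != ignore_index
        gt_prob = gt_prob[valid]
    pixel_losses = pixel_losses[valid]
    # Sort pixels by confidence and pick a threshold that keeps at least
    # `min_kept` of the least confident ones.
    sorted_prob, order = torch.sort(gt_prob)
    k = min(min_kept, sorted_prob.numel())
    threshold = max(sorted_prob[k - 1].item(), thresh)
    kept = pixel_losses[order][sorted_prob <= threshold]
    return kept.mean()
```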
Could you please share some more details on the training, especially:
- How many iterations did you train the model for?
- What final cross-entropy loss did you end up with?
- Did you also use auxiliary losses for better convergence (and hence a lower loss)?
- Did you use any specifically modified version of the cross-entropy loss to achieve better convergence?
Is there anything else I am missing that would help me achieve better results?