iumyx2612
**Describe the feature** The ability to visualize the Effective Receptive Field (ERF) of a given layer. Picture taken from the SegFormer paper: https://arxiv.org/pdf/2105.15203.pdf **Motivation** I want to know why my model...
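For reference, a minimal sketch of one common way to measure the ERF (my own illustration, not an existing utility in any library mentioned here): backpropagate from the center activation of the layer of interest and inspect the input-gradient magnitude. The toy model and sizes below are placeholders; for an intermediate layer of a real backbone you would capture the activation with a forward hook instead.

```python
import torch
import torch.nn as nn

model = nn.Sequential(                          # stand-in backbone
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
)

x = torch.randn(1, 3, 64, 64, requires_grad=True)
out = model(x)                                  # (1, 16, 64, 64)
h, w = out.shape[-2:]
# Gradient of the center activation (summed over channels) w.r.t. the input
out[0, :, h // 2, w // 2].sum().backward()
erf = x.grad.abs().sum(dim=1)[0]                # (64, 64) heat map
# Averaging `erf` over many input images gives the ERF plots shown in papers
```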
**Describe the feature** Add Cross-Iteration Batch Normalization from: https://arxiv.org/abs/2002.05712 and Accumulate Gradient for training: https://github.com/WongKinYiu/ScaledYOLOv4/blob/yolov4-large/train.py#L77 Cross-Iteration BN helps models with a small batch size achieve better results, and Accumulate Gradient helps...
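For context, a minimal sketch of gradient accumulation (the idea behind the linked ScaledYOLOv4 code, not that code itself): step the optimizer only every `accumulate` iterations to simulate a larger effective batch size. The model, optimizer, and data here are stand-ins.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)                        # stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
data = [(torch.randn(2, 10), torch.randn(2, 1)) for _ in range(8)]

accumulate = 4                                  # effective batch = 4 * 2
optimizer.zero_grad()
for i, (x, y) in enumerate(data):
    loss = nn.functional.mse_loss(model(x), y)
    (loss / accumulate).backward()              # scale so gradients average
    if (i + 1) % accumulate == 0:               # step every `accumulate` iters
        optimizer.step()
        optimizer.zero_grad()
```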
**Describe the feature** **Motivation** I want to know why my experiments fail **Related resources** pytorch-grad-cam: https://github.com/jacobgil/pytorch-grad-cam MMDetection...
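Assuming the feature requested is Grad-CAM visualization, here is a short usage sketch based on the pytorch-grad-cam README (quoted from memory; the exact API differs between versions of the library). The class index 281 and the random input tensor are placeholders.

```python
import torch
from torchvision.models import resnet50
from pytorch_grad_cam import GradCAM
from pytorch_grad_cam.utils.model_targets import ClassifierOutputTarget
from pytorch_grad_cam.utils.image import show_cam_on_image

model = resnet50(pretrained=True)
target_layers = [model.layer4[-1]]              # layer to explain
input_tensor = torch.randn(1, 3, 224, 224)      # replace with a real image

cam = GradCAM(model=model, target_layers=target_layers)
targets = [ClassifierOutputTarget(281)]         # class index to explain
grayscale_cam = cam(input_tensor=input_tensor, targets=targets)[0, :]
# Overlay on the original image (rgb_img: float HxWx3 array in [0, 1]):
# visualization = show_cam_on_image(rgb_img, grayscale_cam, use_rgb=True)
```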
Why is GIoULoss still used in GFL? To my understanding, GFL consists of QFL and DFL: QFL handles joint classification and IoU scoring, while DFL handles bbox regression. So DFL is...
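For context, MMDetection's GFL head does wire all three losses together. The excerpt below is quoted from memory of `gfl_r50_fpn_1x_coco.py`, so check the repo for the exact fields and values.

```python
# Excerpt (from memory) of mmdet's GFL head config: QFL + DFL are used
# alongside a GIoU bbox loss, which is what the question is about.
bbox_head=dict(
    type='GFLHead',
    # ... other fields elided ...
    loss_cls=dict(
        type='QualityFocalLoss', use_sigmoid=True, beta=2.0, loss_weight=1.0),
    loss_dfl=dict(type='DistributionFocalLoss', loss_weight=0.25),
    loss_bbox=dict(type='GIoULoss', loss_weight=2.0))
```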
I trained AutoAlbument for 11 epochs on my dataset, and I think that's enough (the Average Parameter Change oscillates within a small range). The output config has an augmentation with really...
How do I resume training?
Hello, I'm building a multi-task classification model: it contains one feature extractor and two classification heads, so when implementing `__getitem__()` it needs to return one image with two labels...
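A minimal sketch of one way to do this (class and variable names are my own): return a tuple from `__getitem__`, and the default collate function will batch each element of the tuple separately.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class MultiTaskDataset(Dataset):
    def __init__(self, images, labels_a, labels_b):
        # images: tensor (N, C, H, W); labels_a / labels_b: tensors (N,)
        self.images = images
        self.labels_a = labels_a
        self.labels_b = labels_b

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        # One image and one label per head; the default collate_fn
        # stacks each element of the tuple into its own batch tensor.
        return self.images[idx], self.labels_a[idx], self.labels_b[idx]

# Usage: each batch unpacks into images and one label tensor per head.
ds = MultiTaskDataset(torch.randn(8, 3, 32, 32),
                      torch.randint(0, 5, (8,)),
                      torch.randint(0, 3, (8,)))
for imgs, ya, yb in DataLoader(ds, batch_size=4):
    pass  # forward through the shared extractor, then each head
```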
`ATSSHead` inherits from `AnchorHead`, which uses an `anchor_generator` of type `AnchorGenerator`. However, `FCOSHead` inherits from `AnchorFreeHead`, which uses an `anchor_generator` of type `MlvlPointGenerator`, and this is the anchor-free version. There...
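To make the difference concrete, here is a small comparison sketch, assuming an MMDetection 2.x version where both generators are importable from `mmdet.core.anchor` and expose `grid_priors` (import paths and signatures vary across versions, so treat this as illustrative):

```python
import torch
from mmdet.core.anchor import AnchorGenerator, MlvlPointGenerator

# Anchor-based: one or more boxes per feature-map location
anchor_gen = AnchorGenerator(strides=[8], ratios=[1.0], scales=[8])
anchors = anchor_gen.grid_priors([(4, 4)], device='cpu')
print(anchors[0].shape)  # (16, 4): x1, y1, x2, y2 for each location

# Anchor-free: a single point per feature-map location
point_gen = MlvlPointGenerator(strides=[8])
points = point_gen.grid_priors([(4, 4)], device='cpu')
print(points[0].shape)   # (16, 2): x, y for each location
```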
SPPCSP is more computationally heavy than plain SPP; is it better in terms of AP?
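To show where the extra computation comes from, a simplified side-by-side sketch (not the exact ScaledYOLOv4 modules; `Conv` here is a hypothetical conv + BN + activation block): SPPCSP wraps the pooling branch in extra 1x1/3x3 convs and adds a parallel shortcut branch before fusion.

```python
import torch
import torch.nn as nn

class Conv(nn.Module):
    """Hypothetical conv + BN + SiLU block used by both modules below."""
    def __init__(self, c1, c2, k=1):
        super().__init__()
        self.conv = nn.Conv2d(c1, c2, k, 1, k // 2, bias=False)
        self.bn = nn.BatchNorm2d(c2)
        self.act = nn.SiLU()
    def forward(self, x):
        return self.act(self.bn(self.conv(x)))

class SPP(nn.Module):
    """Plain SPP: one 1x1 conv, parallel max-pools, one fusing conv."""
    def __init__(self, c1, c2, k=(5, 9, 13)):
        super().__init__()
        c_ = c1 // 2
        self.cv1 = Conv(c1, c_, 1)
        self.m = nn.ModuleList(nn.MaxPool2d(x, 1, x // 2) for x in k)
        self.cv2 = Conv(c_ * (len(k) + 1), c2, 1)
    def forward(self, x):
        x = self.cv1(x)
        return self.cv2(torch.cat([x] + [m(x) for m in self.m], dim=1))

class SPPCSP(nn.Module):
    """CSP-wrapped SPP: extra convs around the pooling branch plus a
    parallel shortcut branch, hence more FLOPs than plain SPP."""
    def __init__(self, c1, c2, k=(5, 9, 13)):
        super().__init__()
        c_ = c2 // 2
        self.cv1 = Conv(c1, c_, 1)          # main branch in
        self.cv2 = Conv(c1, c_, 1)          # shortcut branch
        self.cv3 = Conv(c_, c_, 3)          # extra convs: this is where
        self.cv4 = Conv(c_, c_, 1)          # the added cost comes from
        self.m = nn.ModuleList(nn.MaxPool2d(x, 1, x // 2) for x in k)
        self.cv5 = Conv(c_ * (len(k) + 1), c_, 1)
        self.cv6 = Conv(c_, c_, 3)
        self.cv7 = Conv(2 * c_, c2, 1)      # fuse both branches
    def forward(self, x):
        x1 = self.cv4(self.cv3(self.cv1(x)))
        y1 = self.cv6(self.cv5(torch.cat([x1] + [m(x1) for m in self.m], dim=1)))
        y2 = self.cv2(x)
        return self.cv7(torch.cat((y1, y2), dim=1))
```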
Why did you choose Layer Attention instead of normal Channel Attention? The task-interactive features are concatenated after **N** consecutive Conv layers, so using Channel Attention could further separate the channels to...
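For readers unfamiliar with the distinction, a minimal sketch of the two mechanisms (my own illustration, not the paper's implementation): Layer Attention learns one weight per layer (N weights over the N stacked feature maps), while SE-style Channel Attention learns one weight per channel (N×C weights over the concatenated features).

```python
import torch
import torch.nn as nn

class LayerAttention(nn.Module):
    """Weights each of the N layer feature maps with a single scalar."""
    def __init__(self, channels, num_layers):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels * num_layers, num_layers),
            nn.Sigmoid(),
        )
    def forward(self, feats):  # feats: list of N tensors (B, C, H, W)
        x = torch.cat(feats, dim=1)                    # (B, N*C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))                # (B, N): one weight per layer
        return [f * w[:, i].view(-1, 1, 1, 1) for i, f in enumerate(feats)]

class ChannelAttention(nn.Module):
    """SE-style: one weight per channel, so C weights per input tensor."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )
    def forward(self, x):  # x: (B, C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))                # (B, C)
        return x * w.view(x.size(0), -1, 1, 1)
```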