Loss Functions for Common Tasks
🚀 Feature
Frequently re-used losses that can be added to bolts.
Motivation
Writing losses is quite repetitive. PyTorch ships losses that are written with deep interoperability with its C++ API, but most research losses aren't covered. Could these go into bolts/lightning? (I feel they would fit better in bolts than in lightning.)
These losses are also building blocks for other, more complicated losses.
Pitch
A non-exhaustive, tentative list of losses that are not in PyTorch but are used often (rough sketches for two of them follow the list):
- [ ] gIoU loss -> Used in DETR (will make porting DETR easier; present in fvcore)
- [ ] Focal loss -> Used in RetinaNet (it will land in torchvision as well, but we can have it here too for re-use; present in fvcore)
- [ ] Smooth L1 loss -> Used in Faster R-CNN and RetinaNet (already in torchvision and torch, but might be helpful here)
- [ ] JSD loss -> Used as an alternative to cross-entropy in classification
- [ ] Variants of cross-entropy -> Unsure if they are used often; came across them in other implementations
- [ ] Dice loss -> Used in U-Net and other segmentation models
- [ ] Sigmoid focal loss -> Modification of focal loss for segmentation
- [ ] Huber loss -> Used in EfficientDet and similar models; already implemented elsewhere
I'm unsure about losses in the audio and text domains. Someone can add them here as well.
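To make the idea concrete, here is a rough sketch of what two of the listed losses could look like as bolts-style functional losses, loosely following the common fvcore-style formulations. The function names, signatures, and defaults are only assumptions for illustration, not a proposed final API.

```python
import torch
import torch.nn.functional as F


def sigmoid_focal_loss(inputs, targets, alpha=0.25, gamma=2.0, reduction="mean"):
    """Focal loss on logits; `targets` holds 0/1 labels with the same shape as `inputs`."""
    probs = torch.sigmoid(inputs)
    ce = F.binary_cross_entropy_with_logits(inputs, targets, reduction="none")
    # Down-weight easy examples by (1 - p_t) ** gamma.
    p_t = probs * targets + (1 - probs) * (1 - targets)
    loss = ce * (1 - p_t) ** gamma
    if alpha >= 0:
        alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
        loss = alpha_t * loss
    if reduction == "mean":
        return loss.mean()
    if reduction == "sum":
        return loss.sum()
    return loss


def dice_loss(inputs, targets, eps=1e-6):
    """Soft Dice loss on logits for binary segmentation masks."""
    probs = torch.sigmoid(inputs).flatten(1)
    targets = targets.flatten(1)
    intersection = (probs * targets).sum(-1)
    union = probs.sum(-1) + targets.sum(-1)
    return 1.0 - ((2.0 * intersection + eps) / (union + eps)).mean()
```

Usage would then be a single call, e.g. `sigmoid_focal_loss(logits, masks.float())`, so models stop re-implementing the same math.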
Alternatives
Wait for them to land in fvcore or PyTorch. Until then we keep duplicating this code for models.
Additional context
I might be able to add PRs for these. These would mostly be migrations from repositories such as fvcore, etc., and I will cite the implementations that are used. Since these are likely to be adapted from other sources, I'm unsure about licensing and other policies.
cc @Borda @nateraw
I'll try to complete focal loss and smooth L1 loss this week #121
Triplet loss and its variants can be helpful for metric learning. In case anyone is interested, I can help; a rough sketch of one mining variant is below.
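As an illustration of the kind of variant meant here, this is a minimal sketch of batch-hard triplet mining on top of the standard margin formulation. The function name and arguments are assumptions for illustration only, not a concrete proposal.

```python
import torch


def batch_hard_triplet_loss(embeddings, labels, margin=0.3):
    """Batch-hard triplet loss: each anchor uses its hardest positive and hardest negative."""
    dists = torch.cdist(embeddings, embeddings, p=2)        # pairwise Euclidean distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)       # same-label mask (includes the diagonal)
    # Hardest positive: farthest embedding sharing the anchor's label.
    hardest_pos = dists.masked_fill(~same, 0.0).max(dim=1).values
    # Hardest negative: closest embedding with a different label.
    hardest_neg = dists.masked_fill(same, float("inf")).min(dim=1).values
    return torch.relu(hardest_pos - hardest_neg + margin).mean()
```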