ProgLearn
Contrastive learning transformer for PL Network
Background
Currently, the progressive learning network transformer is learned as a byproduct of optimizing a softmax cross-entropy loss for classification accuracy.
Contrastive loss (reference 1, reference 2) explicitly learns a transformer by penalizing samples of different classes that lie close to one another (see also margin loss). This may be better suited to the kNN voting used downstream, and it achieves state-of-the-art accuracy.
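To make the idea concrete, here is a minimal NumPy sketch of one margin-based variant of contrastive loss: same-class pairs are pulled together, and different-class pairs are pushed apart until they are at least `margin` away. The function name, signature, and default margin are illustrative, not taken from the ProgLearn codebase or the referenced papers.

```python
import numpy as np

def pairwise_contrastive_loss(z, labels, margin=1.0):
    """Margin-based contrastive loss over all pairs in a batch.

    z      : (n, d) array of embeddings (transformer outputs)
    labels : (n,) array of class labels
    margin : different-class pairs closer than this are penalized
    """
    z = np.asarray(z, dtype=float)
    labels = np.asarray(labels)
    n = len(z)
    # Pairwise Euclidean distances between embeddings.
    diffs = z[:, None, :] - z[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    same = labels[:, None] == labels[None, :]
    off_diag = ~np.eye(n, dtype=bool)  # exclude self-pairs
    pos = same & off_diag              # same-class pairs: pull together
    neg = ~same                        # different-class pairs: push apart
    pos_loss = (dists[pos] ** 2).mean() if pos.any() else 0.0
    neg_loss = (np.maximum(0.0, margin - dists[neg]) ** 2).mean() if neg.any() else 0.0
    return pos_loss + neg_loss
```

With well-separated classes the loss is near zero; when different-class embeddings coincide, each such pair contributes the full squared margin. Other forms (e.g. the InfoNCE-style supervised contrastive loss) differ in how negatives are weighted, which is part of what the proposed validation would compare.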
See official implementation here.
Proposed feature: implement contrastive loss.
Validate by comparing classification accuracy, then compare transfer efficiency, and determine which form of contrastive loss works best.
Prior experiments
I had experimented with this a bit. See the attempted contrastive learning implementation here, and preliminary transfer efficiency benchmarks for variations in the contrastive loss layers.
Hi, I am looking into addressing this issue for NDD 2021-2022.
Hello, I'm also looking into this issue for NDD 2021.