
Contrastive learning transformer for PL Network

Open rflperry opened this issue 4 years ago • 2 comments

Background

Currently, the progressive learning network transformer is learned as a byproduct of optimizing the softmax objective loss for classification accuracy.

Contrastive loss (reference 1, reference 2) explicitly learns a transformer: it pulls same-class samples together in the embedding space and penalizes different-class samples that lie close to one another (see also margin loss). The resulting representation may be better suited to the kNN voter applied downstream, and contrastive methods have shown state-of-the-art accuracy.
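As a minimal sketch of the idea (not the official implementation linked below): a pairwise contrastive loss in the style of Hadsell et al., where same-class pairs are attracted and different-class pairs closer than a margin are repelled. The function name and margin default are illustrative choices, not part of ProgLearn's API.

```python
import numpy as np

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pairwise contrastive loss sketch (Hadsell et al. style).

    z1, z2     : embedding vectors produced by the transformer
    same_class : 1 if the pair shares a label, else 0
    margin     : different-class pairs closer than this are penalized

    Same-class pairs pay their squared distance (pulled together);
    different-class pairs pay the squared margin shortfall (pushed apart).
    """
    d = np.linalg.norm(z1 - z2)
    return same_class * d**2 + (1 - same_class) * max(0.0, margin - d)**2
```

A same-class pair at zero distance incurs zero loss, while a different-class pair at zero distance incurs the full `margin**2` penalty; different-class pairs already beyond the margin incur nothing.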

See official implementation here.

Proposed feature: implement contrastive loss.

Validate by comparing classification accuracy, then comparing transfer efficiency. Determine which form of contrastive loss works best.
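The accuracy comparison could look something like the following: train the transformer under each candidate loss, embed a held-out set, and score a kNN classifier on the embeddings (a stand-in for ProgLearn's kNN voter). This helper is purely illustrative; the name and signature are assumptions, not ProgLearn code.

```python
import numpy as np

def knn_accuracy(train_z, train_y, test_z, test_y, k=3):
    """Score a simple k-NN classifier on learned embeddings.

    train_z, test_z : arrays of embedding vectors (n_samples, dim)
    train_y, test_y : integer label arrays
    Returns the fraction of test points whose k nearest training
    embeddings (Euclidean distance) have the correct majority label.
    """
    preds = []
    for z in test_z:
        dists = np.linalg.norm(train_z - z, axis=1)
        neighbors = train_y[np.argsort(dists)[:k]]
        labels, counts = np.unique(neighbors, return_counts=True)
        preds.append(labels[np.argmax(counts)])
    return float(np.mean(np.array(preds) == test_y))
```

Running this on embeddings from the softmax-trained transformer versus each contrastive variant would give the head-to-head accuracy numbers before moving on to transfer efficiency.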

Prior experiments

I had fiddled with this a bit. See attempted contrastive learning implementation here and preliminary transfer efficiency benchmarks for variations in the contrastive loss layers.

rflperry avatar Jan 28 '21 19:01 rflperry

Hi, I am looking into addressing this issue for NDD 2021-2022.

Dante-Basile avatar Oct 07 '21 17:10 Dante-Basile

Hello, I'm also looking into this issue for NDD 2021.

waleeattia avatar Oct 07 '21 17:10 waleeattia