
Can SupCon loss be used in multi-label classification?

Open · littttttlebird opened this issue 3 years ago · 3 comments

I have a text multi-label classification task; can I use SupCon loss? The SupCon loss would be accumulated over every label view. For example, with batch labels

labels = [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 1, 1]]

- from the view of label 0: positive examples = {0, 2}, negative examples = {1, 3}
- from the view of label 1: positive examples = {1, 2, 3}, negative examples = {0}
- from the view of label 2: positive examples = {0, 1, 3}, negative examples = {2}

Is this setting reasonable?
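
A minimal PyTorch sketch of this per-label view construction (the tensor and names are illustrative, not from the repo):

```python
import torch

labels = torch.tensor([[1, 0, 1],
                       [0, 1, 1],
                       [1, 1, 0],
                       [0, 1, 1]])  # the batch labels above

for k in range(labels.shape[1]):
    col = labels[:, k]
    positives = torch.nonzero(col == 1).flatten().tolist()
    negatives = torch.nonzero(col == 0).flatten().tolist()
    print(f"view of label {k}: positives = {positives}, negatives = {negatives}")
```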

littttttlebird · Sep 06 '21

This is an interesting question! I think in this case SupCon might not be as good as simply using binary cross-entropy for each label.
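
For reference, a minimal sketch of the per-label binary cross-entropy baseline suggested here, using PyTorch's `nn.BCEWithLogitsLoss` (the variable names are illustrative):

```python
import torch
import torch.nn as nn

criterion = nn.BCEWithLogitsLoss()   # averages over batch and labels
logits = torch.randn(4, 3)           # (batch, num_labels) raw scores
targets = torch.tensor([[1., 0., 1.],
                        [0., 1., 1.],
                        [1., 1., 0.],
                        [0., 1., 1.]])
loss = criterion(logits, targets)
```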

HobbitLong · Dec 12 '21

What about using multiple classifiers, one per label, on top of a shared encoder (ResNet in this repo), with the total loss being the sum of the per-label SupCon losses? It is just an idea; I have no idea whether it works.
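
A hedged sketch of this idea, assuming the repo's `SupConLoss` from `losses.py` (its `forward(features, labels=None, mask=None)` expects L2-normalized features of shape `(batch, n_views, dim)`; the helper name below is made up):

```python
import torch
from losses import SupConLoss  # from this repo

criterion = SupConLoss(temperature=0.07)

def sum_of_per_label_supcon(features, labels):
    """features: (batch, n_views, dim); labels: (batch, num_labels) binary."""
    total = 0.0
    for k in range(labels.shape[1]):
        # Each label column becomes a binary SupCon problem. Note that plain
        # SupCon also treats the 0-class samples as positives of each other,
        # which differs from the per-view sets in the original question.
        total = total + criterion(features, labels=labels[:, k])
    return total
```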

Ywandung-Lyou · Jan 07 '22

Would it be sufficient to just modify the mask and my input labels so that the mask correctly identifies data points that share at least one class, or is it necessary to modify more than just the mask in this loss function?
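
The repo's `SupConLoss` already accepts a precomputed `mask` in its `forward()`, so the mask may indeed be the only change needed. A sketch of such a mask (an assumption, not the repo's code):

```python
import torch

def shared_class_mask(labels):
    """labels: (batch, num_labels) binary. mask[i, j] = 1 iff samples i and j
    share at least one positive class."""
    shared = labels.float() @ labels.float().T  # counts of shared classes
    return (shared > 0).float()

# usage: loss = criterion(features, mask=shared_class_mask(batch_labels))
# Caveat: an anchor that shares no class with any other sample in the batch
# has zero positives, which makes the loss NaN; such anchors may need
# filtering before the loss is computed.
```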

dbushpw · May 26 '23