
Which performs better: confidence only or learning loss?

Open williamwfhuang opened this issue 3 years ago • 4 comments

Hi Mephisto, I have a question I'd like to discuss with you.

As the title says, I don't know which approach performs better under different settings.

If, in each new cycle, we simply add the images the plain network is least confident about to the training set, we might match or even beat the experimental results in this paper.

Have you tried this experimental setting?

Best regards, William

williamwfhuang avatar Jan 06 '21 02:01 williamwfhuang

Could you clarify what you mean by 'weak img by confidence'? I don't quite follow.

Mephisto405 avatar Jan 06 '21 03:01 Mephisto405

Sorry for the confusion. The aim of "Learning loss" is to use the loss predicted by the loss module to find the images that are weak (hard) for the current model, and then add those images to the labeled set in each cycle. So the principle may be the same as directly using the confidence behind the softmax. I don't know which is better, or how it would differ from the experiments in this paper. Have you ever tried an experiment using the "confidence behind softmax"?

williamwfhuang avatar Jan 07 '21 09:01 williamwfhuang

By "weak img by confidence" I mean images with high loss for the current model. ~ Buddy

williamwfhuang avatar Jan 07 '21 09:01 williamwfhuang
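
(Editor's note: the "confidence behind softmax" baseline being discussed is usually called least-confidence sampling. A minimal sketch of that query strategy, independent of the paper's code, is below; `softmax` and `least_confident_indices` are hypothetical names introduced here for illustration.)

```python
import math

def softmax(logits):
    # Numerically stable softmax over one sample's logits.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def least_confident_indices(all_logits, budget):
    """Rank unlabeled samples by 1 - max softmax probability
    (higher = less confident) and return the `budget` most
    uncertain indices to query in the next cycle."""
    scores = []
    for i, logits in enumerate(all_logits):
        probs = softmax(logits)
        scores.append((1.0 - max(probs), i))
    scores.sort(reverse=True)
    return [i for _, i in scores[:budget]]
```

For example, given logits for three unlabeled samples, a near-uniform prediction is selected before a confident one. The learning-loss approach instead ranks by the loss module's predicted loss, which can capture uncertainty beyond the softmax output (e.g. for regression tasks).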

Aha, I understand now. Unfortunately, I have not tested that kind of approach. I'm not sure, but I think you can find some papers that use that approach, since it is straightforward.

Mephisto405 avatar Jan 07 '21 16:01 Mephisto405