Learning-Loss-for-Active-Learning
Which performs better: softmax confidence only, or learning loss?
Hi Mephisto, I have a question I'd like to discuss with you.
As the title says, I don't know which approach is better under different settings.
If, in each new cycle, we simply added the weak images (selected by the confidence of the plain network) to the training set, we might match or even beat the experimental results in this paper.
Have you tried this experimental setting?
Best regards, William
Can you clarify what you mean by 'weak img by confidence'? I couldn't follow it.
Sorry for the confusion. The aim of "Learning Loss" is to use the loss-prediction module to find the images that are weakest (hardest) for the current model, and then add those weak images to the labeled set in each cycle. So the principle might be the same as directly using the confidence behind the softmax. I don't know which is better or how it differs from this paper's experiments. Have you ever tried the experiment using "confidence behind softmax"?
By "weak img by confidence" I mean images with high loss for the current model. ~ Buddy
Aha, I understand now. Unfortunately, I have not tested that kind of approach. I'm not sure, but I think you can find papers that use it, since it is straightforward.
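For reference, the "confidence behind softmax" baseline discussed above is usually implemented as least-confidence sampling: score each unlabeled image by the top softmax probability and query the lowest-scoring ones. This is a minimal sketch, not code from this repo; the function name and the toy pool are illustrative, and it assumes you already have the model's softmax outputs as a NumPy array.

```python
import numpy as np

def least_confidence_query(probs: np.ndarray, k: int) -> np.ndarray:
    """Pick the k unlabeled samples the model is least confident about.

    probs: (num_samples, num_classes) softmax outputs, rows summing to 1.
    The score is the top-class probability; a low score corresponds to a
    'weak image' (likely high loss) in the discussion above.
    """
    confidence = probs.max(axis=1)       # top-class probability per sample
    return np.argsort(confidence)[:k]    # indices of the k least confident

# Toy unlabeled pool: 4 samples, 3 classes.
pool = np.array([
    [0.98, 0.01, 0.01],  # very confident
    [0.40, 0.35, 0.25],  # uncertain
    [0.70, 0.20, 0.10],
    [0.34, 0.33, 0.33],  # most uncertain
])
print(least_confidence_query(pool, 2))  # -> [3 1]
```

In an active-learning loop, the selected indices would be sent for labeling and added to the training set each cycle, in place of the learning-loss module's ranking.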