
Adaptive weight of the LC loss

Open wonda opened this issue 5 years ago • 4 comments

Hello, I found the calculation of the adaptive weight of less-forget constraint different from the description in the paper. Did I misunderstand this part or miss some details? https://github.com/hshustc/CVPR19_Incremental_Learning/blob/e5a90aed7640f3b00e5b1a8dfb5376c1628bfe6a/cifar100-class-incremental/class_incremental_cosine_cifar100.py#L207

wonda avatar Nov 13 '19 07:11 wonda

Hello, I think there is no problem with the adaptive weight: `out_features1 + out_features2` is the number of old classes in the current session, and `args.nb_cl` is the number of novel classes in the current session. This matches the description in the paper.
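For reference, here is a minimal sketch of what that line of the code computes. The function and variable names (`adaptive_lambda`, `lamda_base`, `n_old`, `n_new`) are illustrative, not the ones in the repository; the point is that the old-class count ends up in the numerator under the square root:

```python
import math

def adaptive_lambda(lamda_base, n_old, n_new):
    """Adaptive weight for the less-forget constraint.

    lamda_base : base coefficient (args.lamda in the repo)
    n_old      : number of old classes (out_features1 + out_features2)
    n_new      : number of novel classes in this session (args.nb_cl)
    """
    # The code scales the base weight by sqrt(n_old / n_new),
    # so the constraint grows stronger as old classes accumulate.
    return lamda_base * math.sqrt(n_old / n_new)

# e.g. a CIFAR-100 session with 50 old classes and 10 novel classes
print(adaptive_lambda(5.0, 50, 10))  # 5 * sqrt(5) ≈ 11.18
```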

But I can't understand the use of `class_mean` in #5. Do you think there is a problem there?

cxy1996 avatar Nov 27 '19 03:11 cxy1996

@cxy1996 Thanks for your reply. In eq. (7) of the paper, the number of novel classes is in the numerator, but in the code it is in the denominator. Perhaps there is a mistake in that equation.

wonda avatar Nov 27 '19 08:11 wonda

@wonda You're right. I missed that.

cxy1996 avatar Dec 02 '19 14:12 cxy1996

> Hello, I found the calculation of the adaptive weight of less-forget constraint different from the description in the paper. Did I misunderstand this part or miss some details? https://github.com/hshustc/CVPR19_Incremental_Learning/blob/e5a90aed7640f3b00e5b1a8dfb5376c1628bfe6a/cifar100-class-incremental/class_incremental_cosine_cifar100.py#L207

I also cannot understand this. And I found that with this code, the result for Ours-CNN is slightly lower than Ours-NME. I don't know whether that is because I only did a single run.

JoyHuYY1412 avatar Feb 05 '20 12:02 JoyHuYY1412