
Label mapping at task > 0

Areeb2735 opened this issue 2 years ago • 3 comments

Hi.

Thank you for this great Library for Continual Learning datasets.

I wanted to ask about the label mapping when we are at task 2 of class-incremental training. Let's say I am training on CIFAR100 with an increment of 10 classes per task. The first task will have labels [0,1,2,3,4,5,6,7,8,9]. For the second task, the labels are [10,11,12,13,14,15,16,17,18,19]. Do you map these back to [0,1,2,3,4,5,6,7,8,9], or do you use some other loss function instead of cross entropy? If you map them, could you please point out the relevant code? And how do you handle this at inference time?
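To make the question concrete, here is a minimal sketch of the remapping being asked about. This is a hypothetical helper, not part of continuum's API; it assumes tasks of equal size (`increment=10`) so that global class ids can be shifted back into a local [0, 10) range:

```python
def to_local_labels(labels, task_id, increment=10):
    """Hypothetical remapping: shift the global class ids of a task
    back to local labels in [0, increment). Assumes every task holds
    exactly `increment` consecutive classes."""
    return [y - task_id * increment for y in labels]

# Task 1 of CIFAR100 with increment=10 covers global classes 10..19:
print(to_local_labels([10, 11, 19], task_id=1))  # -> [0, 1, 9]
```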

Thanks in Advance.

Areeb2735 avatar Oct 31 '23 09:10 Areeb2735

Hi @Areeb2735 , thanks for your question. We do not remap the class values in class-incremental scenarios, but we usually extend the output layer size. You can use the loss function of your choice to train. As long as you do not need the task index for inference, you can do whatever you want and stay within the "class incremental" framework. Have a nice day
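The "extend the output layer size" idea can be sketched as follows. This is an illustrative NumPy sketch (not continuum's code): when a new task arrives, the classification head grows by the number of new classes while the already-trained rows are kept untouched, so old class logits are preserved and no label remapping is needed:

```python
import numpy as np

def extend_head(weight, bias, new_classes, rng=None):
    """Grow a linear classification head by `new_classes` outputs,
    keeping the already-trained rows untouched. Sketch of the
    'extend the output layer' idea, not continuum's actual code."""
    rng = rng or np.random.default_rng(0)
    out_dim, in_dim = weight.shape
    new_w = rng.normal(scale=0.01, size=(out_dim + new_classes, in_dim))
    new_b = np.zeros(out_dim + new_classes)
    new_w[:out_dim] = weight  # preserve weights of previously seen classes
    new_b[:out_dim] = bias
    return new_w, new_b

# Task 0 trained a 10-way head; task 1 extends it to 20 outputs:
w, b = np.zeros((10, 512)), np.zeros(10)
w2, b2 = extend_head(w, b, new_classes=10)
print(w2.shape)  # (20, 512)
```

With this setup, cross entropy over the enlarged output works directly on the global labels 10-19 at task 1.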

TLESORT avatar Oct 31 '23 09:10 TLESORT

Thank you.

So, by saying 'independent classifiers' in the paper, you meant extending the output layer size?

Areeb2735 avatar Oct 31 '23 09:10 Areeb2735

And this makes sense when you are using a rehearsal buffer. What if we do not use data from the previous task? I guess we would be required to make a new classifier, and then we would need to map the labels back to between 0 and 9. Please do let me know if I am wrong.
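The separate-classifier-per-task setup described above (closer to task-incremental learning) can be sketched like this. Everything here is hypothetical, assuming equal task sizes of 10 classes: each task gets its own head trained on local labels in [0, 10), and inference then needs the task id both to select the head and to map the local prediction back to a global class id:

```python
import numpy as np

increment = 10
heads = {}  # task_id -> (weight, bias); one independent head per task

def add_head(task_id, in_dim=512, rng=None):
    """Create a fresh 10-way classifier for a new task (sketch)."""
    rng = rng or np.random.default_rng(task_id)
    heads[task_id] = (rng.normal(scale=0.01, size=(increment, in_dim)),
                      np.zeros(increment))

def predict(features, task_id):
    """Pick the head for `task_id`, take the local argmax, and shift
    it back to the global label space."""
    w, b = heads[task_id]
    local = int(np.argmax(w @ features + b))
    return local + task_id * increment

add_head(0)
add_head(1)
feat = np.zeros(512)
assert 10 <= predict(feat, task_id=1) < 20  # task 1 covers classes 10..19
```

Note the trade-off the maintainer points out: because `predict` requires the task id, this leaves the "class incremental" framework.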

Areeb2735 avatar Oct 31 '23 11:10 Areeb2735