Distillation loss is zero in LwF plugin for `MultiTaskModule`
Hi,
This is related to https://github.com/ContinualAI/avalanche/issues/1116.
I think there is still something wrong, because `dist_loss` is zero. In the case where `isinstance(self.prev_model, MultiTaskModule)` is `True`, the check `task_id in self.prev_classes` always evaluates to `False`, so `dist_loss` is always zero. I believe the cause is that `task_id` is an `int`, whereas the keys of `self.prev_classes` are `str`.
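A minimal sketch of the suspected mismatch (the dictionary contents below are illustrative, not copied from the plugin): an `int` task id never matches `str` keys, so the membership test silently fails and the distillation term is skipped.

```python
# Illustrative stand-in for self.prev_classes: keys stored as str.
prev_classes = {"0": {0, 1}, "1": {2, 3}}

# task_id as produced by the training loop: an int.
task_id = 1

# The check as written in the plugin: int vs. str keys -> always False,
# so the distillation branch is never taken and dist_loss stays 0.
print(task_id in prev_classes)       # False

# Normalizing the key type on lookup makes the check succeed:
print(str(task_id) in prev_classes)  # True
```

Converting `task_id` to `str` at lookup time (or storing the keys as `int` in the first place) would make the branch reachable again.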
Please let me know if that's the case, or if I am missing something.
Thanks, Woj