pytorch-ewc
Unofficial PyTorch implementation of DeepMind's PNAS 2017 paper "Overcoming Catastrophic Forgetting"
When we train the current task, do we use data from the previous task? EWC needs task A's data to compute the Fisher information; when we train task B, how...
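A minimal sketch of the two-phase answer this question is after (my own illustration, not this repo's code): task A data is needed only once, after training on A, to estimate the diagonal Fisher; during task B training, the stored Fisher and a snapshot of the task-A parameters suffice, and task A data is never revisited. The model and helper names below are hypothetical.

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(4, 3)  # stand-in for the real network

def estimate_fisher(model, data, targets):
    """Diagonal Fisher estimate from task A samples (done once, after task A)."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, y in zip(data, targets):
        model.zero_grad()
        logp = F.log_softmax(model(x.unsqueeze(0)), dim=1)
        F.nll_loss(logp, y.unsqueeze(0)).backward()
        for n, p in model.named_parameters():
            fisher[n] += p.grad.detach() ** 2
    return {n: f / len(data) for n, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, lam=1.0):
    """Quadratic penalty added to the task B loss; needs no task A data."""
    loss = 0.0
    for n, p in model.named_parameters():
        loss = loss + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return lam / 2 * loss

# After task A: snapshot params and Fisher, then task A data can be discarded.
xs, ys = torch.randn(8, 4), torch.randint(0, 3, (8,))
fisher = estimate_fisher(model, xs, ys)
old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
penalty = ewc_penalty(model, fisher, old_params)  # zero at the snapshot itself
```

At the snapshot the penalty is exactly zero; it grows as task B training moves parameters that the Fisher marks as important for task A.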
In the code, we can get the negative log-likelihood via F.log_softmax. How can I get the log-likelihood for a regression problem??
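One common answer (my sketch, not this repo's code): assume a Gaussian observation model for regression. With fixed variance sigma^2, the log-likelihood of target y given prediction mu is -0.5*log(2*pi*sigma^2) - (y - mu)^2 / (2*sigma^2), so MSE is, up to constants, the negative log-likelihood.

```python
import math
import torch

def gaussian_log_likelihood(mu, y, sigma=1.0):
    # log N(y; mu, sigma^2), elementwise
    var = sigma ** 2
    return -0.5 * math.log(2 * math.pi * var) - (y - mu) ** 2 / (2 * var)

mu = torch.tensor([0.0], requires_grad=True)
y = torch.tensor([0.0])
ll = gaussian_log_likelihood(mu, y)
# at mu == y the quadratic term vanishes; only the constant -0.5*log(2*pi) remains
```

If the variance should also be predicted, PyTorch ships `torch.nn.GaussianNLLLoss`, which computes this negative log-likelihood with a learned variance input.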
I found that performing forward-backward passes in a loop is much faster than using autograd.grad with retain_graph=True. Your current code is: ``` loglikelihood_grads = zip(*[autograd.grad( l, self.parameters(), retain_graph=(i < len(loglikelihoods)) )...
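For contrast, a sketch of the loop variant being proposed (my illustration, under the assumption that only squared gradients are needed): one backward per sample, accumulating squared gradients into the Fisher estimate, so each sample's graph is freed immediately instead of being kept alive by retain_graph=True.

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(4, 3)  # stand-in network
xs = torch.randn(16, 4)
ys = torch.randint(0, 3, (16,))

fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
for x, y in zip(xs, ys):
    model.zero_grad()
    # log-likelihood of the true class for this single sample
    ll = F.log_softmax(model(x.unsqueeze(0)), dim=1)[0, y]
    ll.backward()  # graph for this sample is freed right away
    for n, p in model.named_parameters():
        fisher[n] += p.grad.detach() ** 2
fisher = {n: f / len(xs) for n, f in fisher.items()}
```

The memory profile is flat (one sample's graph at a time), which is usually why this beats a single `autograd.grad` call that must retain every sample's graph.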
Great implementation, many thanks. I'm new to continual learning and a little confused about its problem setting. I see that you generated different permuted MNIST data for every...
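For readers new to the setting, a sketch of the standard permuted-MNIST protocol (my illustration; details may differ from this repo): each task fixes one random pixel permutation and applies it to every image, so every task is a different but equally hard classification problem over the same label set.

```python
import torch

g = torch.Generator().manual_seed(0)
perm = torch.randperm(784, generator=g)  # one fixed permutation per task

def permute_batch(images, perm):
    """Apply the same pixel permutation to every image in a batch."""
    flat = images.view(images.size(0), -1)
    return flat[:, perm].view_as(images)

imgs = torch.rand(4, 1, 28, 28)
out = permute_batch(imgs, perm)  # same pixels, shuffled positions
```

Because the permutation only relabels pixel positions, the pixel values of each image are preserved exactly; only their spatial arrangement changes.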
I am trying to run EWC on my dataset with a ResNet-50 model. While updating the Fisher matrix using your function, my code reports CUDA out of memory at "log_liklihoods.append(output[:,...
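A hypothetical workaround sketch (not this repo's code): the OOM comes from storing every batch's log-likelihood tensor in a list, since each stored tensor pins its whole autograd graph. Backpropagating each batch immediately and accumulating squared gradients bounds memory to a single batch's graph, which matters for a model as large as ResNet-50. The tiny model and loader below are stand-ins.

```python
import torch
import torch.nn.functional as F

model = torch.nn.Linear(8, 5)  # stand-in for the resnet50 in the issue
loader = [(torch.randn(2, 8), torch.randint(0, 5, (2,))) for _ in range(4)]

fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
n_samples = 0
for x, y in loader:
    model.zero_grad()
    logp = F.log_softmax(model(x), dim=1)
    # mean log-likelihood of the true classes for this batch only
    logp[torch.arange(len(y)), y].mean().backward()
    for n, p in model.named_parameters():
        fisher[n] += p.grad.detach() ** 2 * len(y)
    n_samples += len(y)
fisher = {n: f / n_samples for n, f in fisher.items()}  # per-sample average
```

Nothing from a previous batch stays on the GPU: after each `backward()` the batch's graph is released, so peak memory no longer grows with dataset size.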
Bumps [torch](https://github.com/pytorch/pytorch) from 1.0.1.post2 to 2.2.0. Release notes sourced from torch's releases: PyTorch 2.2 (FlashAttention-v2, AOTInductor) — highlights, backwards-incompatible changes, deprecations, new features, improvements, bug fixes...