
PyTorch reimplementation of "Gradient Surgery for Multi-Task Learning"

Pytorch-PCGrad issues (6 results)

Hi, thanks for your work! I'm wondering whether this repository supports AMP (automatic mixed precision), or how I can enable it?

Setting `reduction = 'sum'` does not work because of [this line](https://github.com/WeiChengTseng/Pytorch-PCGrad/blob/e987ac603fa1accd386820a985a6dc2fd92dec5b/pcgrad.py#L58): `if self._reduction: merged_grad[shared] = torch.stack([g[shared] for g in pc_grad]).mean(dim=0)` — because if reduction is a string, self._reduction is...
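A minimal sketch of the bug this issue describes: any non-empty string is truthy in Python, so `if self._reduction:` always takes the mean branch, even when `reduction == 'sum'`. The helper below mirrors the names from the snippet above but is an illustrative fix with explicit string comparison, not the repository's actual patch.

```python
import torch

def merge_shared(pc_grad, shared, reduction='mean'):
    """Merge the shared components of per-task gradients.

    Hypothetical helper: compares the reduction string explicitly
    instead of relying on its truthiness.
    """
    stacked = torch.stack([g[shared] for g in pc_grad])
    if reduction == 'mean':
        return stacked.mean(dim=0)
    elif reduction == 'sum':
        return stacked.sum(dim=0)
    raise ValueError(f"unsupported reduction: {reduction!r}")

# Two task gradients over two shared parameters:
grads = [torch.tensor([1.0, 2.0]), torch.tensor([3.0, 4.0])]
shared = torch.tensor([True, True])
print(merge_shared(grads, shared, reduction='sum'))   # tensor([4., 6.])
print(merge_shared(grads, shared, reduction='mean'))  # tensor([2., 3.])
```

With the truthiness check from the original line, both calls would return the mean, silently ignoring `reduction='sum'`.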

Thanks for the simple and elegant implementation! I tried running your code as-is on the Multi-MNIST data and failed to reproduce the results. I ran main_multi_mnist.py without changing any hyperparameters...

Hi! I have tried Pytorch-PCGrad in my project. My network was a MobileNetV2 followed by two task heads, and these two task heads have different parameters. When I ran pytorch-pcgrad,...

**Is your feature request related to a problem? Please describe.** A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] **Describe the solution you'd...

Hello! The code runs fine on a single GPU, but the following error occurs when running on multiple GPUs:
File "/home/ps/workplace/ruijia.yang/MoonHunter/libs/mmcv/mmcv/runner/base_runner.py", line 309, in call_hook
    getattr(hook, fn_name)(self)
File "/home/ps/workplace/shuang.wang/MoonHunter/apps/nn/../../project/optimizer/optimizer_pc.py", line 267, in after_train_iter
    pc_optimizer.pc_backward(runner.outputs["head_loss"], self.G, self.pcG)
File "/home/ps/workplace/shuang.wang/MoonHunter/apps/nn/../../project/optimizer/optimizer_pc.py", line 58, in pc_backward
    grads, shapes, has_grads, grads_dict = self._pack_grad(objectives)
File "/home/ps/workplace/shuang.wang/MoonHunter/apps/nn/../../project/optimizer/optimizer_pc.py",...