HayeonLee

Results: 4 comments by HayeonLee

Hi. This error occurs because `retain_grad()` is used together with `with torch.no_grad()`. Yet we need the grad value only in the `calculate_head_importance` function (`grad_ctx = ctx.grad`). Except in `calculate_head_importance`, by deactivating...
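A minimal sketch of the conflict described above (the `calculate_head_importance` context is simplified away; only the `retain_grad()` / `no_grad()` interaction from the comment is shown):

```python
import torch

x = torch.ones(3, requires_grad=True)

# Inside torch.no_grad(), autograd records nothing, so intermediate
# results have requires_grad=False and retain_grad() raises a RuntimeError.
with torch.no_grad():
    y = x * 2
    try:
        y.retain_grad()
    except RuntimeError as e:
        print("retain_grad failed:", e)

# Outside no_grad(), retain_grad() works and y.grad is populated after
# backward(), which is the value the importance computation reads.
y = x * 2
y.retain_grad()
y.sum().backward()
print(y.grad)  # the grad of an intermediate tensor is retained
```

So the fix is to keep gradient tracking enabled only around the code that actually needs `ctx.grad`, and wrap everything else in `torch.no_grad()`.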

@xun6000 It can be fixed by removing `Normalize` from `transforms.Compose()`, like this: `dataset = Omniglot(task, split=split, transform=transforms.Compose([Rotate(rotation), transforms.ToTensor()]))`. It does not affect the performance (accuracy). I guess the reason is that Omniglot is...

Hi, when I tried to run the generation example, a similar error occurred, as shown below. Could you check this error? @wengong-jin Code: `python preprocess.py --train ../data/polymers/train.txt --vocab ../data/polymers/inter_vocab.txt --ncpu 8...

Hi, thank you for your interest in our paper. Could you please let me know at which step the error occurs? I will update the files. The below files will...