stable-diffusion
When I freeze some layers in the UNet, I see the error below. I added code in ddpm to freeze the cross-attention layers as follows:
```python
if without_crossattn:
    for m in self.modules():
        if isinstance(m, CrossAttention):
            for para in m.parameters():
                para.requires_grad = False
```
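For reference, here is a self-contained sketch of this freezing pattern, using a hypothetical toy `CrossAttention` and `UNetLike` module in place of the repo's real classes. On its own, the pattern behaves as expected and only freezes the matched submodules:

```python
import torch.nn as nn

# Toy stand-in for the repo's CrossAttention module (hypothetical).
class CrossAttention(nn.Module):
    def __init__(self, dim=8):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)

    def forward(self, x):
        return self.to_q(x) + self.to_k(x)

# Toy stand-in for the UNet (hypothetical).
class UNetLike(nn.Module):
    def __init__(self):
        super().__init__()
        self.attn = CrossAttention()
        self.proj = nn.Linear(8, 8)

model = UNetLike()

# Same pattern as in the question: walk all submodules, freeze matches.
for m in model.modules():
    if isinstance(m, CrossAttention):
        for para in m.parameters():
            para.requires_grad = False

frozen = [n for n, p in model.named_parameters() if not p.requires_grad]
print(frozen)  # only the CrossAttention parameters are frozen
```

So the freezing itself is fine; the error must come from how the training loop differentiates through the frozen parameters.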
and then training fails with the following error:

```
One of the differentiated Tensors does not require grad
```
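A plausible cause (an assumption on my part, not confirmed from a full traceback): the repo's attention blocks use a custom gradient-checkpointing function whose backward pass calls `torch.autograd.grad` over all of the module's parameters, and `torch.autograd.grad` raises exactly this error whenever one of the tensors it is asked to differentiate has `requires_grad=False`. A minimal reproduction:

```python
import torch

w = torch.randn(3, requires_grad=False)  # stands in for a frozen parameter
x = torch.randn(3, requires_grad=True)
y = (w * x).sum()

msg = ""
try:
    # Differentiating w.r.t. a list that includes the frozen tensor,
    # as a checkpointing backward would do over all module parameters.
    torch.autograd.grad(y, [x, w])
except RuntimeError as e:
    msg = str(e)
print(msg)  # the same "does not require grad" error
```

If this is indeed the cause, disabling checkpointing for the frozen modules (or excluding frozen parameters from the `torch.autograd.grad` call) should avoid the error.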