Cyrillic support

Open K900 opened this issue 2 years ago • 11 comments

Title says it all, really.

K900 avatar Nov 09 '23 22:11 K900

@fourson .eval() won't help; it only changes the mode (train/eval) of the module. To freeze it you need to call self.loss.requires_grad_(False).
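A minimal sketch of the difference (the class and the "loss" attribute here are illustrative stand-ins, not code from this thread):

```python
import torch.nn as nn

# Illustrative module; "loss" stands in for any loss submodule that has parameters.
class LitModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(10, 10)
        self.loss = nn.Linear(10, 1)

model = LitModel()

# .eval() only switches train/eval behaviour (Dropout, BatchNorm, ...);
# the parameters still require gradients and would still be updated.
model.loss.eval()
print(all(p.requires_grad for p in model.loss.parameters()))  # True

# Freezing means disabling gradient tracking on the parameters themselves.
model.loss.requires_grad_(False)
print(any(p.requires_grad for p in model.loss.parameters()))  # False
```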

It is not possible to remove a parameter from a PyTorch nn.Module directly, so Lightning can't provide that functionality either. But there is a trick if you really need it: if you put your loss inside a plain Python list, PyTorch won't register it as a submodule: self.loss = [loss] (then access it via self.loss[0] in your code). The drawback is that self.loss[0] won't be moved to the GPU automatically.
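A short sketch of that trick, with made-up class and attribute names for illustration:

```python
import torch.nn as nn

class LitModel(nn.Module):  # stands in for your LightningModule
    def __init__(self, loss_module: nn.Module):
        super().__init__()
        self.encoder = nn.Linear(10, 10)
        # Hiding the loss inside a plain Python list prevents PyTorch from
        # registering it as a submodule, so it won't appear in parameters(),
        # state_dict(), or the model summary.
        self.loss = [loss_module]

    def compute_loss(self, x, y):
        return self.loss[0](self.encoder(x), y)

model = LitModel(nn.MSELoss())
print(list(model.state_dict()))  # only encoder.weight / encoder.bias

# Drawback mentioned above: model.to(device) won't touch self.loss[0];
# if the loss holds parameters or buffers, move it yourself, e.g.
# model.loss[0] = model.loss[0].to("cuda")
```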

awaelchli avatar Feb 02 '24 11:02 awaelchli

@fourson What do you think about my comment above? By the way, I also opened #19468, which adds a "Mode" column to the model summary showing which layers are in training mode. Your frozen layers would show up there as "eval". Maybe that's helpful in your case.

For the reasons outlined in my previous comment, I don't think we can provide a mechanism to "exclude" parameters and modules. I believe any hacks we could do would go against PyTorch's design.
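As a rough illustration, a frozen submodule could be inspected via Lightning's ModelSummary utility, assuming a Lightning version that includes the "Mode" column from #19468 (the model and layer names below are invented):

```python
import torch.nn as nn
import lightning as L
from lightning.pytorch.utilities.model_summary import ModelSummary

class LitModel(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(10, 10)
        self.loss = nn.Linear(10, 1)
        # Freeze the loss and switch it to eval mode; with #19468 the
        # summary's "Mode" column should report this layer as "eval".
        self.loss.requires_grad_(False)
        self.loss.eval()

model = LitModel()
print(ModelSummary(model, max_depth=1))
```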

awaelchli avatar Feb 14 '24 03:02 awaelchli

Thanks for the comment. The "eval" mode shown in the summary would be very helpful for my use case.

fourson avatar Feb 15 '24 08:02 fourson