
Check for a key in each parameter group to ignore when adjusting parameters

Open qthequartermasterman opened this issue 2 years ago • 1 comments

A frequent use case when training models is keeping some parameters fixed while updating others. Currently, the only way to do that with this optimizer is to pass solely the trainable parameters to the optimizer at initialization. Many other optimizers include checks that let you dynamically change whether a parameter group is trainable or not.

  • [ ] Research what other optimizers do to implement this functionality
  • [ ] Implement this functionality
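One possible shape for this feature, sketched below, is to check a per-group flag during the optimizer step. The key name (`frozen` here) and the dict-based parameter groups are purely illustrative assumptions, mirroring the `torch.optim` convention of a `"params"` list plus per-group options; they are not part of torch_pso's actual API.

```python
# Hypothetical sketch: skip parameter groups flagged as frozen.
# Parameter groups are modeled as plain dicts, following the
# torch.optim convention of a "params" list plus per-group options.

def step(param_groups, update_fn):
    """Apply update_fn to every parameter unless its group is frozen."""
    for group in param_groups:
        # Proposed check: a key (called "frozen" here, name is an
        # assumption) telling the optimizer to leave this group alone.
        if group.get("frozen", False):
            continue
        group["params"] = [update_fn(p) for p in group["params"]]
    return param_groups


groups = [
    {"params": [1.0, 2.0], "frozen": True},  # left untouched
    {"params": [3.0, 4.0]},                  # updated
]
step(groups, lambda p: p + 0.5)
print(groups[0]["params"], groups[1]["params"])
```

Since the flag lives on the group dict, callers could toggle it between steps without rebuilding the optimizer, which is the dynamic behavior the issue asks for.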

qthequartermasterman · Aug 06 '22 18:08