torch_pso
Check for a key in each parameter group to ignore when adjusting parameters
A frequent use case when training a model is keeping some parameters fixed while updating others. Currently, the only way to do that with this optimizer is to pass solely the trainable parameters to the optimizer at initialization. Many other optimizers include checks that let you dynamically toggle whether a parameter group is trainable.
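For reference, a common pattern in PyTorch optimizers is to store a flag in each parameter group's dict and skip that group inside `step()` when the flag is set. Below is a minimal sketch of that pattern, assuming a hypothetical `frozen` key; the plain SGD-style update stands in for the actual PSO update, and the key name and class are illustrative, not part of torch_pso's API.

```python
import torch
from torch.optim import Optimizer


class FreezableOptimizer(Optimizer):
    """Sketch of a per-group 'frozen' flag (key name is an assumption)."""

    def __init__(self, params, lr=0.01):
        # 'frozen' defaults to False so every group carries the key.
        defaults = dict(lr=lr, frozen=False)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            # Skip groups the user has flagged as frozen; the flag can be
            # toggled between steps without rebuilding the optimizer.
            if group.get('frozen', False):
                continue
            for p in group['params']:
                if p.grad is None:
                    continue
                # Stand-in SGD update; a PSO optimizer would update particle
                # positions/velocities here instead.
                p.add_(p.grad, alpha=-group['lr'])
        return loss
```

With this pattern, freezing a group mid-training is a one-liner, e.g. `optimizer.param_groups[0]['frozen'] = True`, and unfreezing it later requires no re-initialization.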
- [ ] Research what other optimizers do to implement this functionality
- [ ] Implement this functionality