
Unit tests to ensure that calculating gradients does not affect Particle Swarm Optimization


**Is your feature request related to a problem? Please describe.**
PSO algorithms are generally gradient-free, so performing backpropagation and zeroing gradients should have no effect on PSO steps. We need unit tests to ensure this behavior.

**Describe the solution you'd like**
Such a unit test should seed torch's random number generator, clone the parameters to be trained, and run the optimizer on one copy without any gradient computation and on the other copy after calling the backward method on the loss tensor fed into the optimizer. The two runs should then produce identical parameters.
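A minimal sketch of such a test, assuming `torch_pso` exposes a `ParticleSwarmOptimizer` with the standard closure-based `step()` interface of `torch.optim`; the network, loss function, and step count are arbitrary placeholders:

```python
import copy

import torch
from torch_pso import ParticleSwarmOptimizer  # assumed import path


def run_pso(model, data, target, use_backward, steps=5):
    """Run a few PSO steps, optionally calling backward() inside the closure."""
    optimizer = ParticleSwarmOptimizer(model.parameters())
    loss_fn = torch.nn.MSELoss()

    def closure():
        optimizer.zero_grad()
        loss = loss_fn(model(data), target)
        if use_backward:
            loss.backward()  # should be a no-op for a gradient-free optimizer
        return loss

    for _ in range(steps):
        optimizer.step(closure)


def test_backward_does_not_change_pso_updates():
    data, target = torch.randn(8, 4), torch.randn(8, 1)

    torch.manual_seed(0)
    model_a = torch.nn.Linear(4, 1)
    model_b = copy.deepcopy(model_a)  # clone of the parameters to be trained

    torch.manual_seed(123)  # same seed so both runs draw identical particle randomness
    run_pso(model_a, data, target, use_backward=False)
    torch.manual_seed(123)
    run_pso(model_b, data, target, use_backward=True)

    for p_a, p_b in zip(model_a.parameters(), model_b.parameters()):
        assert torch.allclose(p_a, p_b), "Calling backward() changed the PSO updates"
```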

**Additional context**
Related to #1: Most gradient-based optimizers decide which parameters to adjust by checking whether a gradient has been computed by the time the optimizer's step function is called. That check is (I believe) directly at odds with the gradient-free behavior described here. An alternative could be to simply check whether the parameter requires grad. More research is needed; the snippet below illustrates the two checks.
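A small illustration of the two selection strategies (a sketch only, not taken from the torch_pso source; the helper names are hypothetical):

```python
import torch


def params_to_update_gradient_style(params):
    """Typical gradient-based selection: only parameters whose .grad was populated."""
    return [p for p in params if p.grad is not None]


def params_to_update_requires_grad_style(params):
    """Proposed alternative for a gradient-free optimizer: skip only frozen parameters."""
    return [p for p in params if p.requires_grad]


# Example: a frozen bias is excluded by both checks, but a parameter that never
# received backward() is excluded only by the gradient-style check.
layer = torch.nn.Linear(4, 1)
layer.bias.requires_grad_(False)
print(len(params_to_update_gradient_style(layer.parameters())))       # 0: no backward() was run
print(len(params_to_update_requires_grad_style(layer.parameters())))  # 1: only the weight
```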

Related to #17: This is the issue that prompted the need for a unit test.

qthequartermasterman · Aug 29 '22 15:08