Juan Ramirez
## Enhancement

When a dual restart is triggered, the dual variables are reset to their initial value of 0. However, the state of the primal and dual optimizers remains unchanged....
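A minimal sketch of why the stale optimizer state matters, in plain Python (the `MomentumSGD` class and its names are illustrative, not Cooper's API): after a restart that only zeroes the dual variable, a leftover momentum buffer keeps pushing the multiplier away from 0 even when the constraint is satisfied.

```python
# Hypothetical sketch: SGD-with-momentum ascent on a single dual variable,
# showing that a dual restart which only resets the variable leaves a stale
# momentum buffer in the optimizer state.

class MomentumSGD:
    def __init__(self, lr=0.1, beta=0.9):
        self.lr, self.beta = lr, beta
        self.buf = 0.0  # momentum buffer (optimizer state)

    def step(self, param, grad):
        self.buf = self.beta * self.buf + grad
        return param + self.lr * self.buf  # gradient *ascent* on the dual variable


opt = MomentumSGD()
lam = 0.0
for _ in range(5):                    # constraint violated: positive "gradient"
    lam = opt.step(lam, grad=1.0)

lam = 0.0                             # dual restart: variable reset only
lam_after = opt.step(lam, grad=0.0)   # constraint now satisfied (zero gradient)
print(lam_after > 0.0)                # stale momentum still moves lambda off 0

opt.buf = 0.0                         # also resetting the optimizer state fixes this
print(opt.step(0.0, grad=0.0) == 0.0)
```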
## Enhancement

Implement the _extrapolation from the past_ algorithm (Popov, 1980). A good, modern source is Gidel et al. (2019). This is an algorithm for computing parameter updates similar...
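A plain-Python sketch of the update rule, under the usual formulation: the lookahead (extrapolation) step reuses the gradient from the *previous* lookahead point, so each iteration needs only one fresh gradient evaluation, unlike full extragradient which needs two. Shown on the bilinear saddle-point problem min_x max_y xy, where plain simultaneous gradient descent-ascent diverges.

```python
# Sketch of "extrapolation from the past" (Popov, 1980) on f(x, y) = x * y.

def grad(x, y):
    # Gradient of f(x, y) = x * y: df/dx = y, df/dy = x.
    return y, x

def extrapolation_from_the_past(x, y, lr=0.2, steps=500):
    gx_prev, gy_prev = grad(x, y)  # initialize the stored "past" gradient
    for _ in range(steps):
        # Lookahead step using the stored gradient from the previous lookahead.
        x_half, y_half = x - lr * gx_prev, y + lr * gy_prev
        # Single fresh gradient evaluation per iteration, at the lookahead point.
        gx_prev, gy_prev = grad(x_half, y_half)
        # Update step from the original iterate (descent on x, ascent on y).
        x, y = x - lr * gx_prev, y + lr * gy_prev
    return x, y

x, y = extrapolation_from_the_past(1.0, 1.0)
print(round(x, 4), round(y, 4))  # converges toward the saddle point (0, 0)
```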
## Enhancement Implement a "Multiplier Model" in Cooper. This idea comes from the [paper](https://proceedings.neurips.cc/paper/2020/hash/62db9e3397c76207a687c360e0243317-Abstract.html) by Narasimhan et al. Instead of tracking the Lagrange Multipliers of a constrained optimization problem explicitly,...
## Enhancement

PyTorch optimizers include a `maximize` flag ([PyTorch issue](https://github.com/pytorch/pytorch/issues/68052)). When set to `True`, the sign of the gradients is flipped inside `optimizer.step()` before computing parameter updates. This enables gradient ascent...
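A plain-Python sketch of what such a flag does (the `SGD` class here is illustrative, not PyTorch's implementation): flipping the gradient sign inside `step()` turns the same descent machinery into ascent, which is exactly what the dual update needs.

```python
# Illustrative sketch of a `maximize` flag: negate the gradient inside step()
# so the unchanged descent update performs gradient ascent.

class SGD:
    def __init__(self, lr=0.1, maximize=False):
        self.lr, self.maximize = lr, maximize

    def step(self, param, grad):
        if self.maximize:
            grad = -grad          # sign flip: descent update now ascends
        return param - self.lr * grad


# Ascent on f(x) = -(x - 3)^2, whose gradient is -2 * (x - 3); maximizer at x = 3.
opt = SGD(lr=0.1, maximize=True)
x = 0.0
for _ in range(100):
    x = opt.step(x, grad=-2 * (x - 3))
print(round(x, 3))  # → 3.0
```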
## Enhancement

Enable the user to provide several partially instantiated "dual optimizers". These could be handled as a single optimizer by applying all operations currently applied to `constrained_optimizer.dual_optimizer` simultaneously across...
## Enhancement

Finalize the code coverage addition.

## Motivation

The code coverage action and badge are part of the Cooper codebase, but they are not fully functional yet.