cooper
A general-purpose, deep learning-first library for constrained optimization in PyTorch
The documentation of this library is very limited, and it is hard to work out how it should be applied to different applications. For instance, many RL algorithms such as...
## Enhancement

A tutorial example for the generation of time series data subject to constraints on the values. An open question to me is whether the inequalities below should result...
## Enhancement

PyTorch optimizers include a `maximize` flag ([PyTorch issue](https://github.com/pytorch/pytorch/issues/68052)). When set to `True`, the sign of the gradients is flipped inside `optimizer.step()` before computing parameter updates. This enables gradient ascent...
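A minimal sketch of the flag's behavior in plain PyTorch (this is the stock `torch.optim` API, not Cooper's): with `maximize=True`, `step()` moves parameters *up* the gradient, which is what the dual variables of a Lagrangian need.

```python
import torch

# Concave objective -(x - 2)^2, maximized at x = 2.
x = torch.tensor([0.0], requires_grad=True)
opt = torch.optim.SGD([x], lr=0.1, maximize=True)

loss = -(x - 2.0) ** 2
opt.zero_grad()
loss.sum().backward()
opt.step()

# Gradient at x = 0 is -2*(0 - 2) = 4; maximize=True adds lr * grad,
# so x moves from 0.0 to roughly 0.4, i.e. toward the maximizer.
print(x.item())
```

Without the flag, Cooper-style code has to negate the dual loss (or the gradients) by hand before calling `step()` on the dual optimizer; the flag makes that sign flip explicit and local to the optimizer.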
## Enhancement

Checkpointing capabilities were enabled in #27, but the documentation and a minimal example have not been written.
## Enhancement

Enable the user to provide several partially instantiated "dual optimizers". These could be handled as a single optimizer by applying all operations currently applied to `constrained_optimizer.dual_optimizer` simultaneously across...
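One way this could look is a thin wrapper that broadcasts each call across a list of optimizers. The class and names below are purely illustrative (not Cooper's API), just to sketch the "handled as a single optimizer" idea:

```python
import torch

class MultiOptimizer:
    """Hypothetical sketch: wraps several optimizers and broadcasts
    zero_grad()/step() across all of them, so calling code can treat
    them as one dual optimizer."""

    def __init__(self, optimizers):
        self.optimizers = list(optimizers)

    def zero_grad(self):
        for opt in self.optimizers:
            opt.zero_grad()

    def step(self):
        for opt in self.optimizers:
            opt.step()

# Usage: two dual-variable groups, each with its own optimizer class.
p1 = torch.tensor([1.0], requires_grad=True)
p2 = torch.tensor([1.0], requires_grad=True)
dual_optimizer = MultiOptimizer([
    torch.optim.SGD([p1], lr=0.1),
    torch.optim.Adam([p2], lr=0.1),
])

(p1.sum() + p2.sum()).backward()
dual_optimizer.step()       # one call updates both groups
dual_optimizer.zero_grad()  # one call clears both groups
```

The "partially instantiated" part would fit Cooper's existing pattern of passing optimizer classes plus keyword arguments and binding the parameters later; the wrapper above only shows the broadcast half.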
## Enhancement Enable "schedulers" for the constraints. Consider a base CMP with objective function $f$ and inequality constraints $g \le \epsilon$. The scheduler could allow the construction of a "moving...
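A small self-contained sketch of what a "moving" constraint level could look like (all names here are hypothetical, not Cooper's API): the scheduler starts from a loose $\epsilon$ and anneals it linearly toward the target, tightening $g \le \epsilon$ as training progresses.

```python
class LinearEpsilonScheduler:
    """Hypothetical sketch of a constraint scheduler: linearly anneals
    the constraint level eps from eps_start to eps_target over
    num_steps calls to step()."""

    def __init__(self, eps_start, eps_target, num_steps):
        self.eps_start = eps_start
        self.eps_target = eps_target
        self.num_steps = num_steps
        self.step_count = 0

    def step(self):
        # Clamp so eps stays at eps_target once the schedule is done.
        self.step_count = min(self.step_count + 1, self.num_steps)

    @property
    def eps(self):
        t = self.step_count / self.num_steps
        return (1 - t) * self.eps_start + t * self.eps_target

# Usage: start with a loose level of 1.0, tighten toward 0.1.
sched = LinearEpsilonScheduler(eps_start=1.0, eps_target=0.1, num_steps=10)
for _ in range(5):
    sched.step()
print(sched.eps)  # halfway through the schedule: 0.55
```

Inside the training loop, the current `sched.eps` would be the level used when evaluating the inequality defect $g(x) - \epsilon$ at each step, mirroring how PyTorch learning-rate schedulers are stepped alongside their optimizer.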
## Enhancement

Finalize the code coverage addition.

## Motivation

The code coverage action and badge are part of the Cooper codebase, but they are not fully functional yet.
## Changes

Hi 👋, I've implemented the **Damped Lagrangian Formulation** to enhance the stability of the optimization process by addressing oscillatory behavior when constraints are suddenly satisfied or violated. This...