cooper
Constraint schedulers
Enhancement
Enable "schedulers" for the constraints.
Consider a base CMP with objective function $f$ and inequality constraints $g \le \epsilon$. The scheduler could allow the construction of a "moving target" where the constraint is gradually strengthened. One could start with a sequence of optimization problems: $$ \min_x f(x) \quad \text{s.t.} \quad g(x) \le \epsilon + \psi_t$$ such that the "slack" $\psi_t \rightarrow 0$.
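The schedule above could be sketched as follows. This is a minimal, hypothetical example (the class name `LinearSlackScheduler` and its interface are not part of cooper): it linearly anneals the slack $\psi_t$ from an initial value down to 0, recovering the base CMP's constraint level $g(x) \le \epsilon$ at the end of the schedule.

```python
# Hypothetical sketch: a "slack scheduler" producing the sequence psi_t -> 0,
# so the effective constraint at step t is g(x) <= epsilon + psi_t.

class LinearSlackScheduler:
    """Linearly anneals the slack psi_t from psi_0 down to 0 over total_steps."""

    def __init__(self, psi_0: float, total_steps: int):
        self.psi_0 = psi_0
        self.total_steps = total_steps
        self.step_count = 0

    def current_slack(self) -> float:
        # Remaining fraction of the schedule, clipped at 0 once it is exhausted.
        remaining = max(0.0, 1.0 - self.step_count / self.total_steps)
        return self.psi_0 * remaining

    def step(self) -> None:
        self.step_count += 1


scheduler = LinearSlackScheduler(psi_0=1.0, total_steps=4)
slacks = []
for _ in range(5):
    slacks.append(scheduler.current_slack())
    scheduler.step()
print(slacks)  # [1.0, 0.75, 0.5, 0.25, 0.0]
```

Other decay shapes (exponential, cosine, step-wise) would follow the same pattern; only `current_slack` changes.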
Motivation
Theoretically, this sequence of optimization problems should yield equivalent solutions to that of the base CMP. However, specific (implementations of) algorithms can benefit from relaxing the optimization problem, especially towards the beginning of the optimization.
In the end, we might care about achieving an (approximately) feasible solution of the base CMP that has a good value of the objective function. Thus, there is no need to be overly strict at the beginning of training by enforcing the "final" constraint level $\epsilon$.
"Curriculum learning" (Bengio et al., 2009) successfully the idea of gradually adjusting the problem difficulty for supervised learning tasks in ML.
Implementation proposal
PyTorch's learning rate schedulers are usually tied to a particular optimizer. For this reason they might not be directly portable to constraint schedulers, but parts of their scheduler framework and implementations could be re-used.
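One way this adaptation could look, as a hedged sketch (the classes `RelaxedConstraint` and `ExponentialSlackScheduler` are hypothetical, not existing cooper or PyTorch APIs): instead of wrapping an optimizer as `torch.optim.lr_scheduler` schedulers do, the scheduler mutates the slack of a constraint object, exposing the same `step()` interface.

```python
# Hypothetical sketch mirroring the PyTorch scheduler pattern, but acting on a
# constraint's slack rather than on an optimizer's learning rate.

class RelaxedConstraint:
    """Holds the target level epsilon and the current slack psi_t."""

    def __init__(self, epsilon: float, initial_slack: float):
        self.epsilon = epsilon
        self.slack = initial_slack

    @property
    def level(self) -> float:
        # Effective constraint level: g(x) <= epsilon + psi_t.
        return self.epsilon + self.slack


class ExponentialSlackScheduler:
    """Multiplies the slack by gamma each step, analogous in spirit to
    torch.optim.lr_scheduler.ExponentialLR's step() interface."""

    def __init__(self, constraint: RelaxedConstraint, gamma: float):
        self.constraint = constraint
        self.gamma = gamma

    def step(self) -> None:
        self.constraint.slack *= self.gamma


constraint = RelaxedConstraint(epsilon=0.1, initial_slack=0.8)
scheduler = ExponentialSlackScheduler(constraint, gamma=0.5)
for _ in range(3):
    scheduler.step()  # slack: 0.8 -> 0.4 -> 0.2 -> 0.1
print(round(constraint.level, 3))  # 0.2
```

The training loop would call `scheduler.step()` once per epoch (or per iteration), just as with learning rate schedulers, and evaluate constraint violations against `constraint.level` instead of the fixed $\epsilon$.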
References
- Bengio, Y., Louradour, J., Collobert, R. and Weston, J. (2009). Curriculum Learning. [link to pdf]