aikido
Consider eliminating support for zero measure TSRs
The TSR class currently supports TSRs where some dimensions have equal upper and lower bounds in Bw. This is problematic because floating-point error alone can dictate whether the constraint is satisfiable. It also creates the edge case of a point TSR, which has fundamentally different properties from all other TSRs (i.e. all samples drawn from it are identical).
I am inclined to require the strict inequality Bw[:, 0] < Bw[:, 1] and to add a new "identity constraint" class to represent a singleton set (e.g. a point TSR).
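The proposed validation could look something like this (a sketch with hypothetical names, not the aikido API; the error messages and the `check_bw` helper are assumptions):

```python
import numpy as np

def check_bw(Bw):
    """Validate an n x 2 bound matrix, enforcing Bw[:, 0] < Bw[:, 1] strictly."""
    Bw = np.asarray(Bw, dtype=float)
    if Bw.ndim != 2 or Bw.shape[1] != 2:
        raise ValueError('Bw must be an n x 2 array of [lower, upper] bounds.')
    if not np.all(Bw[:, 0] < Bw[:, 1]):
        raise ValueError(
            'Each dimension must satisfy Bw[:, 0] < Bw[:, 1]; '
            'use an identity constraint to represent a singleton set.')
    return Bw

check_bw([[-0.1, 0.1], [-1.0, 1.0]])  # passes
# check_bw([[0.0, 0.0]])              # would raise ValueError
```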
@siddhss5 @psigen @gilwoolee @dqyi11 Thoughts? Objections?
I am inclined not to support this change for two reasons:
- Doesn't the same numerical-tolerance problem occur when operating near one of the limits? If not, why not? It seems like any approach (e.g. projection, a tolerance spec, etc.) that prevents issues near a limit will also work with equal bounds.
- Many of our constraints use only a subset of the Bw dimensions (in fact, I'm not sure we have ever needed all six). This means that any TSR, regardless of usage, would have to inaccurately inflate its unused bounds by some tolerance. For generative processes such as sampling or projection, where numerical stability is not an issue, this has the opposite effect: it worsens the stability of everything downstream by adding epsilon-scale variability.
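The second point can be illustrated with a small sketch (hypothetical example, not aikido code; the tolerance `eps` and the inflation scheme are assumptions): inflating a point bound to a tiny interval turns a single repeatable value into epsilon-scale noise for every downstream consumer.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 1e-6  # hypothetical tolerance used to inflate a point bound

# Intended point constraint y = 0, inflated to [-eps, eps] to satisfy a
# strict-inequality requirement on the bounds.
samples = rng.uniform(-eps, eps, size=1000)

# Samples are no longer exactly 0: the inflation injects variability that
# was not present in the original point constraint.
print(np.all(samples == 0.0))          # False
print(np.all(np.abs(samples) <= eps))  # True
```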