More least-squares variants
Motivation
Now that the implementation of the constrained least-squares is done (modulo the specs), I've realized we could provide two additional least-squares variants based solely on existing LAPACK functions, namely:
Weighted least-squares
The weighted least-squares cost reads $\vert\vert D (Ax - b) \vert\vert^2$, where $D$ is a diagonal matrix with positive entries. We could easily hijack the existing `lstsq` solvers for this purpose. The function interface could look like
```
x = weighted_lstsq(w, A, b [, cond, overwrite_a, rank, err])
```
and internally redefine the A matrix and b vector as `matmul(D, A)` and `matmul(D, b)`, where `D = diag(sqrt(w))`. A similar interface can be crafted for the subroutine version.
Generalized least-squares
The generalized least-squares cost reads $\vert\vert W^{-\frac12} (Ax - b) \vert\vert^2$, where $W \in \mathbb{R}^{m \times m}$ is a symmetric positive-definite matrix. We could obviously hijack the `lstsq` solver once more, but LAPACK actually has a specialized solver for this problem: `*GGGLM`. Indeed, the problem can be rewritten as
$$
\begin{aligned}
\mathrm{minimize} & \quad \vert\vert y \vert\vert^2 \\
\mathrm{subject~to} & \quad b = Ax + W^{\frac12} y
\end{aligned}
$$

Indeed, the constraint forces $y = W^{-\frac12}(b - Ax)$, whose squared norm is exactly the generalized least-squares cost.
The function interface could read something like
```
x = generalized_lstsq(W, A, b [, prefactored_w, overwrite_a, err])
```
where `prefactored_w` is a Boolean flag indicating whether the array `W` contains the full $W$ matrix or its Cholesky factor computed ahead of time.
Prior Art
As far as I know, neither numpy/scipy nor Julia (base) exposes such routines. They are definitely available in more specialized packages, though.
Ping: @perazz, @jvdp1, @jalvesz. Also pinging @Beliavsky as I have a hunch he might be interested in such models.
Additional Information
No response