Pre-computing the design matrix for repeated approximation in a loop
Hi @stevengj, first of all, thanks for the nice package. I am exploring it for my application, where I need to fit arrays of data points on a pre-specified grid. The status quo of my application uses linear regression, and the key to its performance is that I can easily pre-compute the regression operator/design matrix as a linear operator applied to each new data vector, e.g.
```julia
linear_grid = [SVector(E, C, G) for G in Float64.(G_axis) for C in Float64.(C_axis) for E in Float64.(E_axis)];
exps = collect(ExponentsIterator{Graded{LexOrder}}(ntuple(zero, d), maxdegree = P))
X = compute_X(linear_grid, exps)
beta_operator = inv(X' * X) * X'
```
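For context, a self-contained toy version of this reuse pattern (with a made-up one-dimensional design matrix standing in for the real `compute_X(linear_grid, exps)`) looks like:

```julia
using LinearAlgebra

# Toy stand-in for the real design matrix X = compute_X(linear_grid, exps):
x = range(0, 1; length = 20)        # fixed grid of sample points
X = [xi^j for xi in x, j in 0:3]    # degree-3 monomial basis on that grid

beta_operator = inv(X' * X) * X'    # pre-computed once for the fixed grid

# Each new data vector on the same grid is then just a matrix-vector product:
vals = sin.(x)
beta = beta_operator * vals         # regression coefficients for this data
```

Here `x`, `X`, and the monomial basis are illustrative only; the point is that `beta_operator` depends only on the grid, so fitting new data avoids rebuilding or refactorizing anything.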
I have tried your package:
```julia
## smooth through Chebyshev polynomials ##
lb = SVector(minimum(E_axis), minimum(C_axis), minimum(G_axis));
ub = SVector(maximum(E_axis), maximum(C_axis), maximum(G_axis));
ord = (6, 6, 6)
cheb = chebregression(linear_grid, vals, lb, ub, ord)
```
but the regression step is a bit slower, since it seems to re-compute the design matrix on every call (?). Is there a way to save time on this step by pre-computing something of the form `beta_operator = inv(X' * X) * X'`?
You could call `chebvandermonde(...)` to construct the Chebyshev Vandermonde matrix `A`. Then you could either precompute the QR factorization `QR = qr(A)` (which is what `A \ Y` uses internally, so you can then do `QR \ Y` and it is fast), or the `inv(X'X)X'` operator that you learned in elementary linear algebra (but which roughly doubles the number of digits you lose to roundoff error). See e.g. this post.
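A minimal sketch of the recommended precompute-and-reuse pattern, using a plain polynomial Vandermonde matrix as a stand-in for the matrix `chebvandermonde` would return (its exact signature is not shown here):

```julia
using LinearAlgebra

x = range(-1, 1; length = 50)        # fixed grid of sample points
A = [xi^j for xi in x, j in 0:6]     # stand-in design matrix, degree-6 basis

F = qr(A)                            # factorize once, O(m n^2)

# Every subsequent fit on the same grid reuses the factorization, so each
# solve costs only O(m n) instead of re-forming and re-factorizing A:
y1 = cos.(3 .* x)
c1 = F \ y1                          # least-squares coefficients for y1

y2 = exp.(x)
c2 = F \ y2                          # coefficients for a second data vector
```

The QR route is also numerically preferable: solving the normal equations via `inv(A' * A) * A'` squares the condition number of the problem, while `qr(A) \ y` works with `A` directly.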
Basically, just refactor this function.