IterativeSolvers.jl
Missing Preconditioners
Some preconditioner choices seem missing. For example, cg only allows left preconditioners, and chebyshev only allows right preconditioners.
Hm, with regard to CG: it's a bit subtle and the current implementation is probably correct. For CG you only have a left preconditioner, so that you solve MAx = Mb (here M plays the role of an approximate A^-1). You might think that MA is no longer symmetric and positive definite and that you need two-sided preconditioning to restore that. But there is a neat trick going on.
MA is symmetric positive definite with respect to a different inner product [x, y] := (x, M^-1 y), where (x, y) is the standard inner product. Without a preconditioner, CG minimizes the error in the A-norm; with a preconditioner, it minimizes the error in the (MA)-norm induced by this new inner product [x, y]. The surprising result is that this is equivalent to still minimizing the error in the A-norm with the standard inner product; the only difference is that the Krylov subspace is generated by MA and Mr_0.
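Here is a small numerical sanity check of that statement (just an illustration with dense random SPD matrices, not code from the package):

using LinearAlgebra

n = 50
B = randn(n, n); A = Symmetric(B' * B + n * I)      # SPD system matrix
C = randn(n, n); Minv = Symmetric(C' * C + n * I)   # plays the role of M^-1 (also SPD)
M = inv(Minv)

ip(x, y) = dot(x, Minv * y)     # the modified inner product [x, y] := (x, M^-1 y)

MA = M * A
x, y = randn(n), randn(n)

ip(MA * x, y) ≈ ip(x, MA * y)   # true: MA is self-adjoint with respect to [., .]
ip(x, MA * x) > 0               # true: and positive definite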
As long as your preconditioner M is positive definite, you can apply it as a left preconditioner. Even if you have a decomposition like an incomplete Cholesky M^-1 = L * L' ≈ A, you apply the preconditioner simply as M * A * z = L' \ (L \ (A * z)).
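For concreteness, a rough Julia sketch of that application step, using a full Cholesky factor as a stand-in for an incomplete one (the helper name apply_prec is made up for illustration):

using LinearAlgebra

n = 100
B = randn(n, n); A = Symmetric(B' * B + n * I)
F = cholesky(A)                  # stand-in: F.L * F.L' ≈ A, i.e. M^-1 = L * L'

apply_prec(L, A, z) = L' \ (L \ (A * z))   # z -> M * A * z via two triangular solves

z = randn(n)
apply_prec(F.L, A, z) ≈ z        # true here because the factorization is exact;
                                 # an incomplete factor would only give an approximation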
If you look at Matlab's pcg you see the option for M1 and M2, but in fact that method uses them both as left preconditioners as well.
I still think there should be the option, since in many cases I would expect people to be passing preconditioner options through, one for left and one for right, perhaps with some automatic choice of method, and that now fails if the chosen method happens to be cg.
And if it's both, does it just use M1*M2 (or other way around) as a left preconditioner internally?
Exactly. It basically does this each iteration:
if existM1
    y = M1 \ r;          % apply the first left preconditioner
else                     % no preconditioner
    y = r;
end
if existM2
    z = M2 \ y;          % apply the second left preconditioner
else                     % no preconditioner
    z = y;
end
So that's multiplication with (M1 * M2)^-1. Isn't that just a poor design choice of Matlab?
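A one-line check of that composition, if it helps (plain LinearAlgebra, nothing package-specific):

using LinearAlgebra

n = 10
M1 = randn(n, n) + n * I    # any two invertible matrices will do
M2 = randn(n, n) + n * I
r = randn(n)

M2 \ (M1 \ r) ≈ (M1 * M2) \ r   # true: two successive left solves amount to (M1 * M2)^-1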
I agree it would be consistent to have both "left" and "right", but the downside is that it would suggest there is a genuine right preconditioner, while it would really come down to left-preconditioning twice.
Another thing is that Julia's factorization types F = factorize(A) overload \, so if at some point we have an incomplete factorization type, you would simply pass that as a single preconditioner (see the Julia sketch further below). Compare that to Matlab, where you would do
L = ichol(A);
pcg(A, b, tol, maxit, L, L')
Not sure how Matlab works internally, but I suspect that would require twice the memory and would actually compute the transpose (which is totally unnecessary).
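If I am reading the current interface right, the Julia side of that comparison would look roughly like this (assuming cg's Pl keyword accepts any object that overloads \ / ldiv!, and using a full Cholesky as a placeholder for a future incomplete factorization type):

using LinearAlgebra, IterativeSolvers

n = 200
B = randn(n, n); A = Symmetric(B' * B + n * I)
b = randn(n)

P = cholesky(A)        # a single factorization object; no explicit transpose needed
x = cg(A, b; Pl = P)   # the factorization object itself acts as the preconditioner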
Lastly, if consistency is required to make it easier to switch from one method to another, it could indeed be useful to add the "right" preconditioner. On the other hand there are parameters that are unique to an iterative method, so switching methods always requires some care.
Speaking of missing preconditioners: IDR(s) has documentation about Pl and Pr but I cannot find where they are used.
Same with minres.