DiffEqBase.jl

Update of residuals in Anderson acceleration

devmotion opened this issue 6 years ago • 2 comments

I'm currently copying the implementation of Anderson acceleration (unfortunately, we really should only have one implementation...), but I'm confused about how the norm of the residuals is updated.

https://github.com/JuliaDiffEq/DiffEqBase.jl/blob/ed5bec25f95b1ef349307a6a2ee3468c485d283c/src/nlsolve/functional.jl#L159 seems incorrect to me: `z .- z₊ .+ dz` only gives the updated `dz`. Based on this new `dz`, we should actually recompute https://github.com/JuliaDiffEq/DiffEqBase.jl/blob/ed5bec25f95b1ef349307a6a2ee3468c485d283c/src/nlsolve/functional.jl#L75-L76. (By the way, it's also inconsistent that we use `tsteps` for the updated residual but `t` for the initial one.)
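
For context, here is a minimal standalone sketch of the weighted residual computation being discussed; `weighted_residuals` and `rmsnorm` below are simplified stand-ins for DiffEqBase's `calculate_residuals` and `internalnorm`, not the actual implementations:

```julia
# Simplified stand-in for calculate_residuals: scale dz elementwise by the
# error weights abstol + reltol * max(|uprev|, |u|).
weighted_residuals(dz, uprev, u, abstol, reltol) =
    dz ./ (abstol .+ reltol .* max.(abs.(uprev), abs.(u)))

# Simplified stand-in for internalnorm: a weighted RMS norm.
rmsnorm(atmp) = sqrt(sum(abs2, atmp) / length(atmp))

uprev = [1.0, 2.0]     # previous step
u     = [1.1, 2.1]     # current iterate
dz    = [1e-4, -2e-4]  # residual of the fixed-point map
ndz   = rmsnorm(weighted_residuals(dz, uprev, u, 1e-6, 1e-3))
```

Since `ndz` depends on `dz` through the weights, any change to `dz` (such as the Anderson correction) requires recomputing the residuals before taking the norm.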

So I think this line should be replaced with

```julia
dz = z .- z₊ .+ dz
atmp = calculate_residuals(dz, uprev, u, integrator.opts.abstol, integrator.opts.reltol, integrator.opts.internalnorm, t)
ndz = integrator.opts.internalnorm(atmp, t)
```

and the in-place method should, of course, be changed in the same way (see the sketch below).
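
For reference, a hedged sketch of what that in-place change might look like; whether `calculate_residuals!` has exactly this signature here and whether the surrounding variable names match the actual cache fields is an assumption on my part:

```julia
# In-place variant (sketch): fold the Anderson correction into dz in place,
# then recompute the weighted residuals and their norm from the new dz.
@. dz = z - z₊ + dz
calculate_residuals!(atmp, dz, uprev, u, integrator.opts.abstol,
                     integrator.opts.reltol, integrator.opts.internalnorm, t)
ndz = integrator.opts.internalnorm(atmp, t)
```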

devmotion • Aug 02 '19 21:08

@kanav99 I agree. Also, this would get rid of allocations.

ChrisRackauckas • Aug 03 '19 11:08

I'm not sure anymore whether it's a good idea to update the residuals at all. I guess I assumed it would be needed for the convergence/divergence checks, but if the main purpose is to compare the (weighted) norms of `f(x_k) - x_k` and `f(x_{k+1}) - x_{k+1}`, it should not be necessary?
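
For illustration, a minimal fixed-point loop (hypothetical; not the DiffEqBase code) in which only successive residual norms are compared, so the weighting of the accelerated residual never enters the divergence check:

```julia
# Minimal fixed-point iteration: convergence and divergence are judged purely
# by comparing the norm of g(z) - z across iterations.
function fixedpoint(g, z; maxiters = 20, tol = 1e-8)
    ndzprev = Inf
    for _ in 1:maxiters
        z₊ = g(z)
        dz = z₊ .- z
        ndz = sqrt(sum(abs2, dz))
        ndz ≤ tol && return z₊                  # converged
        ndz > 2 * ndzprev && error("diverged")  # ratio-based divergence check
        z, ndzprev = z₊, ndz
    end
    return z
end

fixedpoint(z -> cos.(z), [1.0, 0.5])  # iterates toward the fixed point of cos
```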

devmotion • Aug 12 '19 16:08