Optim.jl

Docs for preconditioning confusing


I can't seem to figure out how to use preconditioning. This line in the docs in particular is confusing me:

 method=ConjugateGradient(P = precond(100), precondprep! = precond(100))

First, I guess that the keyword precondprep! should lose the !? Then, precond(100) returns a matrix, but as I understand it from this page, precondprep should be a function?
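For reference, here is roughly how I would expect the API to be used if the keyword is indeed precondprep without the bang (my own sketch with a toy quadratic objective, not something taken from the docs):

    using Optim, LinearAlgebra

    f(x)     = sum(abs2, x)      # toy objective
    g!(G, x) = (G .= 2 .* x)     # its gradient, in place

    n  = 100
    P0 = Diagonal(ones(n))       # initial preconditioner (identity here)

    # precondprep(P, x) should update P in place for the current iterate x;
    # a no-op keeps the fixed preconditioner P0 throughout.
    prep = (P, x) -> nothing

    res = optimize(f, g!, ones(n),
                   ConjugateGradient(P = P0, precondprep = prep))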

baggepinnen commented on Nov 08 '19

I can't find any usage of precondprep in the tests either: https://github.com/JuliaNLSolvers/Optim.jl/blob/master/test/multivariate/precon.jl. I think it might be broken; when I supply a function (P, x) -> P I get

MethodError: ldiv!(::Array{Float64,1}, ::Array{Float64,2}, ::Array{Float64,1}) is ambiguous. Candidates:
  ldiv!(Y::AbstractArray, A::AbstractArray, B::AbstractArray) in DiffEqBase at /home/fredrikb/.julia/packages/DiffEqBase/4V8I6/src/init.jl:5
  ldiv!(x, P::AbstractArray{T,2} where T, b) in Optim at /home/fredrikb/.julia/packages/Optim/EhyUl/src/multivariate/precon.jl:68
Possible fix, define
  ldiv!(::AbstractArray, ::AbstractArray{T,2} where T, ::AbstractArray)
update_state!(::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Optim.GradientDescentState{Array{Float64,1},Float64}, ::GradientDescent{LineSearches.InitialPrevious{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Array{Float64,2},typeof(grad)}) at gradient_descent.jl:67
optimize(::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Array{Float64,1}, ::GradientDescent{LineSearches.InitialPrevious{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Array{Float64,2},typeof(grad)}, ::Optim.Options{Float64,Nothing}, ::Optim.GradientDescentState{Array{Float64,1},Float64}) at optimize.jl:57
optimize at optimize.jl:33 [inlined]
#optimize#93(::Bool, ::Symbol, ::typeof(Optim.optimize), ::Function, ::Array{Float64,1}, ::GradientDescent{LineSearches.InitialPrevious{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Array{Float64,2},typeof(grad)}, ::Optim.Options{Float64,Nothing}) at interface.jl:116
(::Optim.var"#kw##optimize")(::NamedTuple{(:autodiff,),Tuple{Symbol}}, ::typeof(Optim.optimize), ::Function, ::Array{Float64,1}, ::GradientDescent{LineSearches.InitialPrevious{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}},Array{Float64,2},typeof(grad)}, ::Optim.Options{Float64,Nothing}) at none:0
top-level scope at natural_gradient_autotuning.jl:182
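For completeness, this is a stripped-down version of what I am running (the objective and the dense identity P are placeholders; the ambiguity only shows up with DiffEqBase loaded, on the package versions in the trace above):

    using Optim, LinearAlgebra
    using DiffEqBase             # loading this introduces the competing ldiv! method

    f(x) = sum(abs2, x)          # placeholder objective

    x0 = ones(10)
    P  = Matrix{Float64}(I, 10, 10)   # dense matrix preconditioner

    res = optimize(f, x0,
                   GradientDescent(P = P, precondprep = (P, x) -> P),
                   autodiff = :forward)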

baggepinnen commented on Nov 08 '19

I think you correctly identified it as https://github.com/JuliaLang/julia/issues/31278. I think the fix is clear; it's just that nobody has gotten around to it yet.
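If it helps, the disambiguating method the error message asks for would look roughly like this (a sketch only; the body mirrors what I assume Optim's dense-matrix fallback does, and the real fix belongs in the packages rather than in user code):

    using LinearAlgebra

    # More specific than both ambiguous candidates, so dispatch picks it:
    function LinearAlgebra.ldiv!(x::AbstractArray, P::AbstractMatrix, b::AbstractArray)
        copyto!(x, P \ b)   # solve P * x = b and store the result in x
    end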

antoine-levitt commented on Nov 09 '19