
Type error when constructing `MultivariateOptimizationResults` when using `Float32` and `IPNewton`


Issue

I get a MethodError when constructing the result Optim.MultivariateOptimizationResults while solving an interior-point minimization problem with Float32s. The reason is that, by default, Optim.Options is constructed with Float64 tolerances, and in the optimize method for the constrained problem the types are inferred as

T = typeof(options.f_reltol)
Tf = typeof(value(d))

In my case T = Float64 and Tf = Float32, which is inconsistent with the type parameters expected by MultivariateOptimizationResults.
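For concreteness, the default options carry Float64 tolerances regardless of the problem's element type, so the two inferred types disagree (a small sketch mirroring the inference above):

using Optim

opts = Optim.Options()     # default options: all tolerances are Float64
T = typeof(opts.f_reltol)  # Float64
# whereas for a Float32 objective, Tf = typeof(value(d)) is Float32,
# so T != Tf and no matching MultivariateOptimizationResults method exists.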

MWE

Just adapting the Rosenbrock function example:

using Optim

fun(x) = (1f0 - x[1])^2 + 100f0 * (x[2] - x[1]^2)^2  # Rosenbrock objective in Float32

function fun_grad!(g, x)  # in-place gradient of the Rosenbrock objective
    g[1] = -2f0 * (1f0 - x[1]) - 400f0 * (x[2] - x[1]^2) * x[1]
    g[2] = 200f0 * (x[2] - x[1]^2)
end

function fun_hess!(h, x)  # in-place Hessian of the Rosenbrock objective
    h[1, 1] = 2f0 - 400f0 * x[2] + 1200f0 * x[1]^2
    h[1, 2] = -400f0 * x[1]
    h[2, 1] = -400f0 * x[1]
    h[2, 2] = 200f0
end;


x0 = zeros(Float32, 2)
df = TwiceDifferentiable(fun, fun_grad!, fun_hess!, x0)

res = optimize(df, x0, Newton())

lx = -0.5f0 * ones(Float32, 2); ux = -lx
dfc = TwiceDifferentiableConstraints(lx, ux)

res = optimize(df, dfc, x0, IPNewton()) # ERROR: MethodError: no method matching Optim.MultivariateOptimizationResults(::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, ::Vector{Float32}, ::Vector{Float32}, ::Float32, ::Int64, ::Bool, ::Bool, ::Float64, ::Float64, ::Float32, ::Float32, ::Bool, ::Float64, ::Float64, ::Float32, ::Float32, ::Bool, ::Float64, ::Float32, ::Bool, ::Vector{OptimizationState{Float32, IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}}}, ::Int64, ::Int64, ::Int64, ::Nothing, ::Float64, ::Float64, ::NamedTuple{(), Tuple{}})

One straightforward workaround is, of course, to set all tolerances explicitly to Float32:

optionsFloat32 = Optim.Options(
    x_abstol = 0f0,
    x_reltol = 0f0,
    f_abstol = 0f0,
    f_reltol = 0f0,
    g_abstol = 1f-8,
    g_reltol = 1f-8,
    outer_x_abstol = 0f0,
    outer_x_reltol = 0f0,
    outer_f_abstol = 0f0,
    outer_f_reltol = 0f0,
    outer_g_abstol = 1f-8,
    outer_g_reltol = 1f-8,
)

res = optimize(df, dfc, x0, IPNewton(), optionsFloat32) # * Status: success...

Solution

Would it be fine to simply promote the options to Tf (instead of T) in the MultivariateOptimizationResults construction?

Tf(options.x_abstol) # and so on...
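A minimal sketch of what that promotion could look like (a hypothetical helper, not the actual Optim internals; the field names are the ones from the Options shown above):

# Hypothetical helper: promote the stored tolerances to the objective's
# value type Tf before they are handed to MultivariateOptimizationResults.
promoted_tols(options, Tf) = (
    x_abstol = Tf(options.x_abstol), x_reltol = Tf(options.x_reltol),
    f_abstol = Tf(options.f_abstol), f_reltol = Tf(options.f_reltol),
    g_abstol = Tf(options.g_abstol), g_reltol = Tf(options.g_reltol),
)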

Version and stacktrace of the error for the MWE

The version is Optim v1.7.8.

ERROR: MethodError: no method matching Optim.MultivariateOptimizationResults(::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, ::Vector{Float32}, ::Vector{Float32}, ::Float32, ::Int64, ::Bool, ::Bool, ::Float64, ::Float64, ::Float32, ::Float32, ::Bool, ::Float64, ::Float64, ::Float32, ::Float32, ::Bool, ::Float64, ::Float32, ::Bool, ::Vector{OptimizationState{Float32, IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}}}, ::Int64, ::Int64, ::Int64, ::Nothing, ::Float64, ::Float64, ::NamedTuple{(), Tuple{}})

Closest candidates are:
  Optim.MultivariateOptimizationResults(::O, ::Tx, ::Tx, ::Tf, ::Int64, ::Bool, ::Bool, ::Tf, ::Tf, ::Tc, ::Tc, ::Bool, ::Tf, ::Tf, ::Tc, ::Tc, ::Bool, ::Tf, ::Tc, ::Bool, ::M, ::Int64, ::Int64, ::Int64, ::Tls, ::Float64, ::Float64, ::Tsb) where {O, Tx, Tc, Tf, M, Tls, Tsb}
   @ Optim ~/.julia/packages/Optim/V8ZEC/src/types.jl:179

Stacktrace:
 [1] optimize(d::TwiceDifferentiable{Float32, Vector{Float32}, Matrix{Float32}, Vector{Float32}}, constraints::TwiceDifferentiableConstraints{NLSolversBase.var"#139#142", NLSolversBase.var"#140#143", NLSolversBase.var"#141#144", Float32}, initial_x::Vector{Float32}, method::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, options::Optim.Options{Float64, Nothing}, state::Optim.IPNewtonState{Float32, Vector{Float32}})
   @ Optim ~/.julia/packages/Optim/V8ZEC/src/multivariate/solvers/constrained/ipnewton/interior.jl:297
 [2] optimize(d::TwiceDifferentiable{Float32, Vector{Float32}, Matrix{Float32}, Vector{Float32}}, constraints::TwiceDifferentiableConstraints{NLSolversBase.var"#139#142", NLSolversBase.var"#140#143", NLSolversBase.var"#141#144", Float32}, initial_x::Vector{Float32}, method::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol}, options::Optim.Options{Float64, Nothing})
   @ Optim ~/.julia/packages/Optim/V8ZEC/src/multivariate/solvers/constrained/ipnewton/interior.jl:229
 [3] optimize(d::TwiceDifferentiable{Float32, Vector{Float32}, Matrix{Float32}, Vector{Float32}}, constraints::TwiceDifferentiableConstraints{NLSolversBase.var"#139#142", NLSolversBase.var"#140#143", NLSolversBase.var"#141#144", Float32}, initial_x::Vector{Float32}, method::IPNewton{typeof(Optim.backtrack_constrained_grad), Symbol})
   @ Optim ~/.julia/packages/Optim/V8ZEC/src/multivariate/solvers/constrained/ipnewton/interior.jl:229

NoFishLikeIan avatar Oct 23 '23 15:10 NoFishLikeIan

Yes, a type conversion in the correct place is probably the solution. Are you open to having a stab at it? Otherwise I may have time here in December to look at it. Of course, you can work around it even if it's annoying to set them all.

pkofod avatar Dec 12 '23 08:12 pkofod

I will have a stab at it next week!

NoFishLikeIan avatar Dec 21 '23 08:12 NoFishLikeIan