OptimizationProblems.jl

Add `hs67`

Open · tmigot opened this pull request 3 years ago • 8 comments

#113

tmigot avatar Dec 15 '22 16:12 tmigot

It is still missing the JuMP implementation.

tmigot avatar Dec 15 '22 16:12 tmigot

Codecov Report

Base: 99.87% // Head: 99.78% // Decreases project coverage by -0.08% :warning:

Coverage data is based on head (c4e6e4d) compared to base (91ac8ce). Patch coverage: 86.66% of modified lines in pull request are covered.

Additional details and impacted files
```diff
@@            Coverage Diff             @@
##             main     #235      +/-   ##
==========================================
- Coverage   99.87%   99.78%   -0.09%     
==========================================
  Files         766      768       +2     
  Lines        7011     7056      +45     
==========================================
+ Hits         7002     7041      +39     
- Misses          9       15       +6     
```

| Impacted Files | Coverage Δ |
|---|---|
| src/Meta/hs67.jl | 0.00% <0.00%> (ø) |
| src/ADNLPProblems/hs67.jl | 100.00% <100.00%> (ø) |


codecov[bot] avatar Dec 15 '22 17:12 codecov[bot]

[attachment: hs67]

tmigot avatar May 17 '23 03:05 tmigot

[attachment: hs67-appendix]

tmigot avatar May 17 '23 03:05 tmigot

Hi @odow ! Sorry to bother you, but any chance you would have a suggestion to make a JuMP model for this one?

tmigot avatar May 17 '23 03:05 tmigot

> you would have a suggestion to make a JuMP model

Two options jump out:

  1. a user-defined function
  2. unroll the loop a fixed number of times

let me have a go

odow avatar May 17 '23 03:05 odow
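
For option 1, one possible shape is to wrap the whole fixed-point computation in a plain Julia function and register it with JuMP. This is a rough, untested sketch, assuming the `yvec` helper from the next comment is in scope; the `hs67_obj` and `hs67_y2` names are just placeholders for illustration:

```julia
# Rough sketch of option 1 (user-defined functions), assuming `yvec(x)` is defined.
using JuMP, Ipopt

# black-box objective: run the fixed-point iterations, then evaluate f(x, y(x))
function hs67_obj(x1, x2, x3)
    y = yvec([x1, x2, x3])
    return -(0.063 * y[2] * y[5] - 5.04 * x1 - 3.36 * y[3] - 0.035 * x2 - 10 * x3)
end
# registered functions must return scalars, so each constrained y[i] needs its
# own wrapper; only y[2] is shown here as an example
hs67_y2(x1, x2, x3) = yvec([x1, x2, x3])[2]

model = Model(Ipopt.Optimizer)
x0, ub = [1_745, 12_000, 110.0], [2e3, 1.6e4, 1.2e2]
@variable(model, 1e-5 <= x[i=1:3] <= ub[i], start = x0[i])

# gradients of the black boxes come from ForwardDiff (no Hessians)
register(model, :hs67_obj, 3, hs67_obj; autodiff = true)
register(model, :hs67_y2, 3, hs67_y2; autodiff = true)

@NLobjective(model, Min, hs67_obj(x[1], x[2], x[3]))
@NLconstraint(model, hs67_y2(x[1], x[2], x[3]) >= 0)      # a[1] <= y[2]
@NLconstraint(model, hs67_y2(x[1], x[2], x[3]) <= 5_000)  # y[2] <= a[8]
optimize!(model)
```

The usual caveats would apply: the registered functions provide no Hessians, and the gradients of an iteration that stops on a tolerance are only approximately correct.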

It's not quite the same, but perhaps:

  
```julia
# Iterate the two HS67 fixed-point loops until successive iterates of y[2]
# (first loop) and y[4] (second loop) agree to within 1e-3.
function yvec(x)
    y = similar(x, 8)
    yc = 1.6x[1]
    while true
        y[2] = yc
        y[3] = 1.22y[2] - x[1]
        y[6] = (x[2] + y[3]) / x[1]
        yc = 0.01x[1] * (112 + 13.167y[6] - 0.6667y[6]^2)
        if abs(yc - y[2]) <= 0.001
            break
        end
    end
    yc = 93
    while true
        y[4] = yc
        y[5] = 86.35 + 1.098y[6] - 0.038y[6]^2 + 0.325(y[4] - 89)
        y[8] = 3y[5] - 133
        y[7] = 35.82 - 0.222y[8]
        yc = 98_000x[3] / (y[2] * y[7] + 1_000x[3])
        if abs(yc - y[4]) <= 0.001
            break
        end
    end
    return y
end
```

```julia
k_iter = 10  # number of times the fixed-point loops are unrolled
a = [0, 0, 85, 90, 3, 0.01, 145, 5_000, 2_000, 93, 95, 12, 4, 162]
using JuMP, Ipopt
model = Model(Ipopt.Optimizer)
x0 = [1_745, 12_000, 110.0]
ub = [2e3, 1.6e4, 1.2e2]
@variable(model, 1e-5 <= x[i=1:3] <= ub[i], start = x0[i])
@variable(model, y[i=1:8, 1:k_iter])
@variable(model, y_actual[i=1:8])  # (unused in this sketch)
# unroll both fixed-point loops a fixed number of times
for k in 1:k_iter
    if k == 1
        @constraint(model, y[2, 1] == 1.6x[1])
        @constraint(model, y[4, 1] == 93)
    else
        @NLconstraint(model, y[2, k] == 0.01x[1] * (112 + 13.167y[6, k-1] - 0.6667y[6, k-1]^2))
        @NLconstraint(model, y[4, k] == 98_000x[3] / (y[2, k_iter] * y[7, k-1] + 1_000x[3]))
    end
    @constraint(model, y[3, k] == 1.22y[2, k] - x[1])
    @NLconstraint(model, y[6, k] == (x[2] + y[3, k]) / x[1])
    @constraint(model, y[5, k] == 86.35 + 1.098y[6, k_iter] - 0.038y[6, k_iter]^2 + 0.325(y[4, k] - 89))
    @constraint(model, y[8, k] == 3y[5, k] - 133)
    @constraint(model, y[7, k] == 35.82 - 0.222y[8, k])
end
@objective(model, Min, -(0.063y[2, k_iter] * y[5, k_iter] - 5.04x[1] - 3.36y[3, k_iter] - 0.035x[2] - 10x[3]))
# HS67 bounds a[1:7] <= y[2:8] <= a[8:14], imposed on the final unrolled iterates
@constraint(model, [i=1:7], y[i+1, k_iter] - a[i] >= 0)
@constraint(model, [i=8:14], a[i] - y[i-6, k_iter] >= 0)
optimize!(model)
# re-evaluate the true (fixed-point) objective at the solution of the unrolled model
x_sol = value.(x)
y_sol = yvec(x_sol)
-(0.063y_sol[2] * y_sol[5] - 5.04x_sol[1] - 3.36y_sol[3] - 0.035x_sol[2] - 10x_sol[3])
```
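
As a quick sanity check on the fixed `k_iter` truncation, one could compare the unrolled model's reported objective with the fixed-point value recomputed above (assuming the solve terminated successfully):

```julia
# gap between the unrolled approximation and the true fixed-point objective
obj_unrolled = objective_value(model)
obj_true = -(0.063 * y_sol[2] * y_sol[5] - 5.04 * x_sol[1] - 3.36 * y_sol[3] - 0.035 * x_sol[2] - 10 * x_sol[3])
abs(obj_unrolled - obj_true)
```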

JuMP isn't really built for expression graphs of varying size.

odow avatar May 17 '23 04:05 odow

Ok, I get the idea. Thanks for your help!

tmigot avatar May 17 '23 13:05 tmigot