Solver returns “A feasible solution was not found, so no solution can be loaded”, even though a feasible solution is clearly available.
Summary
I’m running a fairly simple robust optimization problem for which I know a feasible solution exists. The separation problem seems to find an optimal solution many times, but when it reports its results back to the master problem, the master problem fails to find a feasible solution.
I changed the solver settings to make the problem easier: objective_focus = pyros.ObjectiveType.nominal, solve_master_globally = False, and a constant objective of 0, but to no avail. I know the nominal values of the uncertain parameters lie in the uncertainty set, or else PyROS would complain.
I reduced the problem dimensions as much as possible to try to figure out why the solver can’t find this solution. The code excerpt below demonstrates that a feasible solution exists by evaluating the constraints at a known point, yet appsi_ipopt is unable to find it.
Is there anything else I can do to help this problem along? Or is this what I should expect when using IPOPT as the local and global solver using PyROS for a problem like this?
Thanks.
Steps to reproduce the issue
import numpy as np
import pyomo.environ as pe

def create_model():
    m = pe.ConcreteModel()

    m.del_component('N')
    m.add_component('N', pe.Param(mutable=True, within=pe.Reals, initialize=2))
    N = pe.value(m.N)

    m.del_component('n')
    m.add_component('n', pe.Param(mutable=True, within=pe.Reals, initialize=3))
    n = pe.value(m.n)

    def x_0_init(m, i):
        x_0_val = 0.5*np.ones((1, 3))
        return x_0_val[0, i]
    m.del_component('x_0_val')
    m.add_component('x_0_val', pe.Param(range(n), mutable=True, initialize=x_0_init))

    def v_init(m, i):
        v = np.ones((1, n))
        return v[0, i]
    m.del_component('v')
    m.add_component('v', pe.Param(range(n), mutable=True, initialize=v_init))

    def h_init(m, i):
        h = np.ones((1, n))
        return h[0, i]
    m.del_component('h')
    m.add_component('h', pe.Param(range(n), mutable=True, initialize=h_init))

    m.del_component('T_s')
    m.add_component('T_s', pe.Param(mutable=True, within=pe.Reals, initialize=1))

    def r_init(m, i):
        r = np.array([[1, 0, 0]])
        return r[0, i]
    m.del_component('r')
    m.add_component('r', pe.Param(range(n), mutable=True, initialize=r_init))  # 1 by n

    def gamma_init(m, i):
        gamma = np.array([1000, 100, 100])
        return gamma[i]
    m.del_component('gamma')
    m.add_component('gamma', pe.Param(range(n), mutable=True, initialize=gamma_init))  # length n

    return m
m = create_model()
q = 1
Q = 2
n = pe.value(m.n)
N = pe.value(m.N)

m.del_component('q')
m.add_component('q', pe.Param(mutable=True, within=pe.NonNegativeIntegers, initialize=q))
m.del_component('Q')
m.add_component('Q', pe.Param(mutable=True, within=pe.NonNegativeIntegers, initialize=Q))

def d_init(m, ix):
    if ix == m.q.value:
        return 1
    return 0
m.del_component('d')
m.add_component('d', pe.Param(range(m.Q.value), mutable=True, initialize=d_init))

l_optimal = np.array([[0, 0, 0, 1, 0, 0]])
l_value = np.concatenate([np.zeros((1, N*n*(q-1))), l_optimal, np.zeros((1, (Q-q)*n*N))], axis=1)
def l_init(m, ix):
    return l_value[0, ix]
m.del_component('l')
m.add_component('l', pe.Param(range(n*Q*N), mutable=True, initialize=l_init))

m.del_component('L')
m.add_component('L', pe.Var(range(n), range(n*Q*N), within=pe.Reals))

n = pe.value(m.n)
N = pe.value(m.N)
L_1 = np.eye(n, n)
L_2 = np.zeros((n, n))
L_2[1, 0] = 1
L_2[2, 1] = 1
L_optimal = np.concatenate([L_1, L_2], axis=1)
L_value = np.concatenate([np.zeros((n, q*n*N)), L_optimal, np.zeros((n, (Q-q-1)*n*N))], axis=1)

def assign_values(model, L_value):
    [n_rows, n_cols] = np.shape(L_value)
    for i in range(n_rows):
        for j in range(n_cols):
            model.L[i, j].value = L_value[i, j]
    return
assign_values(m, L_value)
m.del_component('c1')
m.add_component('c1', pe.Constraint(expr = 0 <= m.x_0_val[0]*m.d[0]*m.L[0,0] + m.x_0_val[1]*m.d[0]*m.L[0,1] + m.d[0]*m.L[0,2]*m.x_0_val[2] + m.d[1]*m.L[0,6]*m.x_0_val[0] + m.d[1]*m.L[0,7]*m.x_0_val[1] + m.d[1]*m.L[0,8]*m.x_0_val[2] + (m.d[0]*m.l[0] + m.d[1]*m.l[6])))
m.del_component('c2')
m.add_component('c2', pe.Constraint(expr = 0 <= m.x_0_val[0]*m.d[0]*m.L[0,3] + m.x_0_val[1]*m.d[0]*m.L[0,4] + m.d[0]*m.L[0,5]*m.x_0_val[2] + m.d[1]*m.L[0,9]*m.x_0_val[0] + m.d[1]*m.L[0,10]*m.x_0_val[1] + m.d[1]*m.L[0,11]*m.x_0_val[2] + (m.d[0]*m.l[3] + m.d[1]*m.l[9])))
m.del_component('c3')
m.add_component('c3', pe.Constraint(expr = 0 <= m.x_0_val[0]*m.d[0]*m.L[1,0] + m.x_0_val[1]*m.d[0]*m.L[1,1] + m.d[0]*m.L[1,2]*m.x_0_val[2] + m.d[1]*m.L[1,6]*m.x_0_val[0] + m.d[1]*m.L[1,7]*m.x_0_val[1] + m.d[1]*m.L[1,8]*m.x_0_val[2] + (m.d[0]*m.l[1] + m.d[1]*m.l[7])))
m.del_component('c4')
m.add_component('c4', pe.Constraint(expr = 0 <= m.x_0_val[0]*m.d[0]*m.L[1,3] + m.x_0_val[1]*m.d[0]*m.L[1,4] + m.d[0]*m.L[1,5]*m.x_0_val[2] + m.d[1]*m.L[1,9]*m.x_0_val[0] + m.d[1]*m.L[1,10]*m.x_0_val[1] + m.d[1]*m.L[1,11]*m.x_0_val[2] + (m.d[0]*m.l[4] + m.d[1]*m.l[10])))
m.del_component('c5')
m.add_component('c5', pe.Constraint(expr = 0 <= m.x_0_val[0]*m.d[0]*m.L[2,0] + m.x_0_val[1]*m.d[0]*m.L[2,1] + m.d[0]*m.L[2,2]*m.x_0_val[2] + m.d[1]*m.L[2,6]*m.x_0_val[0] + m.d[1]*m.L[2,7]*m.x_0_val[1] + m.d[1]*m.L[2,8]*m.x_0_val[2] + (m.d[0]*m.l[2] + m.d[1]*m.l[8])))
m.del_component('c6')
m.add_component('c6', pe.Constraint(expr = 0 <= m.x_0_val[0]*m.d[0]*m.L[2,3] + m.x_0_val[1]*m.d[0]*m.L[2,4] + m.d[0]*m.L[2,5]*m.x_0_val[2] + m.d[1]*m.L[2,9]*m.x_0_val[0] + m.d[1]*m.L[2,10]*m.x_0_val[1] + m.d[1]*m.L[2,11]*m.x_0_val[2] + (m.d[0]*m.l[5] + m.d[1]*m.l[11])))
m.del_component('c7')
m.add_component('c7', pe.Constraint(expr = m.x_0_val[0]*m.d[0]*m.L[0,0] + m.x_0_val[1]*m.d[0]*m.L[0,1] + m.d[0]*m.L[0,2]*m.x_0_val[2] + m.d[1]*m.L[0,6]*m.x_0_val[0] + m.d[1]*m.L[0,7]*m.x_0_val[1] + m.d[1]*m.L[0,8]*m.x_0_val[2] + (m.d[0]*m.l[0] + m.d[1]*m.l[6]) <= m.x_0_val[0]))
m.del_component('c8')
m.add_component('c8', pe.Constraint(expr = m.x_0_val[0]*m.d[0]*m.L[0,3] + m.x_0_val[1]*m.d[0]*m.L[0,4] + m.d[0]*m.L[0,5]*m.x_0_val[2] + m.d[1]*m.L[0,9]*m.x_0_val[0] + m.d[1]*m.L[0,10]*m.x_0_val[1] + m.d[1]*m.L[0,11]*m.x_0_val[2] + (m.d[0]*m.l[3] + m.d[1]*m.l[9]) <= m.x_0_val[0] + m.r[0]*m.T_s - m.T_s*(m.d[0]*m.l[0] + m.d[1]*m.l[6] + m.x_0_val[0]*m.d[0]*m.L[0,0] + m.x_0_val[1]*m.d[0]*m.L[0,1] + m.x_0_val[2]*m.d[0]*m.L[0,2] + m.x_0_val[0]*m.d[1]*m.L[0,6] + m.x_0_val[1]*m.d[1]*m.L[0,7] + m.x_0_val[2]*m.d[1]*m.L[0,8])))
m.del_component('c9')
m.add_component('c9', pe.Constraint(expr = m.x_0_val[0]*m.d[0]*m.L[1,0] + m.x_0_val[1]*m.d[0]*m.L[1,1] + m.d[0]*m.L[1,2]*m.x_0_val[2] + m.d[1]*m.L[1,6]*m.x_0_val[0] + m.d[1]*m.L[1,7]*m.x_0_val[1] + m.d[1]*m.L[1,8]*m.x_0_val[2] + (m.d[0]*m.l[1] + m.d[1]*m.l[7]) <= m.x_0_val[1]))
m.del_component('c10')
m.add_component('c10', pe.Constraint(expr = m.x_0_val[0]*m.d[0]*m.L[1,3] + m.x_0_val[1]*m.d[0]*m.L[1,4] + m.d[0]*m.L[1,5]*m.x_0_val[2] + m.d[1]*m.L[1,9]*m.x_0_val[0] + m.d[1]*m.L[1,10]*m.x_0_val[1] + m.d[1]*m.L[1,11]*m.x_0_val[2] + (m.d[0]*m.l[4] + m.d[1]*m.l[10]) <= m.x_0_val[1] + m.v[1]/m.h[1]*(1 - m.r[1])*(m.v[1]/m.h[1])*m.T_s*(m.x_0_val[0]*m.d[0]*m.L[0,0] + m.x_0_val[1]*m.d[0]*m.L[0,1] + m.d[0]*m.L[0,2]*m.x_0_val[2] + m.d[1]*m.L[0,6]*m.x_0_val[0] + m.d[1]*m.L[0,7]*m.x_0_val[1] + m.d[1]*m.L[0,8]*m.x_0_val[2] + (m.d[0]*m.l[0] + m.d[1]*m.l[6])) - m.T_s*(m.d[0]*m.l[1] + m.d[1]*m.l[7] + m.x_0_val[0]*m.d[0]*m.L[1,0] + m.x_0_val[1]*m.d[0]*m.L[1,1] + m.x_0_val[2]*m.d[0]*m.L[1,2] + m.x_0_val[0]*m.d[1]*m.L[1,6] + m.x_0_val[1]*m.d[1]*m.L[1,7] + m.x_0_val[2]*m.d[1]*m.L[1,8])))
m.del_component('c11')
m.add_component('c11', pe.Constraint(expr = m.x_0_val[0]*m.d[0]*m.L[2,0] + m.x_0_val[1]*m.d[0]*m.L[2,1] + m.d[0]*m.L[2,2]*m.x_0_val[2] + m.d[1]*m.L[2,6]*m.x_0_val[0] + m.d[1]*m.L[2,7]*m.x_0_val[1] + m.d[1]*m.L[2,8]*m.x_0_val[2] + (m.d[0]*m.l[2] + m.d[1]*m.l[8]) <= m.x_0_val[2]))
m.del_component('c12')
m.add_component('c12', pe.Constraint(expr = m.x_0_val[0]*m.d[0]*m.L[2,3] + m.x_0_val[1]*m.d[0]*m.L[2,4] + m.d[0]*m.L[2,5]*m.x_0_val[2] + m.d[1]*m.L[2,9]*m.x_0_val[0] + m.d[1]*m.L[2,10]*m.x_0_val[1] + m.d[1]*m.L[2,11]*m.x_0_val[2] + (m.d[0]*m.l[5] + m.d[1]*m.l[11]) <= m.x_0_val[2] + m.v[2]/m.h[2]*(1 - m.r[2])*(m.v[2]/m.h[2])*m.T_s*(m.x_0_val[0]*m.d[0]*m.L[1,0] + m.x_0_val[1]*m.d[0]*m.L[1,1] + m.d[0]*m.L[1,2]*m.x_0_val[2] + m.d[1]*m.L[1,6]*m.x_0_val[0] + m.d[1]*m.L[1,7]*m.x_0_val[1] + m.d[1]*m.L[1,8]*m.x_0_val[2] + (m.d[0]*m.l[1] + m.d[1]*m.l[7])) - m.T_s*(m.d[0]*m.l[2] + m.d[1]*m.l[8] + m.x_0_val[0]*m.d[0]*m.L[2,0] + m.x_0_val[1]*m.d[0]*m.L[2,1] + m.x_0_val[2]*m.d[0]*m.L[2,2] + m.x_0_val[0]*m.d[1]*m.L[2,6] + m.x_0_val[1]*m.d[1]*m.L[2,7] + m.x_0_val[2]*m.d[1]*m.L[2,8])))
# ===== Show that the constraints are satisfied for the above values of the
# decision variable L and the fixed parameter "l" =====
def all_satisfied():
    # Evaluate constraints c1 through c6 (n*N of them) at the current values.
    truth_array = []
    for i in range(n):
        for k in range(N):
            con = getattr(m, 'c' + str((i*2) + k + 1))
            truth_array.append(pe.value(con.expr))
    return all(truth_array)

if all_satisfied():
    print('feasible')
else:
    print('infeasible')
# === Specify the objective function ===
m.del_component('objective_function')
m.add_component('objective_function', pe.Objective(expr=0, sense=pe.minimize))
import pyomo.contrib.pyros as pyros
# === Instantiate the PyROS solver object ===
pyros_solver = pe.SolverFactory("pyros")
gamma = [pe.value(m.gamma[0]), pe.value(m.gamma[1]), pe.value(m.gamma[2])]
h = [pe.value(m.h[0]), pe.value(m.h[1]), pe.value(m.h[2])]
max_densities = np.array([gamma_h[0]*gamma_h[1] for gamma_h in zip(gamma, h)])
bounds_x_0_val = list(zip([0]*m.n.value, max_densities))
bounds_d = [(0,1)]*m.Q.value
# === Specify which parameters are uncertain ===
uncertain_parameters = [m.d, m.x_0_val]
# === Construct the desirable uncertainty set ===
uncertainty_set = pyros.BoxSet(bounds = bounds_x_0_val + bounds_d)
# === Designate local and global NLP solvers ===
ipopt_solver = pe.SolverFactory('appsi_ipopt')
local_solver = ipopt_solver
global_solver = ipopt_solver
# === Designate which variables correspond to first- and second-stage degrees of freedom ===
first_stage_variables = [m.L]
second_stage_variables = []
# The remaining variables are implicitly designated to be state variables
# === Call PyROS to solve the robust optimization problem ===
results = pyros_solver.solve(model=m,
                             first_stage_variables=first_stage_variables,
                             second_stage_variables=second_stage_variables,
                             uncertain_params=uncertain_parameters,
                             uncertainty_set=uncertainty_set,
                             local_solver=local_solver,
                             global_solver=global_solver,
                             tee=True,
                             options={
                                 "objective_focus": pyros.ObjectiveType.nominal,
                                 "solve_master_globally": False,
                                 "load_solution": False,
                             })
# === Query results ===
time = results.time
iterations = results.iterations
termination_condition = results.pyros_termination_condition
objective = results.final_objective_value
# === Print some results ===
single_stage_final_objective = round(objective, -1)
print("Final objective value: %s" % single_stage_final_objective)
@makansij:
- Just to be clear, have you confirmed that your solution is robust feasible (i.e. feasible for your model constraints under all parameter realizations in the uncertainty set you have constructed, and not just the nominal realization), and/or more broadly, that your model is robust feasible?
- PyROS solves multiple separation subproblems per iteration (one problem for each inequality constraint) to determine whether there exist any points in your uncertainty set under which that inequality constraint is violated by the first-stage variable solution found in the most recently solved master problem. All separation problems are required to be solved to an optimality condition. (Otherwise, PyROS will terminate with a subsolver_error condition at the point it detects a separation problem cannot be solved to optimality.)
- By default, all separation problems are first solved to local optimality using the local_solver passed to PyROS. If a violated constraint is found, then PyROS updates the master problem and a new iteration begins. Otherwise, PyROS solves all separation problems to global optimality using the global_solver. If a violated constraint is found, then PyROS updates the master problem and a new iteration begins; otherwise, PyROS terminates with a robust_feasible or robust_optimal condition. (Pass tee=False and the PyROS iteration/termination progress becomes clearer.) Are you sure appsi_ipopt is able to solve NLPs to global optimality (or, more precisely, to an Optimal or globallyOptimal termination condition)? (A sketch of inspecting the reported termination condition follows below.)
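A minimal sketch, assuming the model m built in the excerpt above and an ipopt executable on the path, of how one might inspect the termination condition that appsi_ipopt reports for a single NLP; the attribute path res.solver.termination_condition is the usual legacy-results location, though the exact value depends on the Pyomo version:

import pyomo.environ as pe

# Solve the deterministic model once with the same subsolver given to PyROS,
# without loading the solution, and inspect the reported termination condition.
opt = pe.SolverFactory('appsi_ipopt')
res = opt.solve(m, load_solutions=False, tee=False)
print(res.solver.termination_condition)
# IPOPT is a local NLP solver: at best it reports (locally) optimal termination;
# it cannot certify global optimality on a nonconvex problem.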
@shermanjasonaf Thanks a lot for clarifying. I'll confirm and get back to you.
@shermanjasonaf Thanks for the detailed information. Responses to each:
- I don’t have a formal proof showing that my solution is robust feasible. But since the uncertainty set is a box, I did a simple and very granular grid search over all values inside the box, and the values above are feasible against all of them (see the sketch after this list). Am I missing an easier way to check this?
- If I understand you correctly, the manual check in 1) above isn’t necessary given the procedure you described in 2). If the first-stage variable violated any of the inequalities for a realization in the uncertainty set, it would have thrown a “subsolver_error” - correct?
- No, I’m not sure that appsi_ipopt is able to solve NLPs to global optimality, hence the question “is this what I should expect when using IPOPT?”. I’m willing to try other NLP solvers such as COUENNE, per our conversation here: https://github.com/Pyomo/pyomo/issues/2428. Are there any others besides ipopt and couenne that integrate with PyROS and might solve to global optimality?
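A minimal sketch of the kind of grid search described in the first item above, assuming the model m, the bounds lists bounds_d and bounds_x_0_val, and the all_satisfied() helper from the code excerpt; the function name grid_search_feasible and the grid resolution n_grid are illustrative only:

import itertools
import numpy as np

def grid_search_feasible(m, bounds_d, bounds_x_0_val, n_grid=11):
    """Evaluate the fixed L/l values against a grid of uncertain-parameter realizations."""
    d_grids = [np.linspace(lo, hi, n_grid) for (lo, hi) in bounds_d]
    x0_grids = [np.linspace(lo, hi, n_grid) for (lo, hi) in bounds_x_0_val]
    for d_vals in itertools.product(*d_grids):
        for x0_vals in itertools.product(*x0_grids):
            # Set the mutable Params to this realization ...
            for i, val in enumerate(d_vals):
                m.d[i] = val
            for i, val in enumerate(x0_vals):
                m.x_0_val[i] = val
            # ... and re-evaluate the constraint expressions at the fixed L and l.
            if not all_satisfied():
                return False, d_vals, x0_vals
    return True, None, None

feasible, d_bad, x0_bad = grid_search_feasible(m, bounds_d, bounds_x_0_val)
print(feasible, d_bad, x0_bad)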
@makansij
- If you know in advance (via formal proof) that your model solution (and therefore your RO model) is robust feasible, then we would hope that PyROS either (1) certifies this solution (or possibly some nearby locally optimal solution) as robust feasible in a single iteration, or (2) otherwise converges to a robust feasible solution within a few iterations. A grid search is one of the simpler checks I can think of, though it may not be sufficient for a formal proof, since the box set is continuous. For many of the models I've worked with under box uncertainty, violating realizations are most likely to be found along the set boundary (such as at one of the vertices)---but this is model-dependent, of course.
- The purpose of each separation problem is to determine a/the realization from the uncertainty set which maximizes the violation of an inequality constraint taken from your original model, subject to the model's equality constraints. (Hence, each PyROS iteration consists of solving as many separation subproblems as there are inequality constraints in your model.) The subsolver_error condition is returned only if a subproblem cannot be solved to an acceptable optimality condition. It is very much possible---and indeed expected if the most recent master problem solution is not robust feasible---that a violated inequality constraint (i.e. a separation subproblem with an optimal value greater than a nonnegative relative tolerance) is found during the separation phase.
- Can you share what is printed to your console when you run the script with tee=False, and any messages printed at the end if you run with tee=True? COUENNE is the only open-source global solver that I've worked with previously. You could opt out of solving separation problems to global optimality altogether by passing the option bypass_global_separation=True. But in this case, we cannot guarantee that any solution returned by PyROS is actually robust feasible---only heuristically so. (Both alternatives are sketched below.)
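A minimal sketch of the two alternatives mentioned above, assuming a couenne executable is visible to Pyomo and reusing the model and solver objects from the excerpt; the exact call shown is illustrative rather than taken from the thread:

import pyomo.environ as pe

# Alternative 1: designate COUENNE (a deterministic global NLP solver) as the
# global subsolver, so separation problems can be certified globally optimal.
global_solver = pe.SolverFactory('couenne')

# Alternative 2: keep IPOPT as the global subsolver but skip global separation
# entirely; any solution PyROS then returns is only heuristically robust feasible.
results = pyros_solver.solve(
    model=m,
    first_stage_variables=first_stage_variables,
    second_stage_variables=second_stage_variables,
    uncertain_params=uncertain_parameters,
    uncertainty_set=uncertainty_set,
    local_solver=local_solver,
    global_solver=global_solver,
    options={"bypass_global_separation": True},  # only needed for Alternative 2
)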
Sure. Here's the traceback for when tee=True:
EXIT: Converged to a point of local infeasibility. Problem may be infeasible.
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-3-4acf0e4bb7aa> in <module>
211 "objective_focus": pyros.ObjectiveType.nominal,
212 "solve_master_globally": False,
--> 213 "load_solution":False
214 })
215
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/pyros.py in solve(self, model, first_stage_variables, second_stage_variables, uncertain_params, uncertainty_set, local_solver, global_solver, **kwds)
431
432 # === Solve and load solution into model
--> 433 pyros_soln, final_iter_separation_solns = ROSolver_iterative_solve(model_data, config)
434
435
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/pyros_algorithm_methods.py in ROSolver_iterative_solve(model_data, config)
161 # === Solve Master Problem
162 config.progress_logger.info("PyROS working on iteration %s..." % k)
--> 163 master_soln = master_problem_methods.solve_master(model_data=master_data, config=config)
164 #config.progress_logger.info("Done solving Master Problem!")
165 master_soln.master_problem_subsolver_statuses = []
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/master_problem_methods.py in solve_master(model_data, config)
523
524 return solver_call_master(model_data=model_data, config=config, solver=solver,
--> 525 solve_data=master_soln)
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/master_problem_methods.py in solver_call_master(model_data, config, solver, solve_data)
466 solver = backup_solvers.pop(0)
467 try:
--> 468 results = solver.solve(nlp_model, tee=config.tee)
469 except ValueError as err:
470 if 'Cannot load a SolverResults object with bad status: error' in str(err):
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/base.py in solve(self, model, tee, load_solutions, logfile, solnfile, timelimit, report_timing, solver_io, suffixes, options, keepfiles, symbolic_solver_labels)
1234 self.options = options
1235
-> 1236 results: Results = super(LegacySolverInterface, self).solve(model)
1237
1238 legacy_results = LegacySolverResults()
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/solvers/ipopt.py in solve(self, model, timer)
264 self._writer.write(model, self._filename+'.nl', timer=timer)
265 timer.stop('write nl file')
--> 266 res = self._apply_solver(timer)
267 self._last_results_object = res
268 if self.config.report_timing:
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/solvers/ipopt.py in _apply_solver(self, timer)
432 else:
433 timer.start('parse solution')
--> 434 results = self._parse_sol()
435 timer.stop('parse solution')
436
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/solvers/ipopt.py in _parse_sol(self)
371 results.best_feasible_objective = value(obj_expr_evaluated)
372 elif self.config.load_solution:
--> 373 raise RuntimeError('A feasible solution was not found, so no solution can be loaded.'
374 'Please set opt.config.load_solution=False and check '
375 'results.termination_condition and '
RuntimeError: A feasible solution was not found, so no solution can be loaded.Please set opt.config.load_solution=False and check results.termination_condition and resutls.best_feasible_objective before loading a solution.
Now here's the entire output of the code excerpt I posted above, for tee=False:
l_value is [[0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]]
L_value is [[0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 1. 0. 1. 0. 0.]
[0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 1. 0.]]
feasible
===========================================================================================
PyROS: Pyomo Robust Optimization Solver v.1.1.1
Developed by Natalie M. Isenberg (1), John D. Siirola (2), Chrysanthos E. Gounaris (1)
(1) Carnegie Mellon University, Department of Chemical Engineering
(2) Sandia National Laboratories, Center for Computing Research
The developers gratefully acknowledge support from the U.S. Department of Energy's
Institute for the Design of Advanced Energy Systems (IDAES)
===========================================================================================
======================================== DISCLAIMER =======================================
PyROS is still under development.
Please provide feedback and/or report any issues by opening a Pyomo ticket.
===========================================================================================
INFO: PyROS working on iteration 0...
INFO: PyROS working on iteration 1...
INFO: PyROS working on iteration 2...
INFO: PyROS working on iteration 3...
INFO: PyROS working on iteration 4...
INFO: PyROS working on iteration 5...
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-3-dda1af16af8f> in <module>
211 "objective_focus": pyros.ObjectiveType.nominal,
212 "solve_master_globally": False,
--> 213 "load_solution":False
214 })
215
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/pyros.py in solve(self, model, first_stage_variables, second_stage_variables, uncertain_params, uncertainty_set, local_solver, global_solver, **kwds)
431
432 # === Solve and load solution into model
--> 433 pyros_soln, final_iter_separation_solns = ROSolver_iterative_solve(model_data, config)
434
435
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/pyros_algorithm_methods.py in ROSolver_iterative_solve(model_data, config)
161 # === Solve Master Problem
162 config.progress_logger.info("PyROS working on iteration %s..." % k)
--> 163 master_soln = master_problem_methods.solve_master(model_data=master_data, config=config)
164 #config.progress_logger.info("Done solving Master Problem!")
165 master_soln.master_problem_subsolver_statuses = []
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/master_problem_methods.py in solve_master(model_data, config)
523
524 return solver_call_master(model_data=model_data, config=config, solver=solver,
--> 525 solve_data=master_soln)
/usr/local/lib/python3.7/site-packages/pyomo/contrib/pyros/master_problem_methods.py in solver_call_master(model_data, config, solver, solve_data)
466 solver = backup_solvers.pop(0)
467 try:
--> 468 results = solver.solve(nlp_model, tee=config.tee)
469 except ValueError as err:
470 if 'Cannot load a SolverResults object with bad status: error' in str(err):
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/base.py in solve(self, model, tee, load_solutions, logfile, solnfile, timelimit, report_timing, solver_io, suffixes, options, keepfiles, symbolic_solver_labels)
1234 self.options = options
1235
-> 1236 results: Results = super(LegacySolverInterface, self).solve(model)
1237
1238 legacy_results = LegacySolverResults()
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/solvers/ipopt.py in solve(self, model, timer)
264 self._writer.write(model, self._filename+'.nl', timer=timer)
265 timer.stop('write nl file')
--> 266 res = self._apply_solver(timer)
267 self._last_results_object = res
268 if self.config.report_timing:
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/solvers/ipopt.py in _apply_solver(self, timer)
432 else:
433 timer.start('parse solution')
--> 434 results = self._parse_sol()
435 timer.stop('parse solution')
436
/usr/local/lib/python3.7/site-packages/pyomo/contrib/appsi/solvers/ipopt.py in _parse_sol(self)
371 results.best_feasible_objective = value(obj_expr_evaluated)
372 elif self.config.load_solution:
--> 373 raise RuntimeError('A feasible solution was not found, so no solution can be loaded.'
374 'Please set opt.config.load_solution=False and check '
375 'results.termination_condition and '
RuntimeError: A feasible solution was not found, so no solution can be loaded.Please set opt.config.load_solution=False and check results.termination_condition and resutls.best_feasible_objective before loading a solution.
Looks like your local solver could not find a feasible solution to the master problem of iteration 5. It's possible that this master problem is indeed infeasible (and hence your model is robust infeasible). However, your local solver doesn't guarantee that. Our protocol in the event of encountering an infeasible (or locally infeasible) master problem is to return a robust_infeasible termination condition.
So, we will need to improve the routine for calling the subsolver(s) within PyROS to handle this termination more appropriately. Here's what I envision:
- invoking the subsolver with load_solutions=False, and then checking the subsolver termination statuses afterwards to determine how to proceed (a user-level version of this is sketched after this list)
- if the subsolver returns an infeasible termination condition, then PyROS returns a robust_infeasible termination condition (and, in the case that the master problem is solved locally, possibly logs a caveat that this is due to a local infeasibility)
- returning (as an attribute of the PyROS results object), or otherwise logging, the uncertain parameter realizations under which the original model constraints were explicitly enforced for the master problem.
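For the manual workaround that the RuntimeError message itself suggests (setting opt.config.load_solution=False and then checking the termination condition and best feasible objective), a minimal sketch using the appsi Ipopt interface directly on a model m:

from pyomo.contrib import appsi

# Solve without attempting to load a (possibly nonexistent) solution, then
# branch on the reported termination condition instead of hitting a RuntimeError.
opt = appsi.solvers.Ipopt()
opt.config.load_solution = False
res = opt.solve(m)
print(res.termination_condition)
print(res.best_feasible_objective)  # None when no feasible point was found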
@shermanjasonaf, thanks again for your help. I want to try to understand:
- You’ll see in my code excerpt above that I’m already setting load_solution=False. I assume this is to ensure that the next set of iterations of the sub-solver starts from a fresh (perhaps random?) initial point?
- I tried setting bypass_global_separation=True, but this didn’t change anything.
- The GRCS algorithm (and hence PyROS) does not guarantee to return a robust feasible solution if one exists – it just makes it easier to identify one? And should the user then always be ready with a proof that a robust feasible solution exists?
@makansij
- As per the documentation, the option load_solution only dictates whether PyROS should load any solution found by the end of the PyROS algorithm. Moreover, if load_solution is set to True, then a solution is loaded only when PyROS terminates with a robust_feasible or robust_optimal status. This option doesn't affect the subproblem initializations; that is handled automatically by PyROS.
- Since the traceback you shared comes from attempting to solve a master problem, it makes sense that the outcome is unchanged when you run with bypass_global_separation=True (as it seems PyROS did not make use of global separation in the earlier iterations anyway). The issue is not with the separation problems, but that a master problem may be---but is not necessarily---infeasible (and therefore your model may be robust infeasible), and PyROS does not handle the subsolver termination gracefully enough to return this result. Moreover, you may need to pass a global solver and set solve_master_globally=True for a true certificate of robust infeasibility.
- Unless you set bypass_global_separation=True, any solution returned by PyROS with a robust_optimal or robust_feasible status is proven to be robust feasible (up to the relative tolerance robust_feasibility_tolerance). The proof is that PyROS has solved all separation problems to global optimality without finding a violating realization. You do not need to be ready with a proof of robust feasibility/infeasibility of your model in advance; if you do have one, it may inform your expectation(s) of the PyROS termination outcome. (A sketch of checking the returned termination status follows below.)
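A minimal sketch of checking the status PyROS reports before trusting or loading a solution, assuming the results object returned by the solve call in the excerpt; pyrosTerminationCondition is the enum documented for PyROS results:

import pyomo.contrib.pyros as pyros

tc = results.pyros_termination_condition
if tc in (pyros.pyrosTerminationCondition.robust_optimal,
          pyros.pyrosTerminationCondition.robust_feasible):
    # Certified robust feasible (globally, unless bypass_global_separation=True).
    print("objective %s after %s iterations"
          % (results.final_objective_value, results.iterations))
elif tc == pyros.pyrosTerminationCondition.robust_infeasible:
    print("model certified robust infeasible")
else:
    print("inconclusive termination:", tc)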
@shermanjasonaf I have one more question regarding the robust feasible solution returned by PyROS: If the solution is verified robust feasible by PyROS, will the first stage variables have values for the optimal solution? (in other words, the first stage variables should not be “NaN”s in this case – right?)
Are you actually referring to NaN (as in float("nan") or numpy.nan), or do you mean None?
I don't expect a first-stage variable value in the solution returned by PyROS to be NaN---unless the first-stage variable was initialized to NaN when you passed the model to PyROS and either:
- you ran PyROS with load_solution=False, or
- the first-stage variable does not participate in any of the model objective or constraint expressions (either because it is 'canceled out' of the expressions algebraically or not explicitly declared to be part of any of the expressions).
(A quick check is sketched below.)
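A minimal sketch of such a check on the first-stage variables, assuming the model m and the indexed variable m.L from the excerpt:

import math

# Report any first-stage variable whose value is missing (None) or NaN
# after the PyROS solve.
for idx, var in m.L.items():
    if var.value is None or math.isnan(var.value):
        print("L[%s] has no usable value: %s" % (idx, var.value))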
Hi @shermanjasonaf, I ran some simulations on my own. I want to confirm some things with what you said above:
- If PyROS returns “No feasible solution found” and solve_master_globally = False, then it’s possible that the master problem actually is robust feasible, but it is locally robust infeasible.
- If PyROS returns “No feasible solution found” and solve_master_globally = True, then the master problem is robust infeasible.
However, if PyROS returns “No feasible solution found” in either of the above cases, there seems to be no easy way to determine the uncertain parameter realizations for which no feasible solution is found. (This would be a future improvement to PyROS, you seem to be suggesting.)
On the other hand, if PyROS returns a robust feasible solution, then that’s sufficient to guarantee that the master problem is robust feasible.
If we are in case 1), is there a strategy to change how the problem is solved? Would you recommend changing the initial conditions? Or using a different solver? Or changing robust_feasibility_tolerance?
Hi @makansij:
- It is possible that the model is robust feasible or robust infeasible. You could say that the model is "locally robust infeasible", but this concept is currently not mathematically well defined for our purposes.
- The model is robust infeasible, provided the global subsolver has certified global infeasibility of the master problem.
In either case, we should handle the subsolver termination gracefully and return a robust_infeasible termination condition as an attribute of the solver results. This requires improved exception and termination handling within the PyROS subsolver call routines.
Indeed, it would be helpful to have some idea of the uncertain parameter realization(s) for which no feasible second-stage adjustment to some fixed choice of first-stage variable values exists. So a potential and worthwhile improvement would likely involve adding the following to the results object returned by PyROS:
- the uncertain parameter scenarios against which the model constraints are enforced in the most recently solved master problem
- the solution to the last master problem solved to optimality
You could consider changing the initial point, the subsolver(s), or the robust feasibility tolerance. You may also consider increasing the decision rule order (decision_rule_order) to increase the second-stage adaptivity. Any of these changes may result in a change in the trajectory of the iterates and/or the final PyROS outcome. (These options are sketched below.)
@shermanjasonaf I constructed a minimal example, a scalar robust optimization problem, to illustrate what I think is happening on a larger scale. Instead of extending this thread, I started a new one here: https://github.com/Pyomo/pyomo/issues/2501