peculiar NLP sensitivity analysis cases
Don't get why this fails:
from casadi import *
x = MX.sym("x")
p = MX.sym("p")
y = MX.sym("y")
solver = nlpsol("solver","sqpmethod",{"x":vertcat(x,y),"p":p,"f":(x-p)**2,"g":vertcat(y-x)})
print(solver(p=0,lbg=0,ubg=0))
f = Function("J",[p],[jacobian(solver(p=p,lbg=0,ubg=0)["x"],p)])
print(f(0))
This occurs naturally in the last gap-closing constraint of a multiple-shooting code with an objective depending only on a subset of states.
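For reference, a minimal sketch of that structure (my own illustration, not part of the original report; the one-step map F below is a trivial stand-in for a real integrator): the objective involves only X0, while the final gap-closing constraint brings in X1, reproducing the (x, y, p) pattern of the example above.
import casadi as ca

p = ca.MX.sym("p")
X0 = ca.MX.sym("X0")
X1 = ca.MX.sym("X1")
F = lambda xk: xk                     # trivial stand-in for the integrator map
nlp = {"x": ca.vertcat(X0, X1),
       "p": p,
       "f": (X0 - p)**2,              # objective depends on a subset of states
       "g": X1 - F(X0)}               # gap-closing (continuity) constraint
solver = ca.nlpsol("solver", "sqpmethod", nlp)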
Debugging:
import casadi.*
opti = Opti();
x = opti.variable();
p = opti.parameter();
y = opti.variable();
opti.minimize((x-p)^2)
opti.subject_to(y==x)
opti.solver('ipopt')
opti.set_value(p, 0)
sol = opti.solve();
sol.value(x)
sol.value(y)
sol.value(opti.lam_g)
% Lagrange function
lag = opti.f+opti.lam_g'*opti.g;
% Lagrange Hessian
H = sol.value(hessian(lag,opti.x))
% Constraint Jacobian
J = sol.value(jacobian(opti.g,opti.x))
% Nullspace of J
Z = null(full(J));
% Reduced Hessian
Hr = Z'*H*Z % full rank
ans =
0.0000e+000
ans =
0.0000e+000
ans =
0.0000e+000
H =
(1,1) 2.0000e+000
J =
-1.0000e+000 1.0000e+000
Hr =
1.0000e+000
bIx:
[0, 0]
bIg:
[0]
H:
[[2, 00, -1],
[00, 0, 1],
[-0, 0, -1]]
Singularity detected: Rank 2<3
First singular R entry: 0<1e-12, corresponding to row 1
Linear combination of columns:
[-0, 1, 0]
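For comparison, the textbook primal-dual KKT matrix [H J'; J 0] of this toy problem can be written down by hand (a plain-numpy check, independent of CasADi). Analytically it has full rank, which suggests the singularity reported above comes from how the internal matrix is assembled rather than from the problem itself.
import numpy as np

# Lagrangian Hessian wrt (x, y) and Jacobian of g = y - x at the solution
H = np.array([[2.0, 0.0],
              [0.0, 0.0]])
J = np.array([[-1.0, 1.0]])

# Textbook primal-dual KKT matrix [H J'; J 0]
KKT = np.block([[H, J.T],
                [J, np.zeros((1, 1))]])
print(np.linalg.matrix_rank(KKT))   # 3 (full rank)
print(np.linalg.det(KKT))           # -2.0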
I'm running into what I believe is the same problem. Minimal example in Python (casadi 3.5.3):
from casadi import *
x = MX.sym('x')
y = MX.sym('y')
nlp = {'x' : x,
       'p' : y,
       'f' : x*y}
solver = nlpsol('opt', 'ipopt', nlp)
solution = solver(x0=0, p=y)
jac = Function('j', [y], [jacobian(solution['x'], y)])
jac(0)
This gives:
RuntimeError: .../casadi/core/function_internal.hpp:1229: Evaluation failed
Hi,
I am running into the same issue as @jgillis, running the example on CasADi 3.6.2.
In principle it should be sufficient that the reduced Hessian is full-rank.
I also tried adding the options {"qpsol": 'qrqp', 'convexify_strategy': 'regularize'} to the solver.
I would also be interested in how to get the additional information shown in the last part of Joris' post (bIx: ...).
If anyone has an idea how to mitigate this, I would highly appreciate it!
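One possible direction (a sketch based on standard NLP sensitivity theory, not the derivative path nlpsol uses internally): differentiate the KKT conditions of the small equality-constrained example by hand via the implicit function theorem, and evaluate the resulting linear solve at the primal-dual solution returned by the solver.
import casadi as ca

# Sensitivity of  min_{x,y} (x - p)^2  s.t.  y - x == 0  via the implicit
# function theorem applied to the KKT conditions (equality constraints only).
p = ca.MX.sym("p")
z = ca.MX.sym("z", 2)            # z = [x; y]
lam = ca.MX.sym("lam")           # multiplier of y - x == 0

f = (z[0] - p)**2
g = z[1] - z[0]
lagr = f + lam * g

# KKT residual: stationarity of the Lagrangian plus primal feasibility
kkt = ca.vertcat(ca.gradient(lagr, z), g)

# d(kkt)/d(z, lam) and d(kkt)/dp; the sensitivity is a single linear solve
kkt_w = ca.horzcat(ca.jacobian(kkt, z), ca.jacobian(kkt, lam))
sens = -ca.solve(kkt_w, ca.jacobian(kkt, p))
sens_fun = ca.Function("sens", [z, lam, p], [sens])

# Evaluate at the primal-dual solution x* = y* = 0, lam* = 0 for p = 0;
# the first two entries are dx*/dp and dy*/dp (both 1 for this problem).
print(sens_fun([0, 0], 0, 0))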
Moreover, I looked into the example of @Humhu with slight modification, as below. I think the issue here is different and should maybe be tracked separately.
import casadi as ca

def sens_humhu():
    x = ca.MX.sym('x')
    p = ca.MX.sym('p')
    nlp = {'x' : x,
           'p' : p,
           'f' : (x-p)**2 + 1e-3*p}
    # opts = {'ipopt' : {'print_level' : 5, 'fixed_variable_treatment' : 'make_constraint'}}
    # solver = ca.nlpsol('opt', 'ipopt', nlp, opts)
    solver = ca.nlpsol('opt', 'sqpmethod', nlp)
    solution = solver(x0=0, p=p, lbx=0.0, ubx=1.0)
    jac_x = ca.Function('j', [p], [ca.jacobian(solution['x'], p)])
    jac_x_val = jac_x(0.5)
    jac_f = ca.Function('j', [p], [ca.jacobian(solution['f'], p)])
    jac_f_val = jac_f(0.5)
    print(f"{jac_x_val=}")
    print(f"{jac_f_val=}")
With SQP this returns the correct values:
jac_x_val=DM(1)
jac_f_val=DM(0.001)
while when using IPOPT, I get:
jac_x_val=DM(0)
jac_f_val=DM(0.001)
i.e. correct values for cost sensitivity and wrong values for solution sensitivity. Is this expected? Would installing SIPOPT fix this?
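For what it's worth, the closed-form solution of this bound-constrained toy problem can be checked directly (plain numpy, nothing CasADi-specific): at p = 0.5 the bounds are inactive, so dx*/dp = 1 and df*/dp = 1e-3, i.e. the sqpmethod values above.
import numpy as np

# Closed-form check: x*(p) = clip(p, 0, 1), so the bounds are inactive at
# p = 0.5 and dx*/dp = 1; along the solution f*(p) = 1e-3 * p, so df*/dp = 1e-3.
def x_star(p):
    return float(np.clip(p, 0.0, 1.0))

def f_star(p):
    return (x_star(p) - p)**2 + 1e-3 * p

eps = 1e-6
p0 = 0.5
print((x_star(p0 + eps) - x_star(p0 - eps)) / (2 * eps))  # ~1.0
print((f_star(p0 + eps) - f_star(p0 - eps)) / (2 * eps))  # ~0.001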
No idea what is actually going wrong here. Not sure if there is any way to find out without diving into the code. Maybe you can solve this with SIPOPT, but probably not via CasADi in that case. I'm currently not planning to look at NLP sensitivities for the upcoming release, so it's for CasADi 3.8 at the earliest for me, unless something changes.
Hi
I also tried to use sensitivities to "transform" an equality-constrained problem into an unconstrained one, because although its Jacobian is sparse, its Hessian isn't, yielding an enormous overhead in the coloring step (see #3424).
I encountered the same problems as reported above, but made one additional observation. I basically run the following:
constraints = @(x1,x2) ...
x0_inner = ...
x0_outer = ...
x_inner = casadi.MX.sym('x_inner', length(x0_inner), 1);
x_outer = casadi.MX.sym('x_outer', length(x0_outer), 1);
nlp_inner = struct('x', x_inner, 'p', x_outer, 'g', constraints(x_inner, x_outer));
S_inner = casadi.nlpsol('S_inner', 'sqpmethod', nlp_inner);
output_inner = S_inner('x0', x0_inner, 'p', x_outer, 'lbg', 0, 'ubg', 0);
output_inner_jac = casadi.Function('output_inner_jac',{x_outer},{jacobian(output_inner.x, x_outer)});
output_inner_jac(x0_outer)
Running this once will give me a Jacobian; running the final statement, i.e. output_inner_jac(x0_outer), again will result in an error:
Error using casadi.Function/call
Error in Function::call for 'output_inner_jac' [MXFunction] at .../casadi/core/function.cpp:330:
.../casadi/core/function_internal.hpp:1616: Evaluation failed
For any other solver that I tried instead of sqpmethod (in particular KNITRO and IPOPT), it will fail immediately.
Maybe this gives a direction for locating the problem, @jaeandersson?
Update: the behaviour described below is independent of the call to casadi.jacobian; rather, it happens even if I directly call my solver function object S_inner (but it affects only sqpmethod and qpOASES).
One more observation: calling the Jacobian function a second time will actually result in the solver being called again and outputting its message. However, what I find surprising is that qpOASES seems to carry its state across the calls:
>> output_inner_jac(x0_outer)
qpOASES -- An Implementation of the Online Active Set Strategy.
Copyright (C) 2007-2015 by Hans Joachim Ferreau, Andreas Potschka,
Christian Kirches et al. All rights reserved.
...
This is casadi::Sqpmethod.
Using exact Hessian
Number of variables: 180
Number of constraints: 180
Number of nonzeros in constraint Jacobian: 1062
Number of nonzeros in Lagrangian Hessian: 536
iter objective inf_pr inf_du ||d|| lg(rg) ls info
+ 0 0.000000e+00 4.26e-01 0.00e+00 0.00e+00 - 0 -
!#################### qpOASES -- QP NO. 1 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 4.163958e-07 | ADD CON 0 | 179 | 1
1 | 6.625230e-10 | ADD CON 1 | 178 | 2
2 | 1.314887e-09 | ADD CON 2 | 177 | 3
...
!#################### qpOASES -- QP NO. 7 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 1.000000e+00 | QP SOLVED | 0 | 180
7 0.000000e+00 5.23e-10 1.07e-20 3.07e-04 - 1 -
+MESSAGE(sqpmethod): Convergence achieved after 7 iterations
S_inner : t_proc (avg) t_wall (avg) n_eval
QP | 173.02ms ( 24.72ms) 86.07ms ( 12.30ms) 7
...
total | 24.72ms ( 24.72ms) 10.46ms ( 10.46ms) 1
ans =
[[8.37594e-17, 0, -3.69656e-17, 0],
[-0.0205365, 0, -0.034595, 0],
[-0.041071, 0, -0.0677863, 0],
[-0.0616019, 0, -0.0995744, 0],
[-0.0821272, 0, -0.12996, 0],
[-0.102645, 0, -0.158943, 0],
[-0.123151, 0, -0.186526, 0],
...
>> output_inner_jac(x0_outer)
iter objective inf_pr inf_du ||d|| lg(rg) ls info
+ 0 0.000000e+00 4.26e-01 0.00e+00 0.00e+00 - 0 -
!#################### qpOASES -- QP NO. 8 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 1.000000e+00 | LP SOLVED | 0 | 180
1 0.000000e+00 5.63e+00 2.94e-35 1.71e+01 - 3F -
...
!#################### qpOASES -- QP NO. 14 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 1.000000e+00 | LP SOLVED | 0 | 180
7 0.000000e+00 5.22e-10 4.16e-126 3.07e-04 - 1 -
+MESSAGE(sqpmethod): Convergence achieved after 7 iterations
...
total | 20.25ms ( 20.25ms) 8.74ms ( 8.74ms) 1
Error using casadi.Function/call
Error in Function::call for 'output_inner_jac' [MXFunction] at .../casadi/core/function.cpp:330:
.../casadi/core/function_internal.hpp:1616: Evaluation failed
In particular, the solver takes a slightly different trajectory in the second run (the one that, although converging, still ends up with an error), even though it starts from the same values. Also note that the "welcome message" of neither solver is shown again for the second call (these lines were not deleted from the output).