safe-control-gym
Failed to run create_fig6.sh
Hi, I tried to run create_fig6.sh, but it failed. Could you help me with this problem? Thanks for your help!
Error output:
pybullet build time: May 20 2022 19:44:17
UserWarning: WARN: Box bound precision lowered by casting to float32
logger.warn(
Traceback (most recent call last):
File "./gp_mpc_experiment.py", line 197, in
info
@1=vertcat(x, x_dot, z, z_dot, theta, theta_dot), @2=vertcat(T1, T2), (mac(mac((0.5*(@1-Xr)'),Q,zeros(1x6)),(@1-Xr),0)+mac(mac((0.5*(@2-Ur)'),R,zeros(1x2)),(@2-Ur),0))
@1=vertcat(x, x_dot, z, z_dot, theta, theta_dot), @2=vertcat(T1, T2), (mac(mac((0.5*(@1-Xr)'),Q,zeros(1x6)),(@1-Xr),0)+mac(mac((0.5*(@2-Ur)'),R,zeros(1x2)),(@2-Ur),0))
@1=vertcat(x, x_dot, z, z_dot, theta, theta_dot), @2=vertcat(T1, T2), (mac(mac((0.5*(@1-Xr)'),Q,zeros(1x6)),(@1-Xr),0)+mac(mac((0.5*(@2-Ur)'),R,zeros(1x2)),(@2-Ur),0))
@1=vertcat(x, x_dot, z, z_dot, theta, theta_dot), @2=vertcat(T1, T2), (mac(mac((0.5*(@1-Xr)'),Q,zeros(1x6)),(@1-Xr),0)+mac(mac((0.5*(@2-Ur)'),R,zeros(1x2)),(@2-Ur),0))
@1=vertcat(x, x_dot, z, z_dot, theta, theta_dot), @2=vertcat(T1, T2), (mac(mac((0.5*(@1-Xr)'),Q,zeros(1x6)),(@1-Xr),0)+mac(mac((0.5*(@2-Ur)'),R,zeros(1x2)),(@2-Ur),0))
Init State: [-1. 0. 0. 0. -0. 0.]
iter objective inf_pr inf_du ||d|| lg(rg) ls
0 5.504332e+01 1.00e+00 1.00e+01 0.00e+00 - 0
#################### qpOASES -- QP NO. 1 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 6.164476e-08 | REM BND 62 | 85 | 0
1 | 2.208861e-11 | REM BND 60 | 84 | 0
2 | 4.431082e-11 | REM BND 56 | 83 | 0
3 | 2.233967e-11 | REM BND 54 | 82 | 0
4 | 4.481644e-11 | REM BND 50 | 81 | 0
5 | 2.259504e-11 | REM BND 48 | 80 | 0
6 | 4.533077e-11 | REM BND 44 | 79 | 0
7 | 2.285482e-11 | REM BND 42 | 78 | 0
8 | 4.585399e-11 | REM BND 38 | 77 | 0
9 | 2.311911e-11 | REM BND 36 | 76 | 0
10 | 4.638633e-11 | REM BND 32 | 75 | 0
11 | 2.338800e-11 | REM BND 30 | 74 | 0
12 | 4.692799e-11 | REM BND 26 | 73 | 0
13 | 2.366162e-11 | REM BND 24 | 72 | 0
14 | 4.747919e-11 | REM BND 20 | 71 | 0
15 | 2.394007e-11 | REM BND 18 | 70 | 0
16 | 4.804016e-11 | REM BND 14 | 69 | 0
17 | 2.422347e-11 | REM BND 12 | 68 | 0
18 | 4.861113e-11 | REM BND 8 | 67 | 0
19 | 2.451194e-11 | REM BND 6 | 66 | 0
20 | 4.919233e-11 | REM BND 2 | 65 | 0
21 | 2.480558e-11 | REM BND 0 | 64 | 0
22 | 9.365151e-09 | ADD CON 3 | 63 | 1
23 | 1.295850e-09 | ADD CON 5 | 62 | 2
24 | 7.426724e-10 | ADD CON 26 | 61 | 3
25 | 1.332632e-09 | ADD CON 28 | 60 | 4
26 | 6.982862e-10 | ADD CON 49 | 59 | 5
27 | 1.369280e-09 | ADD CON 51 | 58 | 6
28 | 6.540761e-10 | ADD CON 72 | 57 | 7
29 | 1.405795e-09 | ADD CON 74 | 56 | 8
30 | 6.100412e-10 | ADD CON 95 | 55 | 9
31 | 1.442178e-09 | ADD CON 97 | 54 | 10
32 | 5.661805e-10 | ADD CON 118 | 53 | 11
33 | 1.478429e-09 | ADD CON 120 | 52 | 12
34 | 5.224929e-10 | ADD CON 141 | 51 | 13
35 | 1.514549e-09 | ADD CON 143 | 50 | 14
36 | 4.789775e-10 | ADD CON 164 | 49 | 15
37 | 1.550540e-09 | ADD CON 166 | 48 | 16
38 | 4.356331e-10 | ADD CON 187 | 47 | 17
39 | 1.586400e-09 | ADD CON 189 | 46 | 18
40 | 3.924588e-10 | ADD CON 210 | 45 | 19
41 | 1.622132e-09 | ADD CON 212 | 44 | 20
42 | 2.688693e-07 | ADD CON 1 | 44 | 20
43 | 9.136291e-10 | REM BND 3 | 43 | 20
44 | 7.837492e-09 | ADD CON 24 | 43 | 20
45 | 3.278552e-09 | ADD CON 5 | 42 | 21
46 | 5.253162e-09 | REM BND 4 | 41 | 21
47 | 1.157731e-10 | ADD CON 47 | 41 | 21
48 | 3.425317e-09 | ADD CON 10 | 41 | 22
49 | 2.471019e-10 | ADD CON 28 | 40 | 23
50 | 4.873264e-09 | ADD CON 70 | 40 | 23
51 | 4.061019e-09 | ADD CON 51 | 39 | 24
52 | 4.384648e-09 | ADD CON 93 | 39 | 24
53 | 4.444456e-09 | ADD CON 74 | 38 | 25
54 | 3.902940e-09 | ADD CON 116 | 38 | 25
55 | 4.822829e-09 | ADD CON 97 | 37 | 26
56 | 3.428001e-09 | ADD CON 139 | 37 | 26
57 | 5.196239e-09 | ADD CON 120 | 36 | 27
58 | 2.959692e-09 | ADD CON 162 | 36 | 27
59 | 5.564779e-09 | ADD CON 143 | 35 | 28
60 | 2.497879e-09 | ADD CON 185 | 35 | 28
61 | 5.928545e-09 | ADD CON 166 | 34 | 29
62 | 2.042433e-09 | ADD CON 208 | 34 | 29
63 | 6.287626e-09 | ADD CON 189 | 33 | 30
64 | 8.235337e-09 | ADD CON 212 | 32 | 31
65 | 1.275344e-08 | ADD CON 234 | 31 | 32
66 | 3.401954e-08 | ADD CON 248 | 30 | 33
67 | 6.054781e-09 | REM BND 7 | 29 | 33
68 | 1.945595e-08 | ADD CON 33 | 29 | 34
69 | 1.874176e-11 | ADD CON 244 | 28 | 35
70 | 1.418504e-08 | ADD CON 56 | 28 | 35
71 | 1.368721e-08 | ADD CON 5 | 28 | 35
72 | 7.644667e-09 | REM BND 19 | 27 | 35
73 | 2.055939e-08 | ADD CON 79 | 27 | 36
74 | 9.512611e-09 | REM BND 25 | 26 | 36
75 | 2.006088e-08 | REM BND 59 | 25 | 36
76 | 2.039106e-10 | REM BND 53 | 24 | 36
77 | 2.061283e-10 | REM BND 47 | 23 | 36
78 | 2.084984e-10 | REM BND 41 | 22 | 36
79 | 4.756914e-10 | REM BND 35 | 21 | 36
80 | 7.405209e-12 | ADD CON 102 | 21 | 37
81 | 7.049732e-10 | REM BND 31 | 20 | 37
82 | 7.529600e-10 | REM BND 29 | 19 | 37
83 | 2.263203e-09 | REM BND 23 | 18 | 37
84 | 4.633543e-10 | REM BND 65 | 17 | 37
85 | 2.631833e-09 | REM BND 17 | 16 | 37
86 | 8.347436e-10 | REM BND 57 | 15 | 37
87 | 6.632452e-10 | REM BND 51 | 14 | 37
88 | 6.761337e-10 | REM BND 45 | 13 | 37
89 | 6.840066e-10 | REM BND 39 | 12 | 37
90 | 6.919302e-10 | REM BND 33 | 11 | 37
91 | 2.228695e-10 | REM BND 37 | 10 | 37
92 | 1.696806e-10 | REM BND 11 | 9 | 37
93 | 3.074410e-10 | REM BND 27 | 8 | 37
94 | 6.170530e-11 | REM BND 9 | 7 | 37
95 | 6.464885e-10 | REM BND 21 | 6 | 37
96 | 7.056954e-10 | REM BND 15 | 5 | 37
97 | 2.602165e-09 | REM BND 43 | 4 | 37
98 | 1.368755e-09 | REM BND 49 | 3 | 37
99 | 3.208089e-11 | REM BND 55 | 2 | 37
100 | 2.300021e-08 | ADD CON 217 | 2 | 38
101 | 1.430245e-08 | ADD CON 194 | 2 | 39
102 | 1.323568e-08 | ADD CON 171 | 2 | 40
103 | 9.688421e-09 | REM BND 61 | 1 | 40
104 | 1.785168e-07 | REM BND 63 | 0 | 40
105 | 3.645625e-03 | REM CON 102 | 0 | 39
106 | 1.325825e-03 | REM CON 171 | 0 | 38
107 | 3.210844e-03 | ADD CON 1 | 0 | 39
108 | 2.274270e-06 | REM CON 5 | 0 | 38
109 | 4.665971e-03 | ADD CON 5 | 0 | 39
110 | 1.806146e-04 | REM CON 79 | 0 | 38
111 | 5.432293e-03 | REM CON 194 | 0 | 37
112 | 2.756412e-03 | REM CON 56 | 0 | 36
113 | 1.435235e-02 | REM CON 244 | 0 | 35
114 | 3.873458e-05 | REM CON 33 | 0 | 34
115 | 1.585118e-02 | REM CON 217 | 0 | 33
116 | 3.384988e-02 | REM CON 10 | 0 | 32
117 | 6.457739e-02 | REM CON 234 | 0 | 31
118 | 3.665566e-01 | ADD CON 243 | 0 | 32
119 | 8.930384e-04 | ADD CON 245 | 0 | 33
120 | 1.990731e-01 | ADD CON 0 | 0 | 34
121 | 2.796725e-03 | ADD CON 2 | 0 | 35
122 | 1.556592e-01 | REM CON 1 | 0 | 34
123 | 2.386773e-02 | ADD CON 23 | 0 | 35
124 | 1.293852e-03 | ADD CON 25 | 0 | 36
125 | 6.727666e-02 | REM CON 3 | 0 | 35
126 | 2.210024e-02 | ADD CON 3 | 0 | 36
127 | 7.405092e-02 | ADD CON 1 | 0 | 37
128 | 6.743344e-04 | REM CON 24 | 0 | 36
129 | 3.415928e-02 | ADD CON 46 | 0 | 37
130 | 6.557692e-03 | ADD CON 48 | 0 | 38
131 | 4.429775e-02 | ADD CON 244 | 0 | 39
132 | 2.216825e-02 | REM CON 26 | 0 | 38
133 | 2.047346e-02 | ADD CON 26 | 0 | 39
134 | 4.311502e-02 | ADD CON 24 | 0 | 40
135 | 1.421877e-03 | ADD CON 246 | 0 | 41
136 | 4.428786e-03 | ADD CON 4 | 0 | 42
137 | 9.312933e-04 | REM CON 47 | 0 | 41
138 | 4.518884e-02 | ADD CON 69 | 0 | 42
139 | 2.702209e-02 | ADD CON 71 | 0 | 43
140 | 3.123203e-02 | REM CON 5 | 0 | 42
141 | 1.600407e-02 | ADD CON 247 | 0 | 43
142 | 2.728007e-03 | REM CON 1 | 0 | 42
143 | 1.208242e-02 | ADD CON 5 | 0 | 43
144 | 9.654815e-04 | ADD CON 47 | 0 | 44
145 | 2.614917e-03 | REM CON 49 | 0 | 43
146 | 1.949132e-02 | ADD CON 49 | 0 | 44
147 | 2.625171e-04 | REM CON 70 | 0 | 43
148 | 3.235717e-02 | ADD CON 27 | 0 | 44
149 | 2.076213e-02 | REM CON 28 | 0 | 43
150 | 2.160900e-02 | ADD CON 22 | 0 | 44
151 | 2.517275e-03 | ADD CON 92 | 0 | 45
152 | 8.208921e-03 | REM CON 51 | 0 | 44
153 | 4.518666e-03 | ADD CON 51 | 0 | 45
154 | 1.956731e-03 | REM CON 27 | 0 | 44
155 | 1.163759e-02 | ADD CON 28 | 0 | 45
156 | 1.573853e-02 | ADD CON 94 | 0 | 46
157 | 2.863122e-03 | ADD CON 1 | 0 | 47
158 | 1.010808e-03 | REM CON 28 | 0 | 46
159 | 2.057084e-03 | ADD CON 28 | 0 | 47
160 | 3.943013e-03 | REM CON 4 | 0 | 46
161 | 7.831652e-03 | ADD CON 27 | 0 | 47
162 | 2.751723e-03 | ADD CON 70 | 0 | 48
163 | 7.650100e-03 | REM CON 28 | 0 | 47
164 | 1.043970e-02 | ADD CON 28 | 0 | 48
165 | 7.271739e-03 | REM CON 93 | 0 | 47
166 | 8.297252e-03 | REM CON 72 | 0 | 46
167 | 1.656183e-02 | ADD CON 72 | 0 | 47
168 | 1.068573e-02 | REM CON 74 | 0 | 46
169 | 1.245084e-03 | REM CON 248 | 0 | 45
170 | 1.707998e-03 | ADD CON 74 | 0 | 46
171 | 4.896335e-02 | ADD CON 50 | 0 | 47
172 | 2.980662e-04 | ADD CON 115 | 0 | 48
173 | 3.021797e-02 | ADD CON 93 | 0 | 49
174 | 2.118919e-02 | ADD CON 117 | 0 | 50
175 | 5.656994e-03 | REM CON 97 | 0 | 49
176 | 1.682961e-03 | ADD CON 97 | 0 | 50
177 | 1.838983e-03 | REM CON 116 | 0 | 49
178 | 3.675879e-02 | REM CON 3 | 0 | 48
179 | 1.454542e-02 | REM CON 95 | 0 | 47
180 | 1.392441e-02 | ADD CON 73 | 0 | 48
181 | 1.629745e-03 | ADD CON 4 | 0 | 49
182 | 3.231978e-03 | ADD CON 95 | 0 | 50
183 | 1.109460e-03 | REM CON 24 | 0 | 49
184 | 4.744743e-02 | ADD CON 116 | 0 | 50
185 | 1.509485e-03 | ADD CON 21 | 0 | 51
186 | 2.168479e-03 | ADD CON 138 | 0 | 52
187 | 1.726914e-02 | REM CON 120 | 0 | 51
188 | 1.904547e-03 | ADD CON 120 | 0 | 52
189 | 1.394208e-02 | ADD CON 3 | 0 | 53
190 | 4.583378e-03 | ADD CON 44 | 0 | 54
191 | 9.410758e-04 | REM CON 139 | 0 | 53
192 | 1.928623e-02 | ADD CON 24 | 0 | 54
193 | 1.353634e-02 | ADD CON 140 | 0 | 55
194 | 9.828789e-03 | REM CON 28 | 0 | 54
195 | 1.317691e-02 | ADD CON 28 | 0 | 55
196 | 2.719782e-02 | REM CON 118 | 0 | 54
197 | 1.225778e-02 | REM CON 27 | 0 | 53
198 | 2.694436e-03 | ADD CON 118 | 0 | 54
199 | 1.948384e-02 | ADD CON 139 | 0 | 55
200 | 5.339655e-03 | ADD CON 161 | 0 | 56
201 | 1.664699e-02 | REM CON 143 | 0 | 55
202 | 1.740547e-03 | ADD CON 143 | 0 | 56
203 | 3.720334e-03 | ADD CON 96 | 0 | 57
204 | 8.161798e-03 | REM CON 162 | 0 | 56
205 | 2.804793e-05 | ADD CON 248 | 0 | 57
206 | 2.923284e-02 | REM CON 21 | 0 | 56
207 | 2.706891e-02 | ADD CON 163 | 0 | 57
208 | 3.023468e-02 | REM CON 47 | 0 | 56
209 | 2.624891e-04 | ADD CON 142 | 0 | 57
210 | 2.044818e-02 | ADD CON 162 | 0 | 58
211 | 4.625494e-03 | ADD CON 67 | 0 | 59
212 | 1.653917e-03 | REM CON 141 | 0 | 58
213 | 1.275297e-02 | ADD CON 141 | 0 | 59
214 | 2.127710e-03 | REM CON 166 | 0 | 58
215 | 1.410490e-03 | REM CON 26 | 0 | 57
216 | 1.076089e-03 | ADD CON 166 | 0 | 58
217 | 2.277180e-02 | ADD CON 184 | 0 | 59
218 | 1.002708e-02 | ADD CON 119 | 0 | 60
219 | 2.602664e-04 | REM CON 185 | 0 | 59
220 | 4.358732e-03 | ADD CON 47 | 0 | 60
221 | 1.057640e-02 | REM CON 142 | 0 | 59
222 | 8.234371e-03 | ADD CON 26 | 0 | 60
223 | 3.578679e-02 | ADD CON 165 | 0 | 61
224 | 1.482147e-02 | ADD CON 186 | 0 | 62
225 | 3.112816e-02 | ADD CON 185 | 0 | 63
226 | 1.257534e-02 | REM CON 189 | 0 | 62
227 | 2.139787e-03 | ADD CON 189 | 0 | 63
228 | 4.475155e-03 | REM CON 164 | 0 | 62
229 | 1.431492e-02 | ADD CON 164 | 0 | 63
230 | 8.729248e-03 | REM CON 212 | 0 | 62
231 | 4.190827e-03 | ADD CON 212 | 0 | 63
232 | 2.034285e-05 | REM CON 208 | 0 | 62
233 | 2.350469e-03 | ADD CON 142 | 0 | 63
234 | 7.640250e-03 | ADD CON 27 | 0 | 64
235 | 2.139692e-02 | REM CON 165 | 0 | 63
236 | 1.213446e-02 | ADD CON 188 | 0 | 64
237 | 3.237549e-03 | ADD CON 207 | 0 | 65
238 | 7.175201e-05 | REM CON 212 | 0 | 64
239 | 5.228848e-04 | ADD CON 212 | 0 | 65
240 | 2.850372e-02 | ADD CON 90 | 0 | 66
241 | 1.050228e-02 | ADD CON 209 | 0 | 67
242 | 1.963842e-02 | REM CON 49 | 0 | 66
243 | 5.686352e-03 | ADD CON 208 | 0 | 67
244 | 9.536259e-03 | REM CON 212 | 0 | 66
245 | 2.223390e-03 | ADD CON 212 | 0 | 67
246 | 2.255394e-02 | REM CON 187 | 0 | 66
247 | 2.337256e-03 | ADD CON 49 | 0 | 67
248 | 6.116239e-03 | ADD CON 211 | 0 | 68
249 | 4.492136e-03 | ADD CON 187 | 0 | 69
250 | 7.103404e-03 | REM CON 212 | 0 | 68
251 | 8.166099e-04 | REM CON 51 | 0 | 67
252 | 9.079063e-03 | ADD CON 212 | 0 | 68
253 | 4.580307e-03 | ADD CON 51 | 0 | 69
254 | 1.725081e-02 | ADD CON 165 | 0 | 70
255 | 1.899687e-02 | REM CON 212 | 0 | 69
256 | 1.156882e-02 | REM CON 70 | 0 | 68
257 | 5.545275e-03 | ADD CON 212 | 0 | 69
258 | 5.972036e-02 | ADD CON 70 | 0 | 70
259 | 5.822507e-06 | REM CON 188 | 0 | 69
260 | 2.286309e-02 | ADD CON 113 | 0 | 70
261 | 1.124471e-02 | REM CON 210 | 0 | 69
262 | 1.175204e-02 | ADD CON 210 | 0 | 70
263 | 1.112306e-02 | REM CON 212 | 0 | 69
264 | 5.567213e-03 | REM CON 72 | 0 | 68
265 | 1.438150e-02 | ADD CON 212 | 0 | 69
266 | 2.015333e-02 | ADD CON 72 | 0 | 70
267 | 7.296921e-02 | ADD CON 188 | 0 | 71
268 | 2.572062e-02 | REM CON 212 | 0 | 70
269 | 9.703857e-03 | ADD CON 212 | 0 | 71
270 | 3.954899e-02 | ADD CON 137 | 0 | 72
271 | 3.144703e-02 | ADD CON 136 | 0 | 73
272 | 1.407455e-02 | REM CON 95 | 0 | 72
273 | 2.932460e-02 | ADD CON 95 | 0 | 73
274 | 1.392843e-01 | ADD CON 160 | 0 | 74
275 | 5.081134e-02 | REM CON 118 | 0 | 73
276 | 2.616955e-02 | ADD CON 118 | 0 | 74
277 | 1.585479e-01 | ADD CON 183 | 0 | 75
278 | 5.625643e-02 | REM CON 141 | 0 | 74
279 | 1.659738e-02 | REM CON 74 | 0 | 73
280 | 8.647233e-03 | ADD CON 141 | 0 | 74
281 | 1.635669e-03 | ADD CON 74 | 0 | 75
282 | 2.306423e-01 | ADD CON 206 | 0 | 76
283 | 4.255722e-02 | ADD CON 159 | 0 | 77
284 | 1.398526e-02 | REM CON 164 | 0 | 76
285 | 2.496963e-02 | ADD CON 164 | 0 | 77
286 | 6.560276e-02 | REM CON 93 | 0 | 76
287 | 7.091343e-02 | ADD CON 93 | 0 | 77
288 | 2.770174e-01 | ADD CON 45 | 0 | 78
289 | 9.462614e-02 | REM CON 211 | 0 | 77
290 | 1.373792e-02 | REM CON 50 | 0 | 76
291 | 5.253172e-02 | REM CON 187 | 0 | 75
292 | 2.957858e-02 | ADD CON 187 | 0 | 76
293 | 1.129962e-01 | ADD CON 229 | 0 | 77
294 | 1.332607e-01 | ADD CON 50 | 0 | 78
295 | 4.814664e-02 | ADD CON 211 | 0 | 79
296 | 7.323369e-01 | REM CON 97 | 0 | 78
297 | 5.651252e-03 | ADD CON 97 | 0 | 79
298 | 1.000000e+00 | QP SOLVED | 0 | 79
1 9.681463e+01 6.66e-16 2.84e-14 1.25e+00 - 1
MESSAGE(sqpmethod): Convergence achieved after 1 iterations
solver : t_proc (avg) t_wall (avg) n_eval
QP | 61.97ms ( 61.97ms) 35.20ms ( 35.20ms) 1
linesearch | 15.00us ( 15.00us) 15.11us ( 15.11us) 1
nlp_fg | 9.00us ( 9.00us) 8.95us ( 8.95us) 1
nlp_grad | 15.00us ( 15.00us) 14.67us ( 14.67us) 1
nlp_hess_l | 23.00us ( 23.00us) 2.25us ( 2.25us) 1
nlp_jac_fg | 143.00us ( 71.50us) 22.93us ( 11.47us) 2
total | 62.53ms ( 62.53ms) 35.30ms ( 35.30ms) 1
0 -th step.
[0.19143743 0.19999999]
[-0.9990555 0.03623872 0.02441309 0.46774115 0.08898196 1.70557299]
-11.318308612350087
False
{'goal_reached': False, 'mse': 1.9498817124165235, 'constraint_values': array([-1.00094450e+00, -3.40282347e+38, -7.44130887e-02, -3.40282347e+38,
-1.57251177e+00, -3.40282347e+38, -2.99905550e+00, -3.40282347e+38,
-1.97558691e+00, -3.40282347e+38, -1.39454785e+00, -3.40282347e+38,
-1.91437431e-01, -1.99999990e-01, -8.56256923e-03, -9.99999999e-09]), 'constraint_violation': 0}
iter objective inf_pr inf_du ||d|| lg(rg) ls
0 9.682634e+01 1.71e+00 1.01e+01 0.00e+00 - 0
#################### qpOASES -- QP NO. 2 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 5.744546e-02 | REM CON 97 | 0 | 78
1 | 9.210616e-17 | ADD CON 97 | 0 | 79
2 | 6.806201e-02 | REM CON 159 | 0 | 78
3 | 5.996598e-04 | REM CON 45 | 0 | 77
4 | 8.439276e-02 | REM CON 50 | 0 | 76
5 | 2.748029e-01 | REM CON 229 | 0 | 75
6 | 5.355555e-01 | ADD CON 50 | 0 | 76
7 | 2.562778e-02 | REM CON 187 | 0 | 75
8 | 3.845621e-02 | REM CON 93 | 0 | 74
9 | 2.182160e-01 | ADD CON 93 | 0 | 75
10 | 2.271468e-01 | ADD CON 187 | 0 | 76
11 | 4.648568e-01 | REM CON 74 | 0 | 75
12 | 1.750064e-02 | ADD CON 74 | 0 | 76
13 | 4.351012e-01 | REM CON 211 | 0 | 75
14 | 5.412789e-02 | REM CON 248 | 0 | 74
15 | 3.168153e-01 | ADD CON 248 | 0 | 75
16 | 8.910957e-02 | ADD CON 211 | 0 | 76
17 | 4.478124e-01 | ADD CON 21 | 0 | 77
18 | 2.163341e-01 | REM CON 27 | 0 | 76
19 | 4.937291e-02 | ADD CON 27 | 0 | 77
20 | 2.086708e-01 | REM CON 22 | 0 | 76
21 | 8.416534e-01 | ADD CON 114 | 0 | 77
22 | 1.000000e+00 | QP SOLVED | 0 | 77
1 6.905100e+01 1.11e-15 2.84e-14 1.71e+00 - 1
MESSAGE(sqpmethod): Convergence achieved after 1 iterations
solver : t_proc (avg) t_wall (avg) n_eval
QP | 3.93ms ( 3.93ms) 3.94ms ( 3.94ms) 1
linesearch | 14.00us ( 14.00us) 13.88us ( 13.88us) 1
nlp_fg | 8.00us ( 8.00us) 8.09us ( 8.09us) 1
nlp_grad | 13.00us ( 13.00us) 13.07us ( 13.07us) 1
nlp_hess_l | 2.00us ( 2.00us) 1.67us ( 1.67us) 1
nlp_jac_fg | 20.00us ( 10.00us) 18.62us ( 9.31us) 2
total | 4.01ms ( 4.01ms) 4.01ms ( 4.01ms) 1
1 -th step.
[0.19999999 0.19420062]
[-0.98577947 0.24823515 0.09532049 0.92651825 0.19846765 0.53545231]
-9.574529339871601
False
{'goal_reached': False, 'mse': 1.7902061840546073, 'constraint_values': array([-1.01422053e+00, -3.40282347e+38, -1.45320487e-01, -3.40282347e+38,
-1.68199746e+00, -3.40282347e+38, -2.98577947e+00, -3.40282347e+38,
-1.90467951e+00, -3.40282347e+38, -1.28506215e+00, -3.40282347e+38,
-1.99999990e-01, -1.94200623e-01, -9.99999999e-09, -5.79937722e-03]), 'constraint_violation': 0}
iter objective inf_pr inf_du ||d|| lg(rg) ls
0 6.906276e+01 1.17e+00 9.99e+00 0.00e+00 - 0
#################### qpOASES -- QP NO. 3 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 9.327452e-02 | REM CON 51 | 0 | 76
1 | 2.896507e-17 | ADD CON 51 | 0 | 77
2 | 3.451148e-02 | REM CON 113 | 0 | 76
3 | 2.273130e-03 | REM CON 136 | 0 | 75
4 | 3.532828e-02 | REM CON 160 | 0 | 74
5 | 1.555761e-01 | REM CON 183 | 0 | 73
6 | 3.209650e-01 | REM CON 70 | 0 | 72
7 | 2.698751e-02 | REM CON 206 | 0 | 71
8 | 1.325484e-01 | REM CON 118 | 0 | 70
9 | 1.755746e-01 | REM CON 141 | 0 | 69
10 | 1.382022e-02 | ADD CON 160 | 0 | 70
11 | 2.054029e-01 | ADD CON 70 | 0 | 71
12 | 3.057243e-02 | ADD CON 141 | 0 | 72
13 | 1.234219e-02 | REM CON 164 | 0 | 71
14 | 1.551252e-02 | ADD CON 183 | 0 | 72
15 | 1.691767e-01 | ADD CON 164 | 0 | 73
16 | 3.000990e-02 | ADD CON 118 | 0 | 74
17 | 5.758738e-02 | REM CON 183 | 0 | 73
18 | 6.461810e-02 | REM CON 95 | 0 | 72
19 | 2.248397e-01 | REM CON 160 | 0 | 71
20 | 2.158103e-01 | ADD CON 87 | 0 | 72
21 | 3.755095e-02 | REM CON 248 | 0 | 71
22 | 1.270546e-02 | REM CON 28 | 0 | 70
23 | 7.672153e-03 | ADD CON 28 | 0 | 71
24 | 5.617051e-05 | REM CON 48 | 0 | 70
25 | 6.701472e-02 | ADD CON 64 | 0 | 71
26 | 6.479426e-03 | ADD CON 248 | 0 | 72
27 | 2.541856e-05 | REM CON 25 | 0 | 71
28 | 4.126423e-01 | ADD CON 41 | 0 | 72
29 | 1.057060e-05 | REM CON 246 | 0 | 71
30 | 2.068506e-02 | ADD CON 95 | 0 | 72
31 | 8.307346e-02 | REM CON 2 | 0 | 71
32 | 1.474649e-01 | ADD CON 25 | 0 | 72
33 | 4.303787e-03 | ADD CON 246 | 0 | 73
34 | 2.735864e-03 | REM CON 41 | 0 | 72
35 | 2.206961e-01 | ADD CON 2 | 0 | 73
36 | 3.416706e-05 | REM CON 21 | 0 | 72
37 | 5.667259e-02 | ADD CON 160 | 0 | 73
38 | 4.339598e-02 | ADD CON 48 | 0 | 74
39 | 9.273479e-03 | REM CON 245 | 0 | 73
40 | 2.893517e-01 | ADD CON 245 | 0 | 74
41 | 4.146509e-02 | REM CON 87 | 0 | 73
42 | 4.809294e-02 | ADD CON 113 | 0 | 74
43 | 3.672651e-02 | REM CON 118 | 0 | 73
44 | 2.199743e-02 | ADD CON 118 | 0 | 74
45 | 2.056227e-01 | ADD CON 183 | 0 | 75
46 | 1.585555e-01 | ADD CON 41 | 0 | 76
47 | 1.859697e-02 | REM CON 28 | 0 | 75
48 | 4.142917e-04 | ADD CON 28 | 0 | 76
49 | 5.718643e-03 | REM CON 25 | 0 | 75
50 | 1.387979e-02 | REM CON 141 | 0 | 74
51 | 9.780109e-03 | ADD CON 141 | 0 | 75
52 | 1.294041e-01 | ADD CON 25 | 0 | 76
53 | 2.049099e-02 | REM CON 64 | 0 | 75
54 | 9.626207e-02 | ADD CON 206 | 0 | 76
55 | 1.058965e-01 | REM CON 164 | 0 | 75
56 | 1.008382e-02 | ADD CON 164 | 0 | 76
57 | 4.951753e-01 | ADD CON 136 | 0 | 77
58 | 3.125920e-01 | REM CON 51 | 0 | 76
59 | 1.631619e-03 | ADD CON 51 | 0 | 77
60 | 8.536231e-02 | REM CON 187 | 0 | 76
61 | 1.270538e-02 | ADD CON 187 | 0 | 77
62 | 1.869176e-01 | ADD CON 229 | 0 | 78
63 | 5.755635e-01 | ADD CON 91 | 0 | 79
64 | 1.000000e+00 | QP SOLVED | 0 | 79
1 6.411921e+01 5.55e-15 1.14e-13 1.17e+00 - 1
MESSAGE(sqpmethod): Convergence achieved after 1 iterations
solver : t_proc (avg) t_wall (avg) n_eval
QP | 10.39ms ( 10.39ms) 10.69ms ( 10.69ms) 1
linesearch | 14.00us ( 14.00us) 14.25us ( 14.25us) 1
nlp_fg | 9.00us ( 9.00us) 8.36us ( 8.36us) 1
nlp_grad | 14.00us ( 14.00us) 14.57us ( 14.57us) 1
nlp_hess_l | 2.00us ( 2.00us) 1.67us ( 1.67us) 1
nlp_jac_fg | 18.00us ( 9.00us) 18.61us ( 9.31us) 2
total | 10.46ms ( 10.46ms) 10.77ms ( 10.77ms) 1
2 -th step.
[0.04439637 0.04017326]
[-0.95645846 0.33804883 0.15788519 0.34889909 0.25184205 0.53217667]
-8.411749986610035
False
{'goal_reached': False, 'mse': 1.62397014661692, 'constraint_values': array([-1.04354154e+00, -3.40282347e+38, -2.07885191e-01, -3.40282347e+38,
-1.73537186e+00, -3.40282347e+38, -2.95645846e+00, -3.40282347e+38,
-1.84211481e+00, -3.40282347e+38, -1.23168776e+00, -3.40282347e+38,
-4.43963687e-02, -4.01732595e-02, -1.55603631e-01, -1.59826741e-01]), 'constraint_violation': 0}
iter objective inf_pr inf_du ||d|| lg(rg) ls
0 6.412483e+01 9.80e-01 9.86e+00 0.00e+00 - 0
#################### qpOASES -- QP NO. 4 #####################
Iter | StepLength | Info | nFX | nAC
----------+------------------+------------------+---------+---------
0 | 1.942278e-01 | REM CON 136 | 0 | 78
1 | 1.068916e-01 | REM CON 4 | 0 | 77
2 | 5.034360e-15 | ADD CON 4 | 0 | 78
3 | 5.022337e-02 | REM CON 51 | 0 | 77
WARNING: Stepsize is 0.000000000000000e+00
4 | 0.000000e+00 | ADD CON 51 | 0 | 78
5 | 6.522466e-01 | REM CON 229 | 0 | 77
6 | 2.243194e-01 | REM CON 47 | 0 | 76
7 | 6.435181e-01 | REM CON 187 | 0 | 75
8 | 1.046850e-01 | ADD CON 47 | 0 | 76
9 | 4.217880e-01 | ADD CON 187 | 0 | 77
WARNING: Stepsize is 0.000000000000000e+00
WARNING: Stepsize is 0.000000000000000e+00
WARNING: Stepsize is 0.000000000000000e+00
WARNING: Stepsize is 0.000000000000000e+00
ERROR: Premature homotopy termination because QP is infeasible
Error in Opti::solve [OptiNode] at .../casadi/core/optistack.cpp:159:
Error in Function::operator() for 'qpsol' [QpoasesInterface] at .../casadi/core/function.cpp:1368:
.../casadi/core/conic.cpp:525: conic process failed. Set 'error_on_fail' option to false to ignore this error.
Infeasible MPC Problem
#########################################
Loading GP dimension 0
######################################### Path: ./trained_gp_model/best_model_0.pth Loaded! #########################################
Loading GP dimension 1
######################################### Path: ./trained_gp_model/best_model_1.pth Loaded! #########################################
Loading GP dimension 2
######################################### Path: ./trained_gp_model/best_model_2.pth Loaded! #########################################
Loading GP dimension 3
######################################### Path: ./trained_gp_model/best_model_3.pth Loaded! #########################################
Loading GP dimension 4
######################################### Path: ./trained_gp_model/best_model_4.pth Loaded! #########################################
Loading GP dimension 5
######################################### Path: ./trained_gp_model/best_model_5.pth Loaded! #########################################
Init State: [-1. 0. 0. 0. -0. 0.]
CasADi - 2022-09-13 16:26:47 MESSAGE("nlp::init") [.../casadi/core/function_internal.cpp:477]
CasADi - 2022-09-13 16:26:47 MESSAGE("nlp::init") [.../casadi/core/x_function.hpp:281]
CasADi - 2022-09-13 16:26:47 MESSAGE("nlp::init") [.../casadi/core/mx_function.cpp:102]
CasADi - 2022-09-13 16:26:47 MESSAGE("Using live variables: work array is 27 instead of 577") [.../casadi/core/mx_function.cpp:305]
CasADi - 2022-09-13 16:26:47 MESSAGE("solver::init") [.../casadi/core/function_internal.cpp:477]
CasADi - 2022-09-13 16:26:47 MESSAGE("solver::create_function nlp_grad:[x, p, lam:f, lam:g]->[f, g, grad:gamma:x, grad:gamma:p]") [.../casadi/core/oracle_function.cpp:132]
CasADi - 2022-09-13 16:26:48 MESSAGE("solver::create_function nlp_f:[x, p]->[f]") [.../casadi/core/oracle_function.cpp:132]
CasADi - 2022-09-13 16:26:48 MESSAGE("solver::create_function nlp_g:[x, p]->[g]") [.../casadi/core/oracle_function.cpp:132]
CasADi - 2022-09-13 16:26:48 MESSAGE("solver::create_function nlp_grad_f:[x, p]->[f, grad:f:x]") [.../casadi/core/oracle_function.cpp:132]
CasADi - 2022-09-13 16:26:48 MESSAGE("solver::create_function nlp_jac_g:[x, p]->[g, jac:g:x]") [.../casadi/core/oracle_function.cpp:132]
CasADi - 2022-09-13 16:26:48 MESSAGE("solver::create_function nlp_hess_l:[x, p, lam:f, lam:g]->[hess:gamma:x:x]") [.../casadi/core/oracle_function.cpp:132]
There are 86 variables and 335 constraints.
Using exact Hessian
Total number of variables............................: 86
                     variables with only lower bounds: 0
                variables with lower and upper bounds: 0
                     variables with only upper bounds: 0
Total number of equality constraints.................: 66
Total number of inequality constraints...............: 269
        inequality constraints with only lower bounds: 0
   inequality constraints with lower and upper bounds: 86
        inequality constraints with only upper bounds: 183
Number of Iterations....: 100
(scaled) (unscaled)
Objective...............:   1.4793434887362989e+02    1.4793434887362989e+02
Dual infeasibility......:   5.9408874591127736e+06    5.9408874591127736e+06
Constraint violation....:   1.7967820698854311e+00    2.4033257283054743e+00
Complementarity.........:   7.7994891085978524e-01    7.7994891085978524e-01
Overall NLP error.......:   2.9108033978286326e+03    5.9408874591127736e+06
Number of objective function evaluations             = 101
Number of objective gradient evaluations             = 101
Number of equality constraint evaluations            = 101
Number of inequality constraint evaluations          = 101
Number of equality constraint Jacobian evaluations   = 101
Number of inequality constraint Jacobian evaluations = 101
Number of Lagrangian Hessian evaluations             = 100
Total CPU secs in IPOPT (w/o function evaluations)   = 0.110
Total CPU secs in NLP function evaluations           = 0.988
EXIT: Maximum Number of Iterations Exceeded.
solver : t_proc (avg) t_wall (avg) n_eval
nlp_f | 362.00us ( 3.58us) 358.95us ( 3.55us) 101
nlp_g | 65.91ms (652.59us) 65.98ms (653.29us) 101
nlp_grad | 2.78ms ( 2.78ms) 2.78ms ( 2.78ms) 1
nlp_grad_f | 372.00us ( 3.65us) 374.08us ( 3.67us) 102
nlp_hess_l | 695.80ms ( 6.96ms) 696.06ms ( 6.96ms) 100
nlp_jac_g | 228.97ms ( 2.24ms) 229.22ms ( 2.25ms) 102
total | 1.11 s ( 1.11 s) 1.11 s ( 1.11 s) 1
GP Mean eq Contribution: [-3.88652854e-04 2.45557454e-02 2.85799483e-02 4.55388601e-01 -1.52168405e-03 5.16830598e-03]
True GP value: [-3.93645639e-04 2.33940861e-02 2.85505519e-02 4.66247972e-01 -1.40092526e-03 4.32021969e-03]
GP SELECT ACTION TIME: 3.138321131002158
0 -th step.
[0.24181908 0.23957112]
[-1.00030505 -0.01170715 0.04174303 0.80051446 -0.02337966 -0.44847958]
-10.0168287032323
True
{'goal_reached': False, 'mse': 1.9188666249301738, 'constraint_values': array([-9.99694948e-01, -3.40282347e+38, -9.17430266e-02, -3.40282347e+38, -1.46015014e+00, -3.40282347e+38, -3.00030505e+00, -3.40282347e+38, -1.95825697e+00, -3.40282347e+38, -1.50690947e+00, -3.40282347e+38, -2.41819078e-01, -2.39571116e-01, 4.18190776e-02, 3.95711161e-02]), 'constraint_violation': 1}
Hi,
Thanks for pointing this out. We've made some changes lately, and GPMPC looks like it's a little unstable in this example. Can you tell me what machine you are running this on (and I'm assuming the main branch)? Also, could you list the python packages you have installed and their versions by running something like
pip3 list or conda list
Thanks for your timely response! I am running on Ubuntu 18.04 and created the env using conda.
Here are the packages and their versions:
Name Version Build Channel
_libgcc_mutex 0.1 main https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
_openmp_mutex 5.1 1_gnu https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
absl-py 1.2.0 pypi_0 pypi
aiosignal 1.2.0 pypi_0 pypi
attrs 22.1.0 pypi_0 pypi
ca-certificates 2022.07.19 h06a4308_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
cachetools 5.2.0 pypi_0 pypi
casadi 3.5.5 pypi_0 pypi
certifi 2022.6.15 py38h06a4308_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
charset-normalizer 2.1.1 pypi_0 pypi
click 8.0.4 pypi_0 pypi
cloudpickle 2.2.0 pypi_0 pypi
cvxpy 1.2.1 pypi_0 pypi
cycler 0.11.0 pypi_0 pypi
dict-deep 4.1.2 pypi_0 pypi
distlib 0.3.6 pypi_0 pypi
ecos 2.0.10 pypi_0 pypi
filelock 3.8.0 pypi_0 pypi
fonttools 4.37.1 pypi_0 pypi
frozenlist 1.3.1 pypi_0 pypi
google-auth 2.11.0 pypi_0 pypi
google-auth-oauthlib 0.4.6 pypi_0 pypi
gpytorch 1.9.0 pypi_0 pypi
grpcio 1.43.0 pypi_0 pypi
gym 0.21.0 pypi_0 pypi
idna 3.3 pypi_0 pypi
imageio 2.21.3 pypi_0 pypi
importlib-metadata 4.12.0 pypi_0 pypi
importlib-resources 5.9.0 pypi_0 pypi
joblib 1.1.0 pypi_0 pypi
jsonschema 4.16.0 pypi_0 pypi
kiwisolver 1.4.4 pypi_0 pypi
ld_impl_linux-64 2.38 h1181459_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libffi 3.3 he6710b0_2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libgcc-ng 11.2.0 h1234567_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libgomp 11.2.0 h1234567_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libstdcxx-ng 11.2.0 h1234567_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
linear-operator 0.1.1 pypi_0 pypi
markdown 3.4.1 pypi_0 pypi
markupsafe 2.1.1 pypi_0 pypi
matplotlib 3.5.3 pypi_0 pypi
mosek 9.3.21 pypi_0 pypi
msgpack 1.0.4 pypi_0 pypi
munch 2.5.0 pypi_0 pypi
ncurses 6.3 h5eee18b_3 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
numpy 1.23.3 pypi_0 pypi
oauthlib 3.2.1 pypi_0 pypi
openssl 1.1.1q h7f8727e_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
osqp 0.6.2.post5 pypi_0 pypi
packaging 21.3 pypi_0 pypi
pandas 1.4.4 pypi_0 pypi
pillow 9.2.0 pypi_0 pypi
pip 22.2.2 pypi_0 pypi
pkgutil-resolve-name 1.3.10 pypi_0 pypi
platformdirs 2.5.2 pypi_0 pypi
protobuf 3.19.4 pypi_0 pypi
pyaml 21.10.1 pypi_0 pypi
pyasn1 0.4.8 pypi_0 pypi
pyasn1-modules 0.2.8 pypi_0 pypi
pybullet 3.2.5 pypi_0 pypi
pycddlib 2.1.6 pypi_0 pypi
pyparsing 3.0.9 pypi_0 pypi
pyrsistent 0.18.1 pypi_0 pypi
python 3.8.10 h12debd9_8 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
python-dateutil 2.8.2 pypi_0 pypi
pytope 0.0.4 pypi_0 pypi
pytz 2022.2.1 pypi_0 pypi
pyyaml 6.0 pypi_0 pypi
qdldl 0.1.5.post2 pypi_0 pypi
ray 1.13.0 pypi_0 pypi
readline 8.1.2 h7f8727e_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
requests 2.28.1 pypi_0 pypi
requests-oauthlib 1.3.1 pypi_0 pypi
rsa 4.9 pypi_0 pypi
safe-control-gym 0.5.0 pypi_0 pypi
scikit-learn 1.1.2 pypi_0 pypi
scikit-optimize 0.9.0 pypi_0 pypi
scipy 1.9.1 pypi_0 pypi
scs 3.2.0 pypi_0 pypi
setuptools 63.4.1 py38h06a4308_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
six 1.16.0 pypi_0 pypi
sqlite 3.39.2 h5082296_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
tensorboard 2.10.0 pypi_0 pypi
tensorboard-data-server 0.6.1 pypi_0 pypi
tensorboard-plugin-wit 1.8.1 pypi_0 pypi
termcolor 1.1.0 pypi_0 pypi
threadpoolctl 3.1.0 pypi_0 pypi
tk 8.6.12 h1ccaba5_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
torch 1.12.1 pypi_0 pypi
typing-extensions 4.3.0 pypi_0 pypi
urllib3 1.26.12 pypi_0 pypi
virtualenv 20.16.5 pypi_0 pypi
werkzeug 2.2.2 pypi_0 pypi
wheel 0.37.1 pyhd3eb1b0_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
xz 5.2.5 h7f8727e_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
zipp 3.8.1 pypi_0 pypi
zlib 1.2.12 h5eee18b_3 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
Thanks for that! So far I haven't been able to recreate your issue (I'm on Ubuntu 20.04), and create_fig6.sh runs fine for me on the main branch. However, we have seen differences in performance across machines before, simply due to torch and the different optimizers used by casadi. I'll do some more digging later in the week to see if I can recreate it. Just to confirm a couple of things:
- You are on the main branch, right?
- Did you run create_trained_gp_model.sh first? This isn't required but could be related to why you are getting the error.
- Did you change any of the yaml files in config_overrides?
- What python version are you using? Generally, everything is confirmed to work with python 3.8.1.
Thanks for your patience!
- my branch is up to date with 'origin/main'.
- I ran that script, but it still did not work.
- I did not modify the yaml files.
- I created the conda env with python 3.8.10, as described in the README.md (https://github.com/utiasDSL/safe-control-gym#option-a-recommended-using-conda)
Hey, so far I've tried on a couple of different machines but haven't been able to recreate your issue (so far it's working as expected on all the ones I've tried), but I haven't gotten my hands on an Ubuntu 18 machine yet. Just wanted to let you know I'm still working on it.
One thing you could try:
- Check out main again to make sure it's up to date with origin/main (in particular, make sure all the files in https://github.com/utiasDSL/safe-control-gym/tree/main/experiments/annual_reviews/figure6/trained_gp_model are the same as what's in the repo).
- Change this line in gp_mpc.py
https://github.com/utiasDSL/safe-control-gym/blob/5ebaa31581d70c4f2ef0f96d119fd82a0dfeadc2/safe_control_gym/controllers/mpc/gp_mpc.py#L547
to read "ipopt.max_iter": 10000,
- Rerun ./create_fig6.sh from the figure6 directory (do not run create_trained_gp_model.sh first).
Basically, the problem you are hitting is that gpmpc is reporting the MPC problem as infeasible. Usually, this has happened for me when the GP is poorly trained. However, it seems like the model files are okay, as they have worked well for me on other machines, so we are letting gp_mpc run for more iterations to see if that helps the optimization. It's possible the solver ipopt is using different binary files in the background, giving us different optimization performance (but I have to dig into this a little more).
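For reference, the suggested edit only raises the iteration cap that gp_mpc.py hands to IPOPT through CasADi's options dictionary. Here is a minimal, hypothetical sketch of how such an option is wired through CasADi's Opti stack; the decision variable and objective below are placeholders for illustration, not the actual GP-MPC problem from the repo.

import casadi as cs

# Build a tiny placeholder problem (NOT the GP-MPC formulation).
opti = cs.Opti()
x = opti.variable(6)               # placeholder decision variable
opti.minimize(cs.sumsqr(x - 1.0))  # placeholder quadratic objective

# Solver options forwarded to IPOPT; "ipopt.max_iter": 10000 is the suggested change.
opts = {
    "ipopt.max_iter": 10000,
    "ipopt.print_level": 0,
    "print_time": 0,
}
opti.solver("ipopt", opts)
sol = opti.solve()
print(sol.value(x))

If the run still ends with "Infeasible MPC Problem" after raising the limit, that would point back toward the GP model files or the constraint setup rather than the iteration cap.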
Closed for inactivity