5.0.0-beta.1 - setting Settings.init().setLCG(true) JUnit case doesn't work
I'm trying to use 5.0.0-beta.1.
If I set Settings.init().setLCG(true) on the model here:
- https://github.com/axkr/symja_android_library/blob/fadbe4c04dfe1e3ea8ba3bfc5d97b5663b2c702e/symja_android_library/matheclipse-core/src/main/java/org/matheclipse/core/convert/ChocoConvert.java#L164
the following JUnit test case no longer passes:
- https://github.com/axkr/symja_android_library/blob/fadbe4c04dfe1e3ea8ba3bfc5d97b5663b2c702e/symja_android_library/matheclipse-core/src/test/java/org/matheclipse/core/system/SolveTest.java#L1592
```java
@Test
public void testIssue685() {
  check("Solve({a*b + c == 2020, a + b*c == 2021},{a,b,c},Integers)", //
      "{{a->673,b->2,c->674},{a->2021,b->0,c->2020}}");
}
```
Is there a way to decide when to use Settings.init().setLCG(true)?
Can you tell me what the real domain of a, b, c is when set to Integers?
```java
intVar = model.intVar(//
    expr.toString(), //
    CHOCO_MIN_VALUE, //
    CHOCO_MAX_VALUE);
```
with
```java
final public static short CHOCO_MIN_VALUE = Short.MIN_VALUE / 2;
final public static short CHOCO_MAX_VALUE = Short.MAX_VALUE / 2;
```
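So, with those constants, every variable gets the domain [-16384, 16383]. A minimal check of the arithmetic (mirroring only the two constants quoted above):

```java
public class DomainBounds {
    // Mirrors the constants quoted from ChocoConvert above.
    // Short.MIN_VALUE = -32768, Short.MAX_VALUE = 32767; integer division truncates.
    static final short CHOCO_MIN_VALUE = Short.MIN_VALUE / 2; // -16384
    static final short CHOCO_MAX_VALUE = Short.MAX_VALUE / 2; //  16383

    public static void main(String[] args) {
        System.out.println(CHOCO_MIN_VALUE); // -16384
        System.out.println(CHOCO_MAX_VALUE); // 16383
    }
}
```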
If I convert your model to a pure Choco one, I get:
```java
@Test(groups = "lcg", timeOut = 60000)
public void test685() {
    // {a*b + c == 2020, a + b*c == 2021}
    Model model = new Model(Settings.init().setLCG(true));
    IntVar a = model.intVar("a", Short.MIN_VALUE / 2, Short.MAX_VALUE / 2);
    IntVar b = model.intVar("b", Short.MIN_VALUE / 2, Short.MAX_VALUE / 2);
    IntVar c = model.intVar("c", Short.MIN_VALUE / 2, Short.MAX_VALUE / 2);
    a.mul(b).add(c).eq(2020).post();
    a.add(b.mul(c)).eq(2021).post();
    Solver solver = model.getSolver();
    solver.showShortStatistics();
    while (solver.solve()) {
        System.out.printf("a=%d b=%d c=%d%n", a.getValue(), b.getValue(), c.getValue());
    }
    Assert.assertEquals(solver.getSolutionCount(), 2);
}
```
I ran the tests on the current dev branch, namely 5.0.0-beta.2.
When I set LCG to false, I got the following output:
```
Model[Model-0], 1 Solutions, Resolution time 0,257s, 28731 Nodes (111 616,7 n/s), 28730 Backtracks, 0 Backjumps, 28729 Fails, 0 Restarts, 207390 Propagations
a=2021 b=0 c=2020
Model[Model-0], 2 Solutions, Resolution time 0,281s, 30079 Nodes (107 183,1 n/s), 30077 Backtracks, 0 Backjumps, 30075 Fails, 0 Restarts, 275960 Propagations
a=673 b=2 c=674
Model[Model-0], 2 Solutions, Resolution time 0,296s, 31762 Nodes (107 195,0 n/s), 63521 Backtracks, 0 Backjumps, 31759 Fails, 0 Restarts, 300940 Propagations
```
When I set it to true, I got:
```
Model[Model-0], 1 Solutions, Resolution time 39,494s, 33533 Nodes (849,1 n/s), 33531 Backtracks, 0 Backjumps, 33531 Fails, 0 Restarts, 621278 Propagations
a=673 b=2 c=674
Model[Model-0], 2 Solutions, Resolution time 39,532s, 33556 Nodes (848,8 n/s), 33553 Backtracks, 0 Backjumps, 33552 Fails, 0 Restarts, 621643 Propagations
a=2021 b=0 c=2020
Model[Model-0], 2 Solutions, Resolution time 40,191s, 33557 Nodes (834,9 n/s), 33556 Backtracks, 0 Backjumps, 33554 Fails, 0 Restarts, 744676 Propagations
```
Except for the time required, which is related to the underlying MiniSat propagation, the solutions are found. So I suppose the test fails due to the resolution time?
Well, it seems that this simple problem is really not well suited to LCG: I tried with Chuffed and OR-Tools CP-SAT (with a MiniZinc model), and both of them take ages to find one solution.
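For reference, both reported solutions can be verified arithmetically against the original system without any solver (plain Java, `satisfies` is just an illustrative helper):

```java
public class CheckSolutions {
    // Checks a candidate (a, b, c) against {a*b + c == 2020, a + b*c == 2021}
    static boolean satisfies(int a, int b, int c) {
        return a * b + c == 2020 && a + b * c == 2021;
    }

    public static void main(String[] args) {
        // 673*2 + 674 = 2020 and 673 + 2*674 = 2021
        System.out.println(satisfies(673, 2, 674));   // true
        // 2021*0 + 2020 = 2020 and 2021 + 0*2020 = 2021
        System.out.println(satisfies(2021, 0, 2020)); // true
    }
}
```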