`MUL_LTIME` returns different values depending on whether the stdlib is built with `debug` or `release`
**Describe the bug**
Given the following example:
{external}
FUNCTION printf : DINT
    VAR_INPUT {ref}
        format : STRING;
    END_VAR
    VAR_INPUT
        args : ...;
    END_VAR
END_FUNCTION

FUNCTION main : DINT
    VAR
        l : LTIME;
    END_VAR
    l := MUL_LTIME(LTIME#1d, SINT#-120); // expecting 8.64e+13 * -120 = -1.0368e+16
    printf('%lld$N', l);
END_FUNCTION
When compiled against the debug build of the stdlib, the program prints -10368000000000000, which is the expected value.
When recompiled against the release build of the stdlib, it prints 11750400000000000 instead.
It makes no difference whether the compiler itself is built in debug or release mode, nor whether the program is compiled with optimization level `none` or `default`.
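For what it is worth, the numbers themselves are suggestive: with LTIME treated as a signed 64-bit nanosecond count (as the comment in the example assumes), 86400000000000 × -120 = -10368000000000000, while the wrong release value equals 86400000000000 × 136, and 136 is exactly the bit pattern of SINT#-120 read as an unsigned byte. This is only an observation derived from the two printed values, not a confirmed root cause; a minimal Rust sketch of the arithmetic:

```rust
fn main() {
    // LTIME#1d as a nanosecond count, matching the 8.64e+13 in the ST comment.
    let one_day_ns: i64 = 86_400_000_000_000;
    let in2: i8 = -120; // SINT#-120

    // Sign-extended multiplication: the value the debug stdlib build prints.
    let sign_extended = one_day_ns * i64::from(in2);
    assert_eq!(sign_extended, -10_368_000_000_000_000);

    // The same bit pattern zero-extended instead (0x88 == 136): the value the
    // release stdlib build prints, which is why a lost sign extension of the
    // SINT argument looks plausible.
    let zero_extended = one_day_ns * i64::from(in2 as u8);
    assert_eq!(zero_extended, 11_750_400_000_000_000);

    println!("{sign_extended} vs {zero_extended}");
}
```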
**To Reproduce**
Steps to reproduce the behavior:
export LD_LIBRARY_PATH=<path_to_lib>
plc <filename.st> -liec61131std -L<path_to_lib> -i libs/stdlib/iec61131-st/*.st --linker=clang
./<filename.st>.out
plc <filename.st> -liec61131std_DBG -L<path_to_lib> -i libs/stdlib/iec61131-st/*.st --linker=clang
./<filename.st>.out
**Expected behavior**
Running either program should print the correct value, -10368000000000000.
Adding a `println!("{in2}");` to `date_time_numeric_functions::MUL_LTIME__SINT`, right before `checked_mul_time_with_signed_int(in1, in2.into())` is called, makes the release build print the correct value; removing it again makes the result revert to 11750400000000000. This seems to be reproducible, but I would welcome it if somebody else could confirm it. There is similar behaviour in other tests, as outlined in #1122.
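For reference, the debugging change described above looks roughly like the sketch below. The signature of `MUL_LTIME__SINT` and the body of `checked_mul_time_with_signed_int` are assumptions made so the snippet compiles on its own, not copies of the stdlib source; only the position of the inserted `println!` matters:

```rust
// Stand-in for the real stdlib helper, assumed to do a checked i64 multiply;
// it exists here only so the sketch is self-contained.
fn checked_mul_time_with_signed_int(in1: i64, in2: i64) -> i64 {
    in1.checked_mul(in2).unwrap_or_default()
}

// Assumed shape of the stdlib wrapper that MUL_LTIME(LTIME#1d, SINT#-120)
// ends up calling; the real one lives in date_time_numeric_functions.
#[allow(non_snake_case)]
fn MUL_LTIME__SINT(in1: i64, in2: i8) -> i64 {
    // The debugging println! goes right here, before the call. With it in
    // place the release build printed the expected value; without it,
    // 11750400000000000 again.
    println!("{in2}");
    checked_mul_time_with_signed_int(in1, in2.into())
}

fn main() {
    assert_eq!(
        MUL_LTIME__SINT(86_400_000_000_000, -120),
        -10_368_000_000_000_000
    );
}
```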