Compiler computes wrong constant number
Example code:
open Js
let min = -2147483648
let max = 2147483647
let value = 1000000000
Console.log(min->Int.toFloat -. value->Int.toFloat)
Console.log(max->Int.toFloat +. value->Int.toFloat)
Output JavaScript:
console.log(1147483648);
console.log(-1147483649);
var min = -2147483648;
var max = 2147483647;
var value = 1000000000;
export {
  min ,
  max ,
  value ,
}
It seems that the compiler computes these numbers as int32.
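For reference, a minimal sketch with plain int arithmetic (no conversion to float) shows that 32-bit wraparound produces exactly these numbers; the expected float results would be -3147483648 and 3147483647:
let min = -2147483648
let max = 2147483647
let value = 1000000000
// ReScript ints are 32-bit, so these wrap around:
Js.log(min - value) // prints 1147483648, i.e. -3147483648 + 2^32
Js.log(max + value) // prints -1147483649, i.e. 3147483647 - 2^32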
Ints are actually considered 32-bit according to the language manual: https://rescript-lang.org/docs/manual/latest/primitive-types#integers
And as the manual says, if you need 64-bit precision, use float instead.
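A quick sketch of that advice, using float literals so the arithmetic happens in 64-bit doubles at runtime:
let min = -2147483648.
let max = 2147483647.
Js.log(min -. 1000000000.) // prints -3147483648
Js.log(max +. 1000000000.) // prints 3147483647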
@glennsl But the example code has already converted the int32 numbers to float. The compiler still treats these float values as int32.
Hmm, indeed. I was thinking the conversion itself, being an int32 operation, would truncate the numbers, but at that point they are of course still valid int32 values.
It's also interesting that casting (i.e. asserting that it's a different type) yields the same result:
external toFloat: int => float = "%identity"
let min = -2147483648->toFloat
let value = 1000000000->toFloat
Js.log(min -. value)
console.log(1147483648);
var min = -2147483648;
var value = 1000000000;
while using float literals results in the constant expression being left unevaluated:
let min = -2147483648.
let value = 1000000000.
Js.log(min -. value)
console.log(-2147483648 - 1000000000);
var min = -2147483648;
var value = 1000000000;
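(Note that in this case the result is at least correct at runtime: JavaScript evaluates -2147483648 - 1000000000 in double precision, so it prints -3147483648 as expected.)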
Definitely something funky going on.
Possibly inlining? Try replacing the constants with variables and see if the problem persists.
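A minimal sketch of that experiment, assuming the folding only fires on operands the compiler can see as constants; refs keep the values out of its sight:
external toFloat: int => float = "%identity"
// Read the operands through refs so the constant folder cannot treat them as literals.
let min = ref(-2147483648)
let value = ref(1000000000)
Js.log(min.contents->toFloat -. value.contents->toFloat)
// If this prints -3147483648, the bug is in constant folding/inlining,
// not in the runtime arithmetic.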