Integer conversion conventions
I opened this issue as suggested by this comment: https://github.com/boa-dev/boa/pull/1971#issuecomment-1077297915
As I've been going through the conversion code from Rust integer types to JsValue, I've come across some conversions that don't make sense to me. I'd like to open this for discussion.
My biggest concern is i128, u128, i64, and u64. I don't see why all of those shouldn't be converted into a JsValue::BigInt. A bigint in JavaScript is the only native number type that can maintain true integer semantics and hold all possible values of i128, u128, i64, and u64, thus why should we not always convert to a bigint?
The thing here is that this is a fairly complex and very subjective topic.
JavaScript has 2 numeric types:
- "number" ->
f64in Rust - "bigint" ->
BigIntin Rust
There is nothing else in JavaScript, so it's definitely very difficult for us to provide the conversions you mention. For example, an i32 makes sense to convert to "number", since it won't lose precision and can be stored on the stack. i128, on the other hand, cannot be converted to JavaScript the same way Rust uses it. Either it gets converted to f64, and therefore loses precision, or it gets converted to a bigint, which requires a heap allocation.
Why should Boa force any of the two options on Boa users?
We should provide a f64 to JsValue and a BigInt to JsValue conversion. These two don't fail and can be represented in a straightforward way; this can be done with the From / Into traits. Boa also has a special performance optimization that allows you to use an i32 as if it were a "number". You can already convert an arbitrary i32 to a "number":
```rust
let num = 1256_i32;
let js_num = JsValue::Integer(num);
```
For other types, we shouldn't force anything. Users can easily (even now) convert any arbitrary Rust type to a JsValue (by implementing the logic, of course), and they can take a JsValue and decide how to represent it in Rust. So, if I think i128 is best represented by the "number" JavaScript type, since it's stored on the stack the same way Rust stores it, I can do the following:
```rust
let num = 1256_i128;
let js_num = JsValue::Rational(num as f64);
```
If, on the other hand, I prefer the precision, I can do the following:
```rust
let num = 1256_i128;
let js_bigint = JsValue::BigInt(BigInt::from(num).into());
```
It might make sense to add From<i128> conversions to JsBigInt, to make this easier, but not for JsValue. In any case, I don't think Boa should force anything on its users for converting i128 or other integer types to JsValue. Why is this even needed?
> i128, on the other hand, cannot be converted to JavaScript the same way Rust uses it. Either it gets converted to f64, and therefore loses precision, or it gets converted to a bigint, which requires a heap allocation.
>
> Why should Boa force any of the two options on Boa users?
Because only one is correct. An i128 should be converted to a bigint by default because that's the only native type that won't lose precision and thus is the correct choice.
I wouldn't want to make it impossible to convert to another value, but by default shouldn't we convert numbers so that they do not lose precision? That seems to me to be the ultimate goal in a number conversion generally (or at least it's my ultimate goal for my project).
> It might make sense to add `From<i128>` conversions to JsBigInt, to make this easier, but not for JsValue. In any case, I don't think Boa should force anything on its users for converting i128 or other integer types to JsValue. Why is this even needed?
I'm working on a generalized solution (basically a TypeScript to Rust compiler) that takes a developer's TypeScript and generates Rust code from it that can then be compiled to Wasm and deployed to a Wasm environment. I'm just looking for the simplest/most elegant solution to converting between native Rust types and Boa's JsValue types, and the simplest API I can imagine is: rust_value.into_js_value() and js_value.from_js_value(), with the conversions inferred appropriately. I want the following to work appropriately for any type:
```rust
let integer: i128 = js_value.from_js_value();
let js_value = integer.into_js_value(); // This should be a BigInt, since users will expect any i128 value to be handled perfectly, with no precision loss
```
I'm dealing with a JS and Rust environment that will send values back and forth between the JS and Rust sides in generalized ways. I need the conversions to be perfect, and I would like to write as little custom code as possible to make them work. Thus, I would like a standard, correct way to convert all Rust types into JsValue and all JsValue into Rust types, without having to write custom code at the point of conversion. This should all be inferred.
I am getting close to getting this working in my fork of boa, but I've made some decisions about conversions that might be controversial, thus I'm trying to lay that out here.
Does my reasoning make sense?
To sum up, using i128 as an example, I agree that we should allow the developer to convert to a JS Number or BigInt at will. But I believe that the happy path should create as accurate a representation of the i128 in JS as possible (no loss of precision), and thus should convert to BigInt.
```rust
let number: i128 = 0_i128;
let js_value: JsValue = number.into();
```
js_value above should be a boa::bigint::JsBigInt so that when I pass it into a Context it becomes a JS bigint having lost no precision.
I am more in favour of having a serde-like API for that, maybe with helper functions and attributes/derives, to make this easy and customizable, but I don't think precision should be the default path, and I know at least @HalidOdat thinks similarly.
Something like i64 has no overhead on a 64-bit platform; it's perfectly performant. A BigInt, on the other hand, will mean tons of allocations, synchronization primitives, reference counting and a lot more, so I don't think that should be the default, and it should definitely not be a transparent conversion.