proposal-decimal
BigInt + Scale
The simplest implementation of BigDecimal would be a record with two fields: a BigInt and a scale (a power of 10). The industry already uses something along these lines for currency amounts, which are often stored as a BigInt number of "micros".
If we're only concerned about representation, and not math, an alternative to the BigDecimal proposal altogether would be to establish a precedent for how to represent the BigInt/scale pair. For example, we could add an overload to the Intl.NumberFormat.prototype.format method that takes an object literal with a BigInt and a scale. Something like,
```js
new Intl.NumberFormat().format({
  mantissa: 123456n, // a BigInt
  scale: -2          // a power of 10 (probably a Number)
});
```
We could establish that convention as how BigDecimals are represented in the language, without actually introducing a BigDecimal type.
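To make the convention concrete, here is a minimal sketch of how a { mantissa, scale } pair maps to an exact decimal string. The helper name and the digit-placement logic are illustrative only, not part of the proposal:

```javascript
// Hypothetical helper: the represented value is mantissa × 10^scale.
function decimalToString({ mantissa, scale }) {
  const sign = mantissa < 0n ? "-" : "";
  let digits = (mantissa < 0n ? -mantissa : mantissa).toString();
  if (scale >= 0) {
    // Non-negative scale: append zeros.
    return sign + digits + "0".repeat(scale);
  }
  // Negative scale: place a decimal point |scale| digits from the right.
  const frac = -scale;
  digits = digits.padStart(frac + 1, "0");
  const point = digits.length - frac;
  return sign + digits.slice(0, point) + "." + digits.slice(point);
}

decimalToString({ mantissa: 123456n, scale: -2 }); // "1234.56"
```

Because the mantissa is a BigInt, no precision is lost at any magnitude; the string conversion is exact by construction.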
Division in that case becomes very hard, since you need to cast the BigInt to a Number, and sometimes that's not even possible.
That's what I meant when I said, "if we're only concerned about representation, and not math." What I proposed is an alternative if the key use case is respecting decimal numbers accurately. If the key use case is math, then we may want to design a different representation.
How would you represent one third?
You wouldn't. If such a feature (rational numbers) is desired, my suggestion is not the right approach.
Note that Java BigDecimal is essentially the BigInt+Scale pair. They don't support rational numbers.
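To illustrate why division is the hard part, here is a rough sketch over { mantissa, scale } pairs. The `divide` helper, its truncation behavior, and the `digits` parameter are invented for illustration; they mirror how Java's BigDecimal requires an explicit scale choice for non-terminating quotients:

```javascript
// Rough sketch of decimal division over { mantissa, scale } pairs.
// `digits` is how many fractional digits to keep; the quotient is
// truncated, since 1/3 has no finite decimal expansion.
function divide(a, b, digits) {
  // value(x) = x.mantissa * 10^x.scale, so
  // a/b * 10^digits = a.mantissa * 10^(digits + a.scale - b.scale) / b.mantissa.
  // (Assumes that exponent is non-negative; a full implementation would
  // rescale in both directions and support rounding modes.)
  const shifted = a.mantissa * 10n ** BigInt(digits + a.scale - b.scale);
  return { mantissa: shifted / b.mantissa, scale: -digits };
}

// 1 / 3 to 5 fractional digits:
divide({ mantissa: 1n, scale: 0 }, { mantissa: 3n, scale: 0 }, 5);
// → { mantissa: 33333n, scale: -5 }, i.e. 0.33333 (inexact, by necessity)
```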
What happens if I do `1d / 3d` then?
@ljharb as far as I understand, @sffc doesn't propose making BigDecimal a mantissa + scale pair. Instead, an overloaded form of Intl.NumberFormat.prototype.format would be used just for representation.
I like the idea of picking this up if we decide not to go forward with BigDecimal, but I think much of the benefit of BigDecimal will be having a single value which logically represents the pair, together with a library of functions to operate on them mathematically in addition to string conversion.
@chicoxyzzy I was responding to “The most simple implementation of BigDecimal would be a record with two fields: a BigInt and a scale”, which would need a way to represent values with repeating decimal places, and I’m not sure how that record would be able to do it.
@ljharb It sounds like you were responding based on a misunderstanding of the initial comment. To make this thread easier to read for others, mind if I hide all of those responses?
How was it a misunderstanding? The OP has two parts; I’m responding to the first part, which is not intl specific.
@ljharb I read that as setting the stage for how you'd make an API based on those two parameters; I don't think @sffc was making two distinct suggestions in one post.
My OP was one suggestion, to establish the convention of { mantissa, scale } to represent decimal numbers across APIs without adding a separate BigDecimal type. Intl was an example of how that could work.
My comments are pointing out (or asking what I'm missing about) why that convention is not viable: it can't represent everything that Decimal would need, like 0.3̅ (or even anything close to it). It seems to me that mantissa + scale is simply insufficient to represent all decimal numbers. I'm not talking about Rational at all.
IMO, "is not viable" is too strong a statement at this stage, because we have not fully agreed on the goals. I would argue that representing repeated digits may not be necessary for core use cases, like representing an exact amount of money. I also cited java.math.BigDecimal as prior art that does not support repeated digits.
Thanks, that response helps clarify my understanding :-)
Another API that comes to mind is to redefine what it means when a string is passed into Intl.NumberFormat.prototype.format. @sffc previously proposed that we interpret strings according to either BigInt or Number, depending on which would be more accurate.
What if we just always interpreted strings as decimals and rendered them precisely, regardless of whether/when we add BigDecimal? There's some chance of compatibility impact; I'd guess the risk is fairly low, but I could be wrong. Since we already decided to overload format in a way that separates Number and BigInt, maybe it's not so bad to add String as a separate overload.
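A quick sketch of the motivation for the string overload (the variable names are illustrative, and the Intl behavior described in the comment is the proposed behavior, not the current one):

```javascript
// A decimal digit string can carry the user's exact intended value,
// while the nearest Number cannot.
const exact = "0.1"; // what the user typed

// Converting to Number rounds to the nearest binary double,
// which is not exactly one tenth:
console.log(Number(exact).toFixed(20));

// Under the idea above, format(exact) would render the string's digits
// precisely, instead of going through the lossy Number conversion.
```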