Exponential growth in precision causing memory usage to spike as of 3.1.2
I upgraded to BigDecimal 3.1.2 to pull in the fix to #220, which was causing crashes in our Rails app. However, I believe the fix applied in 127a1b5 created the potential for exponential growth in the number of digits a BigDecimal can have. The result is that certain calculations can eat up all available memory (several gigs) and crash the process.
Take the following as an example, which outputs the number of digits from the calculation:
```ruby
Array.new(10) { BigDecimal("1.03") }.inject(1.0) { |result, num| result / num }.to_digits.length
```
In 3.1.1, this calculation produces 119 digits.
In 3.1.2, this calculation produces 23,033 digits.
Moreover, if `Array.new(10)` is grown to `Array.new(20)`, 3.1.1 produces 209 digits, whereas 3.1.2 produces 23,592,953.
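For reference, a small script along these lines (the loop and chain sizes are mine, not part of the original report) shows how the digit count grows with the length of the chain; it appears to roughly double with each division on 3.1.2:

```ruby
require "bigdecimal"
require "bigdecimal/util"

# Digit count of 1.0 divided by 1.03 repeatedly, for chains of various lengths.
# On 3.1.1 the count grows modestly; on 3.1.2 it explodes, so be careful
# running the larger sizes there.
[5, 10, 15, 20].each do |n|
  result = Array.new(n) { BigDecimal("1.03") }.inject(1.0) { |acc, num| acc / num }
  puts "n=#{n}: #{result.to_digits.length} digits"
end
```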
The runaway growth in digits and the resulting memory spike could be solved by rounding the result in `inject`'s block like so:

```ruby
Array.new(10) { BigDecimal("1.03") }.inject(1.0) { |result, num| (result / num).round(3) }.to_digits.length
```
But others may run into this same surprise. We've also run into `PG::NumericValueOutOfRange` exceptions from Postgres since upgrading to 3.1.2, and my hunch is that the same root issue is at play.
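If that hunch is right, the same rounding trick should help there too. A minimal, untested sketch, assuming a hypothetical `Invoice` model whose `amount` column is a Postgres numeric, rounds before the value ever reaches the database:

```ruby
# Hypothetical model and column names; round to a known scale before saving so
# that runaway precision from the calculation never reaches the numeric column.
class Invoice < ApplicationRecord
  before_validation :round_amount

  private

  def round_amount
    self.amount = amount.round(4) if amount
  end
end
```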
Let me know if I can provide any more details.
@smidwap Thank you for reporting this issue. I'll investigate the details.
We are running into this issue; would it be possible to merge and release this fix?
@Confusion I'm working on this issue in #240. But the work is temporarily paused now because I'm too busy. Could you please use 3.1.1 until it's fixed? I'm sorry for the inconvenience.
Downgrading to 3.1.1 will re-introduce the segfaults from https://github.com/ruby/bigdecimal/issues/220 (which, in my opinion, is worse than the memory spikes). We'll try downgrading to 3.1.0 until this is fixed.
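For anyone else pinning in the meantime, something like this in the Gemfile should do it (pick the version for whichever trade-off you prefer):

```ruby
# Gemfile: pin bigdecimal until the precision regression is fixed.
# 3.1.1 avoids the memory blow-up but brings back the #220 segfaults,
# so we're going to try 3.1.0.
gem "bigdecimal", "3.1.0"
```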
Hi @mrkn - have you made any further progress in #240? This issue and https://github.com/ruby/bigdecimal/issues/271 seem like quite serious bugs, especially given that Ruby 3.2.2 and 3.3.0 both include an affected version as a default gem.
I don't usually +1, but is there any chance a fix is coming for this? We've had to hold off on all newer patch releases until this is fixed and would love to keep up to date.
I have some ideas for workarounds that I haven't been able to test yet:
- instrument your calculations to report the precisions of the BigDecimal values that are used when running on bigdecimal 3.1.1, then set `BigDecimal.limit` to the highest precision you find. The calculation should then give the same result in 3.1.2+ without blowing the memory? (See the sketch after this list.)
- determine a suitable value for `BigDecimal.limit` by trial and error: if you set it to some value and your results don't change, you're good?
- if you have a very good understanding of exactly what your calculations do, you could calculate the maximum precision you should need to prevent rounding errors from influencing your final result.
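A rough, untested sketch of the first two ideas, assuming bigdecimal 3.x's `BigDecimal#n_significant_digits` is available and picking 128 digits as an arbitrary cap:

```ruby
require "bigdecimal"
require "bigdecimal/util"

# Idea 1: while still on bigdecimal 3.1.1, inspect the precision the calculation
# actually produces, to pick a sensible cap.
values = Array.new(10) { BigDecimal("1.03") }
result = values.inject(BigDecimal("1")) { |acc, num| acc / num }
puts result.n_significant_digits   # significant digits of the final result

# Idea 2: cap precision globally so 3.1.2+ rounds intermediate results instead
# of letting the digit count grow without bound.
BigDecimal.limit(128)              # returns the previous limit (0 means unlimited)
capped = values.inject(BigDecimal("1")) { |acc, num| acc / num }
puts capped.to_digits.length       # bounded by the limit instead of exploding
```

Note that `BigDecimal.limit` is a process-wide setting, so in a Rails app it affects every calculation, not just the one that misbehaves.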