KahanSummation.jl
additional functionality
Part of my reason for #7 was to make it easier to extend. Some other functions we should provide:
- [ ] `dot_kbn`: compute the dot products using double-double, then sum them up using Kahan summation
- [ ] `mean_kbn`: divide the extended-precision sum using a 2-div style, so that `mean_kbn(fill(0.1, 10)) == 0.1`
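For context, the Kahan–Babuška–Neumaier (KBN) compensation that these `_kbn` functions build on can be sketched as follows. Python is used here purely for illustration, and this `mean_kbn` does a plain division of the compensated sum rather than the 2-div step proposed above:

```python
def sum_kbn(xs):
    """Kahan-Babuska-Neumaier compensated summation.

    Keeps a running compensation term that recovers the low-order
    bits lost when a small value is added to a large running sum
    (or vice versa).
    """
    s = 0.0
    c = 0.0  # accumulated rounding error of the running sum
    for x in xs:
        t = s + x
        if abs(s) >= abs(x):
            c += (s - t) + x  # low-order bits of x were lost
        else:
            c += (x - t) + s  # low-order bits of s were lost
        s = t
    return s + c

def mean_kbn(xs):
    # Illustrative only: divides the compensated sum directly,
    # not the 2-div-style division of the hi/lo parts.
    return sum_kbn(xs) / len(xs)
```

Even this simplified version recovers the motivating example, since the compensated sum of ten `0.1`s rounds to exactly `1.0`.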
I'm interested in #7 because it allows simple parallelization with something like

```julia
psum_kbn(f, X) = singleprec(Folds.mapreduce(f, InitialValues.asmonoid(plus_kbn), X))
psum_kbn(X) = psum_kbn(identity, X)
```
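The key requirement in the snippet above is that `plus_kbn` behaves as an associative monoid, so chunks can be reduced independently and their `(hi, lo)` partial sums merged. A rough Python sketch of that idea, using a standard two-sum error-free transformation as the combine step (the chunking, thread pool, and pair representation are my own illustration, not the package's actual `plus_kbn` or `TwicePrecisionN`):

```python
from concurrent.futures import ThreadPoolExecutor

def two_sum(a, b):
    # Knuth's error-free transformation: a + b == s + e exactly
    s = a + b
    bv = s - a
    av = s - bv
    e = (a - av) + (b - bv)
    return s, e

def plus_kbn(x, y):
    # Combine two (hi, lo) partial sums; associative up to
    # rounding in the lo accumulation, so a parallel reduce works.
    s, e = two_sum(x[0], y[0])
    return s, x[1] + y[1] + e

def psum_kbn(xs, nchunks=4):
    # Reduce each chunk to a (hi, lo) pair, then fold the pairs.
    chunks = [xs[i::nchunks] for i in range(nchunks)]

    def partial(chunk):
        acc = (0.0, 0.0)
        for v in chunk:
            acc = plus_kbn(acc, (v, 0.0))
        return acc

    with ThreadPoolExecutor() as pool:
        parts = list(pool.map(partial, chunks))

    hi, lo = 0.0, 0.0
    for p in parts:
        hi, lo = plus_kbn((hi, lo), p)
    return hi + lo  # analogue of singleprec: collapse to one float
```

Note the final `hi + lo` stands in for `singleprec`; the actual PR stores a negated low part (`x.hi - x.nlo`), which this sketch ignores.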
Is there anything wrong with the PR, besides the fact that `singleprec(x::TwicePrecisionN{T}) where {T} = x.hi - x.nlo` is better suited for AD?