
unumpy vectorized functions extremely slow

Open beojan opened this issue 8 years ago • 6 comments

The unumpy vectorized functions are extremely slow. This seems to be because the uncertainty propagation is repeated for every element rather than simply being done once for the whole array.
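For illustration, a minimal example of the kind of slowdown in question (not the original report's code; the array size and function are arbitrary), comparing unumpy.sin on a uarray with plain numpy.sin on the nominal values alone:

```python
import timeit
import numpy as np
from uncertainties import unumpy as unp

values = np.random.rand(10_000)
arr = unp.uarray(values, 0.01 * np.ones_like(values))  # values with std. dev. 0.01

# Per-element uncertainty propagation through a Python-level loop
print(timeit.timeit(lambda: unp.sin(arr), number=10))
# Single vectorized C call on the nominal values, for comparison
print(timeit.timeit(lambda: np.sin(values), number=10))
```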

beojan avatar Nov 30 '17 15:11 beojan

@beojan There is some related discussion in https://github.com/lebigot/uncertainties/issues/57, doesn't look like there is an easy solution for it unfortunately.

rth avatar Nov 30 '17 15:11 rth

@beojan, most uncertainty calculations done by uncertainties.unumpy are indeed repeated for every element in turn. In principle it would be possible to do better, though I'm not expecting much speedup in general. One useful step would be to profile some calculation and see whether the NumPy vectorization is responsible for most of the calculation time. So @beojan, if you have a minimal working example of the calculation you had in mind, that would be useful!
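For instance, something along these lines (a rough sketch with an arbitrary calculation and array size) would already show where the time is spent:

```python
import cProfile
import numpy as np
from uncertainties import unumpy as unp

arr = unp.uarray(np.random.rand(10_000), 0.01 * np.ones(10_000))

# Sort by cumulative time to see whether the vectorization machinery dominates
cProfile.run("unp.cos(arr)", sort="cumtime")
```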

lebigot avatar Oct 07 '18 12:10 lebigot

If you look at the notes on numpy.vectorize, you'll see that it essentially generates a for loop (which crosses the Python/C boundary on every iteration to call the wrapped Python function). If that's what's being used, it's going to be very slow indeed.
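As a quick demonstration (not from the issue itself) that numpy.vectorize makes one Python-level call per element:

```python
import numpy as np

calls = 0

def f(x):
    global calls
    calls += 1
    return 2 * x

# otypes is given so that vectorize does not make an extra call to infer the dtype
vf = np.vectorize(f, otypes=[float])
vf(np.arange(1000.0))
print(calls)  # 1000: one Python-level call per element
```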

The only solution I can think of is a uarray type that internally stores the nominal values and uncertainties as separate arrays and manipulates them with existing NumPy or SciPy functions.
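Something like the following rough sketch (a hypothetical FastUArray class, not an existing API; first-order error propagation, and correlations between variables are ignored, unlike in uncertainties):

```python
import numpy as np

class FastUArray:
    """Nominal values and standard deviations stored as separate arrays."""

    def __init__(self, nominal, std_dev):
        self.nominal = np.asarray(nominal, dtype=float)
        self.std_dev = np.asarray(std_dev, dtype=float)

    def sin(self):
        # d/dx sin(x) = cos(x), so sigma_out = |cos(x)| * sigma_x, as whole-array ops
        return FastUArray(np.sin(self.nominal),
                          np.abs(np.cos(self.nominal)) * self.std_dev)

    def __add__(self, other):
        # Independent-variable assumption: uncertainties add in quadrature
        return FastUArray(self.nominal + other.nominal,
                          np.hypot(self.std_dev, other.std_dev))

    def __repr__(self):
        return f"FastUArray({self.nominal!r}, {self.std_dev!r})"

x = FastUArray([0.0, np.pi / 2], [0.01, 0.02])
print(x.sin() + x)
```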

beojan avatar Oct 08 '18 13:10 beojan

Exactly. Pull requests are welcome as usual!

lebigot avatar Oct 08 '18 13:10 lebigot

https://numpy.org/neps/nep-0018-array-function-protocol.html might be relevant.
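For context, NEP 18's `__array_function__` protocol lets a container that is not an ndarray intercept NumPy function calls. A minimal sketch (hypothetical UArray and mean implementation, not part of uncertainties) of how such dispatch could work:

```python
import numpy as np

HANDLED = {}

def implements(np_func):
    """Register an uncertainty-aware implementation of a NumPy function."""
    def decorator(func):
        HANDLED[np_func] = func
        return func
    return decorator

class UArray:
    def __init__(self, nominal, std_dev):
        self.nominal = np.asarray(nominal, dtype=float)
        self.std_dev = np.asarray(std_dev, dtype=float)

    def __array_function__(self, func, types, args, kwargs):
        # NumPy calls this instead of its own implementation (NEP 18)
        if func not in HANDLED:
            return NotImplemented
        return HANDLED[func](*args, **kwargs)

@implements(np.mean)
def mean(a):
    # Uncorrelated elements assumed: sigma_mean = sqrt(sum(sigma_i**2)) / N
    return (a.nominal.mean(),
            np.sqrt((a.std_dev ** 2).sum()) / a.nominal.size)

x = UArray([1.0, 2.0, 3.0], [0.1, 0.1, 0.2])
print(np.mean(x))  # dispatched to the uncertainty-aware mean above
```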

lebigot avatar Aug 04 '19 17:08 lebigot

Could https://github.com/sradc/SmallPebble be used as an efficient backend for uncertainties?

lebigot avatar Jan 26 '23 15:01 lebigot