Implement lazy area weights
🚀 Pull Request
Description
This PR makes iris.analysis.cartography.area_weights lazy by providing the keyword arguments compute and chunks:
def area_weights(cube, normalize=False, compute=True, chunks=None):
These defaults ensure full backwards-compatibility. chunks=None results in using the cube data's chunks for the weights array, which is a reasonable default since these weights will most likely be used together with the cube's data anyway.
This requires https://github.com/SciTools/iris/pull/5620, so this PR is still in draft mode. Apart from that, it is ready.
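For illustration, a minimal sketch of how the new keywords could be used (the file path and chunk sizes below are placeholders, and the exact lazy behaviour depends on the final merged API):

```python
import iris
from iris.analysis.cartography import area_weights

cube = iris.load_cube("/path/to/sample.nc")  # placeholder path

# Default: eager computation, unchanged behaviour (NumPy array).
weights = area_weights(cube)

# Lazy weights as a dask array; with chunks=None the chunking follows
# the default behaviour described above.
lazy_weights = area_weights(cube, compute=False)

# Lazy weights with an explicit chunk specification.
lazy_weights = area_weights(cube, compute=False, chunks=(10, 20))
```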
Closes https://github.com/SciTools/iris/issues/5611
Codecov Report
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 89.77%. Comparing base (7c313ff) to head (e6985e3). Report is 4 commits behind head on main.
Additional details and impacted files
@@ Coverage Diff @@
## main #5658 +/- ##
=======================================
Coverage 89.76% 89.77%
=======================================
Files 93 93
Lines 22982 22995 +13
Branches 5006 5011 +5
=======================================
+ Hits 20630 20643 +13
Misses 1622 1622
Partials 730 730
Thanks for the hard work @schlunma. #5620 will be reviewed by @pp-mo in the new year, so #5658 can get moving again once that is sorted.
Just had a second look at this. I think it's actually better to use cube.lazy_data().chunks instead of cube.lazy_data().chunksize to make sure that the chunk structure is exactly identical in the weights and the original data. It probably won't matter, but it feels better to use .chunks here...
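For reference, a small dask snippet (with made-up shapes) illustrating why .chunks preserves more information than .chunksize:

```python
import dask.array as da

# .chunks records the exact block sizes along each dimension, while
# .chunksize only records the largest block per dimension, so irregular
# chunking (e.g. 6 + 4 along a 10-element axis) is only captured by .chunks.
x = da.ones((10, 10), chunks=((6, 4), (5, 5)))
print(x.chunks)     # ((6, 4), (5, 5))
print(x.chunksize)  # (6, 5)
```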
After a third look at it, I think it's better to leave full control over the chunks to the user (i.e., if chunks=None, also pass chunks=None to the broadcast_to_shape call instead of the cube's chunks). Otherwise, there is no way for a user to request "no special chunking", which might be confusing. See the sketch below for what that means in practice.
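A sketch of what passing chunks=None through would mean, using plain dask (assuming broadcast_to_shape ends up delegating to dask.array.broadcast_to; shapes and chunk sizes are made up):

```python
import dask.array as da

# 1-D weights along latitude, chunked as two blocks of 90.
lat_weights = da.ones((180,), chunks=(90,))

# chunks=None: dask decides the chunking of the new (broadcast) dimension
# itself, here a single chunk of size 12.
broadcast = da.broadcast_to(lat_weights, (12, 180))
print(broadcast.chunks)  # ((12,), (90, 90))

# An explicit chunks argument overrides that default for the broadcast dimension.
broadcast = da.broadcast_to(lat_weights, (12, 180), chunks=((6, 6), (90, 90)))
print(broadcast.chunks)  # ((6, 6), (90, 90))
```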
Sorry for all the back and forth; with this, I am happy with the PR!