Christopher Dupuis
Just learned that xarray rolling also adds an "_input" suffix (or something similar), and it's used to distinguish the original dimensions (which may still exist) from the new stencil dims. I'm...
Just learned about `itertools.compress`, that might be what is needed here.
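For reference, `itertools.compress` filters an iterable by a parallel sequence of booleans, which maps naturally onto keeping only the batches that pass some predicate. A minimal stdlib-only illustration (the batch indices and mask here are made up):

```python
from itertools import compress

# Hypothetical batch indices and a parallel predicate mask
# (e.g. True where the batch passed some quality check)
batches = [0, 1, 2, 3, 4]
mask = [True, False, True, False, True]

# compress() yields only the items whose selector is truthy
kept = list(compress(batches, mask))
print(kept)  # [0, 2, 4]
```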
While experimenting with some xbatcher ideas, I came up with a function that basically implements the kind of predicate that would be desirable here. Here is a minimal example of what this...
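The function itself isn't shown in full above, but the general shape of the idea (a generator that wraps any batch iterator and yields only batches passing a predicate) can be sketched without the library; all names here are illustrative, not the actual xbatcher API:

```python
def filtered_batches(batch_iter, predicate):
    """Yield only the batches for which predicate(batch) is truthy.

    batch_iter can be any iterable of batches (e.g. an xbatcher
    BatchGenerator); predicate is any callable on a batch.
    """
    for batch in batch_iter:
        if predicate(batch):
            yield batch

# Toy usage with plain lists standing in for DataArray batches
batches = [[1, 2], [0, 0], [3, 4]]
kept = list(filtered_batches(batches, lambda b: any(b)))
print(kept)  # [[1, 2], [3, 4]]
```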
Better code sample, which wraps xbatcher and also offers fixed batch sizes: ```python import xarray as xr import xbatcher as xb import numpy as np import random da1 = xr.DataArray(np.random.randint(0,9,(400,400)),...
Alternatively, is it possible in this scenario to "rechunk" along the sample dimension (so you'd get like 32 x lon x lat)?
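For intuition, the regrouping being asked about (chunks of 32 along the sample dimension, each chunk shaped 32 x lat x lon) can be sketched with plain numpy reshapes; this is only an analogy for the rechunk, and the sizes are made up:

```python
import numpy as np

# Made-up sizes: 128 samples, each a 4 x 4 (lat, lon) patch
samples = np.arange(128 * 4 * 4).reshape(128, 4, 4)

# Regroup along the sample dimension into chunks of 32 samples,
# i.e. each chunk has shape (32, lat, lon)
chunks = samples.reshape(-1, 32, 4, 4)
print(chunks.shape)  # (4, 32, 4, 4)
```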
So I figured out I can use `concat_input_dims=True` to get to a better state, but one giant batch is also not ideal if we're trying to parallelize stuff in the...
Yeah this is connected to some other general weirdness about the number of input vs. concat dims. I'll try adding dummy dimensions again and see what I get (but it...
Also wanted to note that this issue turns xbatcher into a massive memory hog, and it's probably related to #37 as well.
Why is there a deep copy [here](https://github.com/xarray-contrib/xbatcher/blob/673e3caefb4a23102bfc8e2292f468f735f0189c/xbatcher/generators.py#L314)?
Along the lines of https://github.com/xarray-contrib/xbatcher/issues/162#issuecomment-1431902345, we can create fixed-size batches for the case of all dims being input dims by using a `BatchGenerator` wrapper with the following structure: ```python import...
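The snippet above is cut off, but the structure it describes (a wrapper that regroups a stream of samples into fixed-size batches) can be sketched without the library; the function name and drop-the-remainder behavior here are illustrative assumptions, not the actual wrapper:

```python
def fixed_size_batches(sample_iter, batch_size):
    """Regroup a stream of samples into lists of exactly batch_size,
    dropping any incomplete final batch."""
    buf = []
    for sample in sample_iter:
        buf.append(sample)
        if len(buf) == batch_size:
            yield buf
            buf = []

# Toy usage: 7 samples into batches of 3 drops the trailing sample
grouped = list(fixed_size_batches(range(7), 3))
print(grouped)  # [[0, 1, 2], [3, 4, 5]]
```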