Ivan Ruiz Manuel
In that case, I would avoid doing this unless we guarantee data sparsity through [sparse](https://sparse.pydata.org/en/stable/) (or something similar), because each of those `nan` values will take 64 bits (the same...
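To make the memory point concrete, here is a minimal sketch in plain NumPy (the `sparse.COO` usage at the end is indicative only and is not executed):

```python
import numpy as np

# A dense float64 array charges 64 bits (8 bytes) per element,
# and each `nan` placeholder costs exactly as much as a real value.
dense = np.full((1000, 1000), np.nan)
dense[::100, ::100] = 1.0  # only 100 cells hold actual data

print(dense.nbytes)  # 8_000_000 bytes, regardless of how few real values exist

# With pydata's `sparse` (hypothetical usage; `fill_value=np.nan` tells it
# not to store the missing cells):
#   import sparse
#   coo = sparse.COO.from_numpy(dense, fill_value=np.nan)
#   # coo stores ~100 values + their coordinates instead of 10**6 cells
```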
Hmm... You are right. We should go with the solution that is most understandable to users... I'd say that is loading _and_ resampling data in the same step, to avoid...
I like the second option the most. It keeps everything neatly in one place, which is generally better for maintainability... The third one could confuse users in my opinion. Regarding...
Hmm... I think having `parameters` and `dims` at `init` is a base requirement of this change anyhow... Regarding the mode, perhaps it is not as big of a problem if...
@brynpickering I think this makes sense, for the most part. However, I see some possible dangers... When jumping between init and build, the math and model size are left 'ambiguous',...
@brynpickering looking at the code, it seems they achieve very similar things, but I do not think they match `template`'s purpose, since `template` is more of a 'user-side' thing, and...
After digging around, I confirmed that the issue is an unpinned `numpy` version. Pinning `numpy = 1.23` will solve the issue.
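For reference, the pin would look something like this (shown in pip's requirements syntax; the exact file depends on how this repo manages its environment, so treat this as a sketch):

```
# requirements.txt (pip syntax; a conda environment.yaml would use `numpy=1.23`)
numpy==1.23
```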
@fneum did the test, `int16` fixed it!
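For anyone landing here later, a minimal illustration of what the dtype change buys (this assumes the fix was downcasting the relevant array to `int16`; the array and values below are hypothetical):

```python
import numpy as np

# int16 holds values in [-32768, 32767] at a quarter of int64's footprint.
a64 = np.ones(1_000_000, dtype=np.int64)
a16 = a64.astype(np.int16)

print(a64.nbytes, a16.nbytes)  # 8_000_000 vs 2_000_000 bytes

# Safe as long as every value fits the int16 range:
assert np.array_equal(a64, a16)
```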
Here is the cutout I am using for this. [cutout_csp.zip](https://github.com/user-attachments/files/16445110/cutout_csp.zip)