Tom Nicholas
This is great, thank you @ianhi ! Can you just explain how (if at all) this interacts with our CI? I see you can locally configure to automatically rebuild the...
Thank you @spencerkclark ! Clearly I had misunderstood what `num2date` does. I've got my code working now thanks to you. > It probably speaks to having examples in the documentation...
> particularly since it's a paid-for option? There was some discussion at SciPy about [Quansight](https://quansight.com/) being able to give out free access to NVIDIA GPUs for scientific Python projects...
I was actually just using this library yesterday - there are some very subtle differences between numpy's behaviour and the array API standard, so testing against this explicitly would...
To demonstrate my point about (2), it looks like storing a million references in-memory can be done using numpy taking up only 24MB.

```python
In [1]: import numpy as np
...
```
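The full calculation is truncated above, but here is a minimal sketch of how a million references can come to roughly 24MB: assume each reference is stored as three 8-byte values per chunk (a path pointer in an object array, a byte offset, and a byte length). The array names and layout here are illustrative, not necessarily VirtualiZarr's actual in-memory representation.

```python
import numpy as np

n = 1_000_000  # one million chunk references

# Hypothetical columnar layout: one entry per chunk.
paths = np.full(n, "s3://bucket/file.nc", dtype=object)  # 8-byte pointers (64-bit)
offsets = np.arange(n, dtype=np.uint64)                  # 8 bytes each
lengths = np.full(n, 1024, dtype=np.uint64)              # 8 bytes each

total_bytes = paths.nbytes + offsets.nbytes + lengths.nbytes
print(total_bytes / 1e6)  # 24.0 MB: 3 arrays x 8 bytes x 1e6 references
```

Note the object array only counts the pointers, not the string objects themselves; if many chunks share the same path string (as interned Python strings), that extra cost is amortised.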
> Next thing we should look at is how much space these references would take up on-disk as JSON/parquet/special zarr arrays. See Ryan's comment (https://github.com/TomNicholas/VirtualiZarr/issues/33#issuecomment-1998639128) suggesting specific compressor codecs to...
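For a sense of why the on-disk format matters, here is a rough sketch of the per-reference cost of plain (uncompressed) JSON, using kerchunk-style `[path, offset, length]` entries; the key names, path, and chunk sizes are made up for illustration:

```python
import json

# Hypothetical kerchunk-style references: key -> [path, byte offset, byte length]
refs = {f"var/{i}": ["s3://bucket/file.nc", i * 1024, 1024] for i in range(1000)}

blob = json.dumps(refs).encode()
bytes_per_ref = len(blob) / len(refs)
print(bytes_per_ref)  # well above the ~24 bytes/reference of the numpy layout
```

The repeated path strings and JSON punctuation dominate, which is exactly what columnar storage (parquet) or compressor codecs on special zarr arrays would squeeze out.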
> even a really big store with 100 variables, each with a million chunks I tried this out, see https://github.com/earth-mover/icechunk/issues/401
[This write-up](https://github.com/earth-mover/icechunk-nasa/blob/main/design-docs/icechunk-stores.md) of trying VirtualiZarr + Icechunk on a large dataset (IMERG) is also illuminating (see https://github.com/earth-mover/icechunk-nasa/pull/1 for context). tl;dr: icechunk needs to reduce the size / split up its...
This would also mean we would have no need for a more complex entrypoint system like https://github.com/zarr-developers/VirtualiZarr/issues/245
I think explicitly creating a store seems fine.