Results: 1078 comments by Tom Augspurger

Note that after computing, the dtypes are correct:

```python
In [6]: ddf["c"].str.split(",", n=1, expand=True).compute().dtypes
Out[6]:
0    string[pyarrow]
1    string[pyarrow]
dtype: object
```

so it's just the `_meta` after `.str.split`. https://github.com/dask/dask/blob/0fa5e18d511c49f1a9cd5f98c675a9f6cd2fc02f/dask/dataframe/dask_expr/_str_accessor.py#L186-L189...
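For reference, a minimal sketch of the mismatch I mean (the column name and data are made up, and it assumes pandas with pyarrow installed plus a recent dask):

```python
import pandas as pd
import dask.dataframe as dd

# Toy frame with a pyarrow-backed string column (hypothetical data).
pdf = pd.DataFrame({"c": pd.array(["a,b", "x,y"], dtype="string[pyarrow]")})
ddf = dd.from_pandas(pdf, npartitions=1)

result = ddf["c"].str.split(",", n=1, expand=True)

# The lazy dtypes come from `_meta`, which is where the wrong dtype shows up...
print(result.dtypes)

# ...while the computed result has the expected string[pyarrow] dtypes.
print(result.compute().dtypes)
```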

Thanks Matt.

> Since this is a breaking change, we should probably put a pretty big note in the changelog then?

@ilan-gold can you clarify what you see as the...

Thanks, that matches my understanding / assumption: that `name` / `token` was being passed around incorrectly in those spots you had to change. I'll see if I can confirm that...

https://github.com/dask/dask/pull/3597 and https://github.com/dask/dask/pull/8464 seem like the relevant history here. The motivation was apparently fixing an inconsistency in `name` and `token` in `map_blocks` with the rest of `dask` /...

> Are they compatible? I suppose if they are this warning shouldn't be printed and if they are not the warning should print full types instead of just string.

`string[pyarrow]`...

I might be confusing myself, but I think this implementation might not be what we want... I think what users want (like us in https://github.com/zarr-developers/zarr-python/pull/2400) is the size of an...

Thanks. Looking at how Icechunk would implement `getsize` is what prompted my question, so I can see how a `getsize_dir` makes sense there. Would you expect the size of metadata...

I've pushed an update that adds a `getsize_prefix`, but I'm having second thoughts about whether this is worth adding to the API. It's not clear to me that a Store...
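For context, the naive version I have in mind is roughly this: just a sketch written against the async `Store` interface from that PR (it assumes `list_prefix` and `getsize` are available there), not the actual implementation:

```python
from zarr.abc.store import Store


async def getsize_prefix(store: Store, prefix: str) -> int:
    # Naive default: sum the size of every key under the prefix.
    # A store backed by an object store or a database could likely answer
    # this much more cheaply than issuing one getsize call per key, which
    # is part of why I'm unsure it belongs on the base Store API.
    return sum([await store.getsize(key) async for key in store.list_prefix(prefix)])
```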

Should be all set now.

@MJLagerwerf are you able to reproduce this with just zarr and numpy (no dask)? If so, you could consider reporting this over at https://github.com/zarr-developers/zarr-python, assuming it's something that was supported...