
reimplemented incremental io

Open LucaMarconato opened this issue 11 months ago • 11 comments

Closes #186 Closes #496 Closes #498

Support for incremental io operations.

New features:

  • [x] ability to save additional elements to disk after the SpatialData object is created
  • [x] ability to remove from disk previously saved objects
  • [x] ability to see which elements are present only in-memory and not in the Zarr store, and vice versa
  • [x] refactored saving of metadata:
    • [x] transformations
    • [x] consolidated metadata
    • [x] laid the groundwork (not yet implemented; placeholder tests and TODOs document what is missing) for the other metadata: table.uns['spatialdata_attrs'], points.attrs['spatialdata_attrs'] and the OMERO metadata for image channel names

Robustness:

  • [x] refactored write function to make it more robust
  • [x] improved error messages for users, with actionable advice
  • [x] new concept of "self-contained" SpatialData objects and "self-contained" elements, useful for understanding the implications of file backing
  • [x] added info on Dask-backed files for non "self-contained" elements to __repr__()

Testing:

  • [x] improved existing tests for io
  • [x] extensive testing for modular io
  • [x] improved testing for comparison of metadata after io and after a deepcopy

Other:

  • [x] fixed bug of points columns being shuffled after a query operation #486

This PR also lays the groundwork (not implemented here) for the ability to load Dask-backed elements fully into memory.

LucaMarconato avatar Mar 21 '24 22:03 LucaMarconato

Codecov Report

Attention: Patch coverage is 86.05263% with 53 lines in your changes missing coverage. Please review.

Project coverage is 92.12%. Comparing base (62a6440) to head (2585216). Report is 79 commits behind head on main.

Files with missing lines                        Patch %   Lines
src/spatialdata/_core/spatialdata.py            87.62%    38 Missing
src/spatialdata/transformations/operations.py   53.33%     7 Missing
src/spatialdata/_io/_utils.py                   75.00%     6 Missing
src/spatialdata/models/_utils.py                75.00%     1 Missing
src/spatialdata/testing.py                      87.50%     1 Missing
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #501      +/-   ##
==========================================
- Coverage   92.56%   92.12%   -0.44%     
==========================================
  Files          42       42              
  Lines        6078     6429     +351     
==========================================
+ Hits         5626     5923     +297     
- Misses        452      506      +54     
Files with missing lines                        Coverage Δ
src/spatialdata/_core/_deepcopy.py              98.41% <100.00%> (+0.02%)
src/spatialdata/_core/_elements.py              92.47% <100.00%> (+0.80%)
src/spatialdata/_io/io_zarr.py                  88.37% <100.00%> (ø)
src/spatialdata/dataloader/datasets.py          90.68% <ø> (ø)
src/spatialdata/models/models.py                87.69% <100.00%> (+0.16%)
src/spatialdata/models/_utils.py                91.79% <75.00%> (+0.18%)
src/spatialdata/testing.py                      98.24% <87.50%> (-1.76%)
src/spatialdata/_io/_utils.py                   88.52% <75.00%> (-2.33%)
src/spatialdata/transformations/operations.py   89.94% <53.33%> (-2.75%)
src/spatialdata/_core/spatialdata.py            90.78% <87.62%> (-1.44%)

... and 6 files with indirect coverage changes

codecov[bot] avatar Mar 22 '24 01:03 codecov[bot]

@ArneDefauw @aeisenbarth tagging you because at some point you each opened an issue regarding incremental IO. This PR implements it; happy to receive feedback in case you want to play around with it 😊

I will make a notebook to showcase it, but in short, to save an element (labels, table, etc.) you can use the new sdata.write_element('element_name'). If the element already exists in storage, an exception will be raised. You can work around the exception, for instance with the strategies shown here: https://github.com/scverse/spatialdata/blob/87dd1a88b41139d2657b56a5191a14e239aa26a2/tests/io/test_readwrite.py#L159

Please note that those strategies are not guaranteed to work in all scenarios, including multi-threaded applications, network storage, etc., so please use them with care.

LucaMarconato avatar Mar 23 '24 18:03 LucaMarconato

Currently the whole table needs to be replaced and the whole table needs to be stored in-memory, but recent progress in anndata + dask will also be used in spatialdata to allow lazy loading and the replacement of particular parts (like adding a single obs column). This PR cleans up the previous code and is a step in that direction.

LucaMarconato avatar Mar 23 '24 18:03 LucaMarconato

Thanks @LucaMarconato ! I left a review with some minor points below. I think it looks good, but I didn't have time for a super in-depth review. Given that this is a big change, I think the approval should be given by somebody who can look more closely.

Thanks for the review @kevinyamauchi. I will have a look at this PR later today as well.

melonora avatar Mar 25 '24 11:03 melonora

@ArneDefauw @aeisenbarth tagging you because at some point you each opened an issue regarding incremental IO. This PR implements it; happy to receive feedback in case you want to play around with it 😊

I will make a notebook to showcase it, but in short, to save an element (labels, table, etc.) you can use the new sdata.write_element('element_name'). If the element already exists in storage, an exception will be raised. You can work around the exception, for instance with the strategies shown here:

https://github.com/scverse/spatialdata/blob/87dd1a88b41139d2657b56a5191a14e239aa26a2/tests/io/test_readwrite.py#L159

Please note that those strategies are not guaranteed to work in all scenarios, including multi-threaded applications, network storage, etc., so please use them with care.

Thanks for the quick response and fix!

I've tested the incremental io for my use case, and so far everything seems to work as expected, except one thing. If I follow the approach suggested here:

https://github.com/scverse/spatialdata/blob/87dd1a88b41139d2657b56a5191a14e239aa26a2/tests/io/test_readwrite.py#L159

I get a ValueError when I load my SpatialData object back from the zarr store and try to overwrite it: ValueError: The file path specified is a parent directory of one or more files used for backing for one or more elements in the SpatialData object. Deleting the data would corrupt the SpatialData object.

The fix was to first delete the attribute from the SpatialData object, and then remove the element on disk. Minimal example below of a typical workflow in my image processing pipelines:

import os

import dask.array as da
import spatialdata
from spatialdata import SpatialData, read_zarr

path = "."  # directory in which to create the zarr store
img_layer = "test_image"

sdata = SpatialData()
sdata.write(os.path.join(path, "sdata.zarr"))

dummy_array = da.random.random(size=(1, 10000, 10000), chunks=(1, 1000, 1000))
se = spatialdata.models.Image2DModel.parse(data=dummy_array)

sdata.images[img_layer] = se
if sdata.is_backed():
    sdata.write_element(img_layer, overwrite=True)

# need to read back from the zarr store, otherwise the graph in the
# in-memory sdata would not be executed
sdata = read_zarr(sdata.path)

# now overwrite; here I needed to first delete the in-memory attribute
element_type = sdata._element_type_from_element_name(img_layer)
del getattr(sdata, element_type)[img_layer]
# and then remove the element on disk
if sdata.is_backed():
    sdata.delete_element_from_disk(img_layer)

sdata.images[img_layer] = se
if sdata.is_backed():
    sdata.write_element(img_layer, overwrite=True)

sdata = read_zarr(sdata.path)

I think what the unit test you referred to lacks is reading back from the zarr store after an element is written to it.

In version 0.0.15 of SpatialData, when sdata.add_image(...) was executed, it was not necessary to read back from the zarr store. I understand that the current implementation allows for more control, but the in-place update of the SpatialData object was quite convenient.

Edit: I added a pull request, to illustrate the issue a little bit more: https://github.com/scverse/spatialdata/pull/515

ArneDefauw avatar Mar 25 '24 15:03 ArneDefauw

Thank you @ArneDefauw for trying the code and for the explanation, I will now look into your PR.

In version 0.0.15 of SpatialData, when sdata.add_image(...) was executed, it was not necessary to read back from the zarr store. I understand that the current implementation allows for more control, but the in-place update of the SpatialData object was quite convenient.

The reason we refactored this part is that with add_image(), if the user had an in-memory image and wrote it to disk, the image would then immediately be lazily loaded. This is good and ergonomic if the image needs to be written only once, but if the user tried to write the image again (for instance in a notebook, where a cell may be manually executed twice), it would have led to an error.

LucaMarconato avatar Mar 27 '24 20:03 LucaMarconato

Thanks for the reviews. I addressed the points from @kevinyamauchi and from @ArneDefauw (in particular I merged his PR here). @giovp, when you have time could you please also give this a pass?

LucaMarconato avatar Mar 27 '24 23:03 LucaMarconato

Thank you @ArneDefauw for trying the code and for the explanation, I will now look into your PR.

In version 0.0.15 of SpatialData, when sdata.add_image(...) was executed, it was not necessary to read back from the zarr store. I understand that the current implementation allows for more control, but the in-place update of the SpatialData object was quite convenient.

The reason we refactored this part is that with add_image(), if the user had an in-memory image and wrote it to disk, the image would then immediately be lazily loaded. This is good and ergonomic if the image needs to be written only once, but if the user tried to write the image again (for instance in a notebook, where a cell may be manually executed twice), it would have led to an error.

Hi @LucaMarconato , thanks for the reply and the fixes! I've tested your suggestions (https://github.com/scverse/spatialdata/blob/582622f689a7e05421e9d066f98baf702549978f/tests/io/test_readwrite.py#L136) for my use cases and everything seems to work fine (for images, labels, points and shapes).

Workaround 1 looks rather safe in most scenarios. If I understand correctly, it covers the following scenario: having "x" in sdata, doing something on "x" (i.e. defining a Dask graph), and then writing back to "x".

How I would usually work is: having "x" in sdata, doing something on "x", and writing to "y" (where "y" already exists).

The latter feels less dangerous and looks pretty standard in image processing pipelines, e.g. when tuning hyperparameters for image cleaning or segmentation.

In the latter case, I guess the following would be sufficient:


arr = sdata["x"].data
arr = arr * 2
spatial_element = spatialdata.models.Image2DModel.parse(arr)

del sdata["y"]
sdata.delete_element_from_disk("y")
sdata["y"] = spatial_element
sdata.write_element("y")
sdata = read_zarr(sdata.path)

ArneDefauw avatar Mar 29 '24 13:03 ArneDefauw

Yes, I agree that the approach you described is generally good practice when processing data, and safe, since the original data is not modified.

The use cases I described are instead for when the data itself is replaced. I think I should add in the comments that this approach should be avoided when possible, and clarify that the workarounds I described should only be used if really needed.

LucaMarconato avatar Mar 30 '24 02:03 LucaMarconato

Thank you for this PR, I am using it right now. One question: would it be possible to pass a list of strings to write_element() instead of just one element name? @LucaMarconato

namsaraeva avatar Apr 08 '24 12:04 namsaraeva

@namsaraeva thanks for the suggestion, it's indeed handier to accept a list of names. I have added support for this to both write_element() and delete_element_from_disk().

LucaMarconato avatar Apr 08 '24 19:04 LucaMarconato

personally I don't see any blockers currently for this PR.

melonora avatar May 15 '24 08:05 melonora

personally I don't see any blockers currently for this PR.

@melonora I wanted to check this PR before merging: https://github.com/scverse/spatialdata/pull/525/files. I will do this this weekend.

LucaMarconato avatar Jun 08 '24 00:06 LucaMarconato