Importing ROIs and transformations from QuPath
To align an image to a reference atlas I used ABBA to apply transformations via elastix and BigWarp. I then exported the registrations to QuPath, where I can see the atlas segmentations overlaid on the original image. I also see that ABBA generates a JSON file describing the transformations that were applied to my image (example below). In theory I think I should be able to use this to create an "atlas-aligned" coordinate system in SpatialData? And I could also import the ROIs as shapes?
"realTransform_2": {
"type": "AffineTransform3D",
"affinetransform3d": [
999.9999999999999,
0.0,
0.0,
405.0,
0.0,
999.9999999999999,
0.0,
421.99999999999994,
0.0,
0.0,
999.9999999999999,
0.49999999999999994
]
OK, I see that I can export ROIs from QuPath as GeoJSON and load them with `ShapesModel.parse()`. So I guess this question is mostly about the transformations.
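Before wiring anything into SpatialData, it can help to inspect what the exported GeoJSON actually contains. A minimal stdlib-only sketch (the `FeatureCollection` below is a made-up example standing in for a real QuPath export, and the `classification` property layout is an assumption about how QuPath records ROI names); in practice you would load the file with `geopandas.read_file()` and pass the resulting `GeoDataFrame` to `ShapesModel.parse()`:

```python
import json

# Made-up GeoJSON standing in for a QuPath ROI export (not real QuPath output).
geojson_text = """
{
  "type": "FeatureCollection",
  "features": [
    {
      "type": "Feature",
      "geometry": {
        "type": "Polygon",
        "coordinates": [[[0, 0], [100, 0], [100, 50], [0, 0]]]
      },
      "properties": {"classification": {"name": "Cortex"}}
    }
  ]
}
"""

collection = json.loads(geojson_text)
for feature in collection["features"]:
    name = feature["properties"]["classification"]["name"]
    ring = feature["geometry"]["coordinates"][0]  # exterior ring of the polygon
    print(name, len(ring), "vertices")
```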
Hi, for parsing it depends on the format and the order of axes that QuPath uses. Looking at the JSON representation above, it seems that the order of axes is x, y, z, so you would need to reshape the 12 values into a 3×4 matrix in that order.
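A minimal sketch of that reshaping, assuming the 12 values are laid out row-major as in ImgLib2's `AffineTransform3D` (each row being `[m00, m01, m02, translation]` — an assumption worth verifying against ABBA's output):

```python
import numpy as np

# The 12 values from the "affinetransform3d" entry in the JSON above.
values = [
    999.9999999999999, 0.0, 0.0, 405.0,
    0.0, 999.9999999999999, 0.0, 421.99999999999994,
    0.0, 0.0, 999.9999999999999, 0.49999999999999994,
]

# Assumed row-major 3x4 layout: linear part in the first 3 columns,
# translation in the last column.
matrix_3x4 = np.asarray(values).reshape(3, 4)

# Promote to a 4x4 homogeneous matrix by appending [0, 0, 0, 1].
matrix = np.vstack([matrix_3x4, [0.0, 0.0, 0.0, 1.0]])
print(matrix)

# With spatialdata installed, this could then become (hypothetical usage):
# from spatialdata.transformations import Affine
# affine = Affine(matrix, input_axes=("x", "y", "z"), output_axes=("x", "y", "z"))
```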
Here is an example of how we parse transformations from Xenium data (which uses a plain-text dump of the matrix, so a different format from the one here). It should be easy to adapt that code to your use case: https://github.com/scverse/spatialdata-io/blob/90a5de37a440fa8ea0e5ea82559f0279f9bb0743/src/spatialdata_io/readers/xenium.py#L665
Please let me know if it works for you, and if so, please consider making a pull request to spatialdata-io providing a general experimental parser for QuPath transformations.