scanpy
Add support for Visium HD Spatial Gene Expression data
What kind of feature would you like to request?
Additional function parameters / changed functionality / changed defaults?
Please describe your wishes
10x Genomics has updated Space Ranger to v3.0, and its output files differ from those of v2.0. Scanpy is unable to read the v3.0 output files. Could you add support for Space Ranger v3.0?
@giovp @LucaMarconato, could you advise here?
Yes, this is a suggestion, and I hope you will consider it.
Hi, we implemented a reader for Visium HD data in spatialdata-io; an example notebook showing its usage can be found here: https://github.com/scverse/spatialdata-notebooks/blob/main/notebooks/examples/technology_visium_hd.ipynb. You can use scanpy directly on the AnnData objects that are parsed. For instance, in the last part of the notebook where we download the cluster information, you could instead use scanpy to preprocess/QC the data and then identify the clusters. Hope this helps 😊
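For reference, a minimal sketch of that workflow (assuming spatialdata-io is installed; the output path and the table key "square_008um" are placeholders and may differ for your dataset):

```python
import scanpy as sc
from spatialdata_io import visium_hd

# Parse the Space Ranger v3 / Visium HD output into a SpatialData object.
sdata = visium_hd("path/to/visium_hd_output")  # placeholder path

# Visium HD data is binned at several resolutions, and each bin size has its
# own AnnData table. The key below is an assumption -- inspect sdata.tables
# to see which names are available for your run.
adata = sdata.tables["square_008um"]

# Standard scanpy QC/preprocessing and clustering on the extracted AnnData.
sc.pp.calculate_qc_metrics(adata, inplace=True)
sc.pp.filter_cells(adata, min_counts=10)
sc.pp.normalize_total(adata)
sc.pp.log1p(adata)
sc.pp.pca(adata)
sc.pp.neighbors(adata)
sc.tl.leiden(adata)
```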
Thank you very much. This solution perfectly solved the problem.
Has the format changed for all assays, or is this specific to Visium HD?
And is there significant analysis that can be done on these data without taking the extra spatial information into account?
I would be inclined to say that the Visium IO functions in scanpy should be deprecated/replaced with a light wrapper around spatialdata_io.experimental.to_legacy_anndata(spatialdata_io.visium(*args, **kwargs)), or with just accessing the AnnData from spatialdata_io.visium_hd. But if the dependencies of spatialdata add a significant burden, then maybe we could implement something lighter here for now.
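To illustrate the proposal, here is a rough sketch of what such a thin wrapper could look like; it is not an existing scanpy API, and the function name read_visium_legacy is hypothetical:

```python
# Hypothetical sketch of a thin compatibility wrapper -- not part of scanpy.
# It defers all parsing to spatialdata-io and only converts the result back
# to the legacy AnnData layout that scanpy's Visium reader used to return.
from anndata import AnnData
import spatialdata_io
from spatialdata_io.experimental import to_legacy_anndata


def read_visium_legacy(path: str, **kwargs) -> AnnData:
    """Read a (non-HD) Visium run via spatialdata-io and return a legacy AnnData."""
    sdata = spatialdata_io.visium(path, **kwargs)
    # Additional arguments (e.g. selecting a table or coordinate system) may
    # be needed depending on the spatialdata-io version.
    return to_legacy_anndata(sdata)
```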
Has the format changed for all assays, or is this specific to Visium HD?
Do you refer to the support for multiple annotation tables, or also to other parts of the specs? Visium HD and MCMICRO are currently the only technologies making use of multiple tables.
just accessing the AnnData from spatialdata_io.visium_hd
I would favor this approach. I would call to_legacy_anndata() only if it is essential to have the spatial information in obsm.
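A hedged sketch contrasting the two options; the output path, the table key "square_008um", and the obsm["spatial"] key (the convention of the legacy Visium AnnData format) are assumptions here:

```python
# Sketch only; the path and the table key "square_008um" are placeholders.
from spatialdata_io import visium_hd
from spatialdata_io.experimental import to_legacy_anndata

sdata = visium_hd("path/to/visium_hd_output")  # placeholder path

# Option 1: use an AnnData table directly (no spatial coordinates in obsm).
adata = sdata.tables["square_008um"]

# Option 2: convert to the legacy layout, which carries the spatial
# coordinates in adata.obsm (as the old scanpy/squidpy Visium format did);
# extra arguments may be needed to pick one table when several bin sizes exist.
adata_legacy = to_legacy_anndata(sdata)
```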
Hi, the notebook link seems to be invalid. Are there any alternative links?
Would you please try this link? https://github.com/scverse/spatialdata-notebooks/blob/main/notebooks/examples/technology_visium_hd.ipynb
Great help. Thanks!