cellxgene
support viz of >2D embeddings
if the selected embedding has >2 dimensions, expose the third dimension in some way
brainstorm of ideas:
- let user select which dimensions are plotted on which axes
- full 3d manipulation
- 2d manipulation with 3rd dimension as an annotation
- 2d manipulation with 3rd dim indicated in another way (size, saturation, slight motion)
the "slight motion" one is hard to explain... an application I used to use in grad school did this for 3d data on a 2d canvas. http://ced.co.uk/products/spksscl
imagine the 3rd dimension is in the z-direction, with negative values "behind" the screen and positive values in front. now imagine taking the z-axis and rotating it slightly in a circle around the axis coming out of the screen, so that points with positive z rotate clockwise and points with negative z rotate counterclockwise, with a larger radius as the magnitude of z increases. it gives a pretty profound sense of depth to the data (if I can find a demo, I will post it)
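a rough sketch of that motion cue, just to make it concrete (nothing here is cellxgene code; `wobble_offsets` and its parameters are made up for illustration):

```python
import numpy as np

def wobble_offsets(z, t, period=2.0, gain=0.01):
    """Screen-space (dx, dy) offsets that make points trace small circles.

    The radius grows with |z|; points in front of (+z) and behind (-z) the
    screen are out of phase, so they appear to counter-rotate, which is what
    gives the depth cue.
    """
    phase = 2 * np.pi * t / period
    radius = gain * np.abs(z)
    direction = np.sign(z)            # +z and -z rotate in opposite senses
    dx = radius * np.cos(direction * phase)
    dy = radius * np.sin(direction * phase)
    return dx, dy

# Example: on each animation frame at time t, draw points at (x + dx, y + dy).
x, y, z = np.random.randn(3, 1000)
dx, dy = wobble_offsets(z, t=0.25)
```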
notes from convo with @colinmegill:
How does selection work?
- spherical selection (e.g. VR)
- what does plotly do?
Is 3D the best way to elucidate finer structure? Or is reclustering on subsets more useful?
Is this a dataset-specific feature? (e.g. spatial or timecourse)
We may need to consider this in light of #594 ... what do we do if there is a 3D layout we read in?
No users have really expressed a need for this, so it's low priority unless something changes that we need to reconsider it.
I would like to express a need for this! I am unlikely to use cellxgene myself without 3D support, but I might still give it to our less computationally inclined collaborators.
Selecting cells in 3D is challenging, and I'm definitely not doing VR. I would be happy to discuss potential solutions for this, but you could start by letting users draw a preliminary lasso based on their current view, then project this into a sphere and let the user refine the sphere boundaries.
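To make that concrete, here's a rough sketch of the two-step idea, assuming the embedding is an (n, 3) array and we already know which cells fell inside the screen-space lasso (all names here are hypothetical, not cellxgene API):

```python
import numpy as np

def sphere_from_lasso(coords3d, lasso_idx):
    """Seed a 3D spherical selection from a 2D lasso.

    coords3d : (n, 3) embedding coordinates
    lasso_idx : indices of cells inside the screen-space lasso
    Returns a (center, radius) pair that the user can then refine.
    """
    picked = coords3d[lasso_idx]
    center = picked.mean(axis=0)
    # Start with a radius that covers ~90% of the lassoed cells.
    radius = np.quantile(np.linalg.norm(picked - center, axis=1), 0.9)
    return center, radius

def cells_in_sphere(coords3d, center, radius):
    """Indices of all cells inside the (possibly user-adjusted) sphere."""
    return np.where(np.linalg.norm(coords3d - center, axis=1) <= radius)[0]
```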
Maybe not realistic at this stage in the project, but vispy (http://vispy.org/) has nice 3d marker plots with arcball navigation.
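For reference, a minimal self-contained vispy sketch of a 3D marker plot with arcball navigation (standalone vispy scene API, not wired into cellxgene; the random points stand in for a real embedding):

```python
import numpy as np
from vispy import app, scene

# Random stand-in for a 3D embedding; replace with real coordinates.
pos = np.random.normal(size=(50_000, 3)).astype(np.float32)

canvas = scene.SceneCanvas(keys="interactive", show=True, bgcolor="white")
view = canvas.central_widget.add_view()
view.camera = "arcball"  # click-drag to rotate, scroll to zoom

markers = scene.visuals.Markers(parent=view.scene)
markers.set_data(pos, face_color=(0.2, 0.4, 0.8, 0.6), size=3, edge_width=0)

if __name__ == "__main__":
    app.run()
```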
Closing as low priority; if we hear more asks for this, we will reconsider. Thank you!
This can be very useful! 3D UMAPs are becoming increasingly popular; Figure 1 of Cao et al. 2020 used one. There have also been discussions about 3D embeddings elsewhere: https://github.com/satijalab/seurat/issues/1178 https://github.com/theislab/scanpy/issues/677 https://www.biostars.org/p/269696/
And this package seems to have moved a little further.
The benefit of a 3D UMAP is that it makes the most of what our brains can grasp intuitively, so more structure can be perceived at once.
Actually, I just realized that plotly can already visualize 3D scatterplots nicely. But it doesn't make it convenient to color cells by gene expression.
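To illustrate what I mean, a rough plotly sketch that colors a 3D UMAP by one gene's expression, using scanpy's small built-in demo dataset (the gene choice and parameters are just placeholders):

```python
import plotly.express as px
import scanpy as sc

adata = sc.datasets.pbmc68k_reduced()          # small built-in demo dataset
sc.pp.neighbors(adata)
sc.tl.umap(adata, n_components=3)              # recompute UMAP with 3 components

gene = adata.var_names[0]                      # any gene of interest
expr = sc.get.obs_df(adata, keys=[gene])[gene]

u = adata.obsm["X_umap"]
fig = px.scatter_3d(x=u[:, 0], y=u[:, 1], z=u[:, 2], color=expr, opacity=0.6)
fig.update_traces(marker=dict(size=2))
fig.show()
```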
@brianpenghe I'd be excited to see someone hack this in on a fork and experiment! We're already in webgl land, so it's possible in the present architecture. Selections in 3d are an interesting problem, and I will link back to @neuromusic's comment above re: spherical selection, it's an idea from a VR interface called Chroma https://store.steampowered.com/app/587470/Chroma_Lab/ and h/t @spiraloid for putting me inside a headset back when this came out in 2017
re: 3d selections, I think there's a hidden step between mouse and VR.
< $100 https://www.ultraleap.com/product/leap-motion-controller/
very good thought! And the goal is understandably ambitious.
Just for eye candy, I'm attaching my fetal lung atlas data in 3D, generated with plotly.
Can I bump this issue again? I'd love 3d cellxgene support. Still unlikely to use the tool without it. For selection -- I'm okay without it!
I'm also happy to discuss possible solutions (e.g. can you translate a view into a transformation of basis vectors, then select cells iteratively?).
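Something like this is what I have in mind, roughly (all names are hypothetical, and the lasso is simplified to a box in the view plane):

```python
import numpy as np

def view_basis(view_dir):
    """Build an orthonormal (right, up) basis for the plane facing the camera."""
    view_dir = view_dir / np.linalg.norm(view_dir)
    helper = np.array([0.0, 0.0, 1.0])
    if abs(view_dir @ helper) > 0.99:           # avoid a degenerate cross product
        helper = np.array([0.0, 1.0, 0.0])
    right = np.cross(helper, view_dir)
    right /= np.linalg.norm(right)
    up = np.cross(view_dir, right)
    return right, up

def select_in_view(coords3d, view_dir, box_min, box_max, previous=None):
    """Project cells into the current view plane and keep those inside a 2D box.

    Calling this repeatedly from different view directions, passing the result
    back in as `previous`, intersects the selections iteratively.
    """
    right, up = view_basis(np.asarray(view_dir, dtype=float))
    proj = np.column_stack([coords3d @ right, coords3d @ up])
    hit = np.all((proj >= box_min) & (proj <= box_max), axis=1)
    if previous is not None:
        hit &= previous
    return hit
```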