Multimodal exploration of spatial data with side-by-side embeddings
Rationale:
This is foundational work to support data consumers’ ability to generate hypotheses from spatial data. Being able to compare the location of cells in physical space with their position in UMAP or other lower-dimensional “transcriptomic space” enables data consumers to investigate how gene expression profiles change in different cell types and tissue locations.
User stories:
As a user, I want to directly compare cells’ placement in physical space (spatial embedding) with their representation in a low-dimensional embedding (e.g., UMAP or t-SNE) so that I can better understand how patterns in gene expression vary across space.
Requirements:
We can show two scatterplots side by side; one or both of these may be a spatial embedding with an image underlay.
It’s very important to keep the state of the data consistent across the entire viewer: the filters, coloring, and set of points in view should always agree between the two embeddings.
Hovering over cells in the left sidebar should highlight the same cells in both embeddings.
Hovering over cells in one embedding should highlight the same cells in the other.
Subsetting affects both embeddings: e.g., if the lasso selector is used to subset in one embedding, the same set of cells is displayed in the other (even though they may not be contiguous there). See the sketch after this list for one way the shared state could be wired.
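To make the consistency requirement concrete, here is a minimal sketch of how the shared hover/subset state might be wired, assuming a single store that both embedding views subscribe to. All names (SelectionStore, ViewerState, CellId) are hypothetical illustrations, not identifiers from the existing codebase:

```typescript
// Hypothetical sketch: one shared store drives both embedding views,
// so filters, coloring, and the visible subset can never diverge.
type CellId = string;

interface ViewerState {
  hovered: Set<CellId>;       // cells currently highlighted on hover
  subset: Set<CellId> | null; // current subset (null = all cells)
}

type Listener = (state: ViewerState) => void;

class SelectionStore {
  private state: ViewerState = { hovered: new Set(), subset: null };
  private listeners: Listener[] = [];

  subscribe(fn: Listener): void {
    this.listeners.push(fn);
  }

  // Called by the sidebar or by either embedding on hover.
  setHovered(ids: CellId[]): void {
    this.state = { ...this.state, hovered: new Set(ids) };
    this.emit();
  }

  // Called when the lasso selector completes in either embedding.
  setSubset(ids: CellId[]): void {
    this.state = { ...this.state, subset: new Set(ids) };
    this.emit();
  }

  private emit(): void {
    for (const fn of this.listeners) fn(this.state);
  }
}

// Both views re-render from the same state, so a lasso in the spatial
// view highlights the same (possibly non-contiguous) cells in UMAP.
const store = new SelectionStore();
const render = (name: string) => (s: ViewerState) =>
  console.log(`${name}: hovered=${s.hovered.size}, subset=${s.subset ? s.subset.size : "all"}`);

store.subscribe(render("spatial"));
store.subscribe(render("umap"));

store.setHovered(["cell-1", "cell-2"]); // both views log the highlight
store.setSubset(["cell-2", "cell-3"]);  // both views log the new subset
```

Because there is no per-view selection state in this design, the two scatterplots cannot drift out of sync; each view only decides how to draw the shared state.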
@seve I don't remember if the data point limit we currently support is memory-bound or CPU/GPU-bound, but it may be worth creating a ticket to double-check that side-by-side mode will work with the max number of data points we support? 😊
Thank you!
seve transferred this issue from chanzuckerberg/single-cell-data-portal on May 10, 2024
Design - https://www.figma.com/file/eJ3acPxG04PKVlom6eS8ft/Spatial?type=design&node-id=304%3A195512&mode=dev (@hthomas-czi)
Sub-Tasks: